
Evolution: Librarians lead conversation on Generative Artificial Intelligence

September 20, 2024

Early adopters experiment, creating models that propel trends. Librarians are not unlike those innovators in how quickly they explore new technologies and ideas, and they also bring deep knowledge of information management, healthy skepticism, awareness of risk and analytical approaches to exploring the new.

Today, the new is generative artificial intelligence (GenAI). 

Where’s it going? What does it mean in an academic setting? What are its potential benefits and risks? What are the copyright, ethical and privacy concerns surrounding this new technology? At VCU, VCU Libraries is playing an active role in answering those and related questions and leading the conversation about the applications of GenAI in the learning and research setting.

VCU Libraries’ key voices in the conversation about teaching, learning and creating with GenAI are Online Teaching and Learning Librarian Hope Kelly and Multimedia Teaching and Learning Librarian Oscar Keyes.

Librarians are a logical go-to source to study and question applications of GenAI. “Because AI tools are transforming the way we search and synthesize information, I think it makes a lot of sense that librarians are one of the first points of contact,” Keyes said. “Similar to how Google Search was a paradigm shift for the way research was conducted, I think tools like ChatGPT and Copilot present similar challenges.” According to Keyes, questions raised include:

  • How do you evaluate and verify the source of information?
  • Can I cite this source? 
  • Can I trust this information? 
  • How do you avoid confirmation bias? 
  • How do you address implicit bias in the algorithm?

Continued Keyes, “These questions are nothing new, even if the technology is. I also think librarians are on the front lines of addressing issues related to misinformation, disinformation and malinformation, which not only clog searches with false results but have also now been ingrained in many of these generative AI tools trained on the open web.”

From a faculty perspective, questions about AI abound. Keyes sees three primary arenas of concern:

  1. “For faculty who assign papers, there is a lot of concern about academic integrity and rigor in their classes, that students are offloading the development of their critical thinking skills to machines. Additionally, many faculty feel frustrated about the lack of guidance about how to assess or even assign papers in an era of AI. These tools certainly present a paradigm shift and it’s possible even our concept of plagiarism might need to evolve to better address issues related to academic integrity.”
  2. “In multimedia, synthetic images, audio, and video raise concerns about misinformation and disinformation in the form of deep fakes. I think this is the area that I have the most concerns about because that’s the area that I focus the most on in my work. It has been difficult enough to find ways to teach students how to verify information in their social media feeds when the media objects were actually real but being presented out of context or as if they come from totally different places or times. Now, it’s possible to fake the voices of world leaders and videos of war zones, and I think that could be a recipe for disaster. Especially in a polarized media environment like the one we find online today.”
  3. “Shared across all types of generative AI tools are ethical concerns about the way these tools are trained. Many of them scrape the internet without the consent of creators, and many writers and artists are fighting back through legal battles. Some arts faculty worry that this will lead to further exploitation of artists while other faculty think these tools represent accessible creative tools. It’s a complicated topic and without clear guidelines or government regulations, there’s a lot of validity to those concerns.”

Kelly found herself dealing with these faculty concerns in 2023. “The big issue was that faculty were unable to find works cited in their students’ papers. Students were using fabricated or ‘hallucinated’ citations that were created by ChatGPT. So, we had to confirm that these articles did not actually exist and explain why these legitimate-looking but ultimately fake citations were popping up. Then we had bigger issues to contend with, like: How does a student cite or attribute these types of applications in their work, and is it ethical to use them for different parts of the research or publication process?”

Kelly and Keyes convened a workgroup of librarians to publish a research guide on generative artificial intelligence tools so that VCU students, faculty and staff can assess the practical and ethical issues they raise. The research guide is widely referenced and used on campus and beyond, and it predated university-level guidance on AI. Authors included Research and Education Librarian Roy Brown, Health Sciences Library intern Alanna Natanson and Scholarly Communications Librarian Hillary Miller.

“We’ve come a long way,” says Kelly. “When we initially developed our research guide on the topic, which came in response to so many queries coming into the libraries, there was not much university-level guidance, as there was a rapid and unanticipated uptake of these tools. Now, we have university information available, including a landing page that provides a primer on GenAI and guidelines from our CIO. Additionally, we now have an approved tool, Copilot, to use when relevant to our work.”

Also at VCU, a new minor in Artificial Intelligence and Immersive Reality is offered through the Interdisciplinary Studies Program at University College. Additionally, the Humanities Research Center houses the AI Futures Lab. Finally, there is a Faculty Advisory Committee for Gen AI & Teaching & Learning, which puts together programs and resources addressing AI tools in the classroom. Keyes and Kelly are connected to all of these projects at VCU through committee work, conversations or consultations.

Kelly is also contributing to national library scholarship on the topic. Her 2022 study on AI in libraries provides insights from and to librarians nationwide. In her survey of library professionals, 65 percent of respondents agreed or strongly agreed that providing librarian-led instruction with ChatGPT is a good idea, but only 43 percent thought that independent study with ChatGPT is a good idea. An even greater majority—72 percent—agreed or strongly agreed that they intend to use ChatGPT in instruction, while 67 percent intended to use it in other areas of work and 79 percent intended to use it in the future.

Librarians’ attitudes toward AI also drew an overflow crowd at a 2024 American Library Association conference. Kelly and her research partner from Florida International University, Melissa Del Castillo, had a standing-room-only turnout for their talk at the ALA LibLearnX conference in Baltimore. Their presentation resulted in an article in American Libraries Magazine, coverage in a LibLearnX recap from Library Journal and an AI-specific recap. In further service to the profession, an article based on that study is scheduled for publication in College & Research Libraries in 2025.

While Keyes and Kelly both researched and expanded their own understanding of AI, the root of their work is in teaching students, faculty and librarian colleagues, and there was plenty of that.

Keyes and Kelly participated in several panels on generative AI through VCU’s Center for Teaching and Learning Excellence, which focuses on faculty development; these sessions were part of the work of the Faculty Advisory Committee for the Provost’s Office. They partnered with the VCU Writing Center to develop two undergraduate-focused workshops on AI and also helped to plan the Generative AI in Teaching & Learning conference for VCU Libraries employees. The lineup of Workshops @ The Workshop in 2023-2024 included six AI-related sessions, two each focusing on generative AI in graphic narrative, poetry and music. In addition, Keyes presented at conferences of the Alliance for the Arts in Research Universities and the National Art Education Association and delivered workshop sessions for Richmond Public Schools and Carnegie Mellon University.

What’s next? Kelly and Keyes agree that AI tools do have the potential to make some parts of our lives easier, especially mundane or repetitive tasks such as designing templates, copyediting or writing formulas for spreadsheets. But there are concerns that will be ongoing in the conversation about AI. “I’m really worried about how these tools are going to be integrated into the classroom and what those priorities will be,” says Keyes. “I think it’s important that we’re critical and reflective on any tool we introduce into our classrooms, and I worry that we’re not focusing enough on issues like analyzing deep fakes but sometimes seem more worried about whether students are cheating. Ultimately, I think we’re going to have to teach with/about these tools in order to engage our students to critically think with/about them.”
