
Selecting the right generative AI tool for your institution


About this guide

Generative artificial intelligence (GenAI) tools are growing rapidly in number and popularity. But while many in the academic community are excited about their potential, there are also reservations.

These range from ethical questions over copyright and privacy to concerns about the underlying technology and the sources it draws on. With little consensus on what a “good” tool looks like, and few established frameworks for assessing them, identifying a suitable solution for your institution remains a challenge.

In this high-level guide, we look at:

  • Why it’s important to think about which GenAI tool your institution is using

  • The role librarians can play in identifying a trusted solution (and why you should)

  • Resources to increase your AI literacy

  • 15 key questions to ask when evaluating GenAI tools

Download this guide as PDF


What is a GenAI tool?

GenAI is a form of deep machine learning. Large language models (LLMs) consume vast quantities of existing content and learn to identify underlying structures and patterns within it. When prompted, the LLMs then draw on that knowledge to generate new outputs with similar characteristics.
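To make that prompt-and-generate loop concrete, here is a minimal sketch in Python. It assumes the open-source Hugging Face transformers library and the small public gpt2 model, chosen purely for illustration; the commercial tools discussed in this guide wrap far larger models behind a product interface.

# Minimal sketch of the prompt-and-generate loop behind a GenAI tool.
# Assumes the Hugging Face `transformers` library (with PyTorch) and the
# small open `gpt2` model, used here only for illustration.
from transformers import pipeline

# Load a pretrained language model; its weights encode the statistical
# patterns it learned from large quantities of existing text.
generator = pipeline("text-generation", model="gpt2")

# When prompted, the model draws on those learned patterns to produce
# new text with characteristics similar to its training data.
prompt = "Academic libraries can support researchers by"
outputs = generator(prompt, max_new_tokens=40)

print(outputs[0]["generated_text"])
# Note: the continuation is statistically plausible rather than verified fact,
# which is why the questions later in this guide about sources, bias and
# hallucination matter when evaluating a tool.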

Why selecting the right AI tool matters for your institution

The term ‘responsible AI’ traditionally refers to the safe and ethical design and deployment of AI tools. Increasingly, however, this definition is expanding to include the responsible selection of AI tools.

For example, if there’s no transparency around how a tool operates and the rules that guide it, how can you determine what steps it takes to minimize bias or hallucinations (incorrect or irrelevant answers)?1 Similarly, if it’s unclear which content sources it uses, how can you judge the recency or, crucially, the trustworthiness of the information it provides?

Did you know?

A 2024 report on AI by The Chronicle of Higher Education flags another important point to consider – that ownership of some AI tools may well change hands. It notes: “A common assumption in 2024 is that many of the gen-AI startups that emerged after ChatGPT’s release will likely fail, merge, or be acquired in the near future.”2

Your users need access to accurate and reliable tools to guide their research, teaching and learning. When tools don’t meet these standards, the quality of your institution’s academic research and student data literacy can suffer. Unreliable GenAI results also have consequences for your workload; some librarians are now fielding a growing number of requests to validate suspect AI-generated references.

“Trust is a useful tool for us as humans … [but] it can be really problematic when our expectations of what a system is capable of are misaligned with reality.”


Harry Muncey, PhD

Senior Director of Data Science and Responsible AI at Elsevier

Leveraging your knowledge and skills as a librarian

With your expertise in the curation and evaluation of digital resources, you can help your institution and users make informed decisions about which tools to use.


However, there is evidence that library interest in GenAI remains relatively low – at least in some regions. In 2023, a survey of North American members of the Association of Research Libraries (ARL) found that only 11% of respondents said they were actively implementing GenAI solutions.3 And 70% of participants in another survey admitted they didn’t feel prepared enough to adopt GenAI tools within the coming 12 months.4

Taking a ‘wait and see’ approach can be risky given the already high uptake of AI tools by faculty and students. In fact, survey results suggest that around 60% of your library’s users are likely to be independently using GenAI tools.5 And this has huge potential to grow. For example, the Elsevier study Insights 2024: Attitudes toward AI, found that if researchers had access to a reliable and secure AI assistant, 92% would use it to “review prior studies, identify gaps in knowledge and generate new research hypotheses for testing”.6

By hosting a central solution in the library, you can help your users enjoy the benefits of GenAI while avoiding the potential pitfalls. You can also help to demonstrate the value you add on campus: according to the ARL, the rise of GenAI offers librarians an opportunity to learn about the technology and use that knowledge to “exert leadership as [their] research institutions navigate the AI era”.3

And it seems clear that library users want this leadership. For example, 68% of US higher education instructors surveyed said they would consider using a GenAI tool if they had assurances it would be effective. For 54%, guidance on its reliability was the most important factor.7 And a rising number of ARL members report being asked by other departments on campus to partner with them on AI and create suitable policies.8 This has led the organization to draw up seven principles librarians can use when responding to these requests.

“These are not magic black boxes. This is a transactional relationship and people need to trust and maintain confidentiality.”

Andrew Hufton

Editor-in-Chief of the Cell Press journal Patterns

The importance of AI literacy

Not all AI tools are created equal. The solutions that you and your stakeholders must choose from differ in their maturity, functionality and scope. Crucially, they also differ in the reliability of the information they generate.

But before you can evaluate AI, it’s important to become ‘AI literate’. While definitions of the term vary, most agree that it involves familiarizing yourself with fundamental AI concepts like machine learning, natural language processing and neural networks. It also involves developing an understanding of the technology’s opportunities and limitations.

For those new to GenAI, here are a few useful resources to help you get started:

15 questions to ask when evaluating a GenAI tool

Sifting through the rising number of GenAI tools available isn’t easy. Here are some questions that can help you determine which route to take.



GenAI holds great promise within higher education and research if appropriately applied.

With proper guidance from experts in information management, students, their teachers and researchers can tackle tasks that were once thought insurmountable. Through understanding, due diligence and outreach, librarians are perfectly placed to serve as these experts for their users and institution.

Download this guide as PDF

References

1 Chatbots May ‘Hallucinate’ More Often Than Many Realize. The New York Times. November 2023. https://www.nytimes.com/2023/11/06/technology/chatbots-hallucination-rates.html

2 Swaak, T. Adapting to AI: How to understand, prepare for, and innovate in a changing landscape. The Chronicle of Higher Education. 2024. https://store.chronicle.com/products/adapting-to-ai

3 Lo, L.S. & Hudson, C. Quick Poll Results: ARL Member Representatives on Generative AI in Libraries. Association of Research Libraries. Last updated May 9, 2023. https://www.arl.org/blog/quick-poll-results-arl-member-representatives-on-generative-ai-in-libraries/

4 Lo, L.S. Evaluating AI Literacy in Academic Libraries: A Survey Study with a Focus on U.S. Employees. Academic Department Resources, UNM Digital Repository. 2024. https://digitalrepository.unm.edu/ulls_fsp/203

5 Stansbury, J.A., Lausch, S., Zahadat, N. & Kelly, D. White Paper: AI Perceptions at the University of Baltimore. 2023. https://drive.google.com/file/d/1ufdagea0Xm8TpiKsyvbr1Kp-kpez3z6Z/view | Freeman, J. Provide or punish? Students’ views on generative AI in higher education. HEPI. February 1, 2024. https://www.hepi.ac.uk/2024/02/01/new-hepi-policy-note-finds-more-than-half-of-students-have-used-generative-ai-for-help-on-assessments-but-only-5-likely-to-be-using-ai-to-cheat/

6 Insights 2024: Attitudes toward AI. Elsevier. 2024. https://www.elsevier.com/insights/attitudes-toward-ai

7 Apprehension of Generative AI in Higher Education Overstated, Cengage Survey Finds. Cengage Group. August 28, 2023. https://www.cengagegroup.com/news/perspectives/2023/higher-ed-gen-ai-faculty-research-findings/

8 Coffey, L. New AI Guidelines Aim to Help Research Libraries. Inside Higher Ed. May 2024.

9 Miao, F. & Holmes, W. Guidance for generative AI in education and research. UNESCO. 2023.