Answered By: Lisa Adriani
Last Updated: Sep 17, 2025

ChatGPT and other generative AI applications are notorious for inventing information. These inventions are called "hallucinations." Hallucinations usually happen when the AI has no reliable answer to the question you've asked. Instead of telling you it can't find something, the tool will simply make something up, and it does so with total confidence, which can make it tricky to tell real information from fake.

It is important for anyone incorporating AI into their research process to verify the information and sources it provides.

You should locate and read the full text of any citation the AI gives you. For scientific or medical claims, consult the original research study; for statistics and data, read the original publication. This is the simplest way to ensure your citations, and the information attributed to them, are real.
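If a citation includes a DOI, a quick first pass is to look it up in the Crossref registry before hunting down the full text. The Python sketch below is a minimal illustration using the requests library and the public Crossref REST API; the function name doi_exists is just for this example. Note that this only confirms the DOI is registered, not that the article actually supports the AI's claim, so it complements reading the source rather than replacing it.

    import requests  # assumes the third-party 'requests' library is installed

    def doi_exists(doi: str) -> bool:
        """Check whether a DOI is registered with Crossref.

        Crossref's works endpoint returns 200 for a registered DOI and
        404 for an unknown one; a 404 is a common sign of a hallucinated
        citation. A 200 only proves the work exists, not that it says
        what the AI claims it says.
        """
        resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
        return resp.status_code == 200

    # Example: a real DOI (Watson & Crick's 1953 DNA structure paper)
    print(doi_exists("10.1038/171737a0"))  # True if the DOI is registered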

For more details on checking AI-generated citations and claims, visit the library's guide on Resource Evaluation.

Keep in mind that generative AI tools like ChatGPT or Copilot can do amazing things, but they are not search engines. They are text generators programmed to give an answer, and their training data is not always current or correct. Be critical of the information these tools give you. For example, ChatGPT's training set includes much of Reddit, which is not the most impartial or accurate source on the internet!
