London Metropolitan University

Artificial Intelligence (AI) Library Guide: Checking Outputs

Checking Outputs from AI Tools

Text-based generative AI tools frequently fail to cite or reference the sources of the information used in their responses, and when a tool does provide references, they are often inaccurate. AI models rely on statistical patterns to predict which word is likely to come next (much like predictive text) and have no real understanding of what a citation or reference actually points to. This can lead to inaccuracies.

Before using a tool, always check the information provided by its creators to find out its latest capabilities, including whether it can cite and reference sources.

Common citation and referencing problems

  • Not providing DOIs or URLs, so the reader is unable to follow a link through to the original source.
  • Providing incorrect authors for a source.
  • Confusing resource types, for example, mixing up book chapters with journal articles.
  • Giving incorrect or missing places of publication and publication dates.
  • 'Hallucinated' references - i.e., the references are completely made up!
  • And so on.

So what to do?

  • Check and verify every citation and reference provided by the generative AI tool you're using (one way of doing this is sketched below this list).
  • Treat any information without a reference with deep scepticism.
  • It is always advisable to keep screenshots of AI output, as the same prompt will not necessarily produce the same response again.
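
For anyone comfortable with a little code, the sketch below shows one possible starting point for verifying a reference that includes a DOI. It is a minimal illustration in Python, assuming the requests library and the public Crossref REST API (api.crossref.org); the function name check_doi and the placeholder DOI are hypothetical, and the details it prints still need to be compared by eye against what the AI tool claimed.

    import requests

    def check_doi(doi: str) -> None:
        """Look up a DOI in the public Crossref API and print the recorded title and authors."""
        response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
        if response.status_code == 404:
            # No record for this DOI: the reference may be hallucinated or mistyped.
            print(f"{doi}: no Crossref record found - check the reference carefully.")
            return
        response.raise_for_status()
        record = response.json()["message"]
        title = record.get("title") or ["(no title recorded)"]
        authors = ", ".join(
            f"{author.get('given', '')} {author.get('family', '')}".strip()
            for author in record.get("author", [])
        )
        print(doi)
        print(f"  Title:   {title[0]}")
        print(f"  Authors: {authors or '(none listed)'}")

    # Placeholder DOI for illustration; replace with the DOI from the AI-generated reference.
    check_doi("10.1000/xyz123")

Even when a DOI does resolve, compare the returned title and author list against the citation itself: a plausible-looking reference can sometimes attach a genuine DOI to a completely different article.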