AI Sources & Misinformation Risks

AI is an incredible tool, but let’s be clear: not everything it says is accurate. If you’ve ever asked ChatGPT or another chatbot for a citation, you may have noticed something unsettling—it confidently generates references that don’t exist or misattributes sources. This is known as an AI hallucination, and for evidence-based nutrition practitioners, it’s a serious concern.

When AI Citations Seemed Like Magic

When I first discovered Perplexity AI, I was amazed that it cited sources—it felt like a breakthrough! Finally, AI could provide references, making it feel more trustworthy. I was even more excited when ChatGPT and Claude.ai introduced similar features. But as I dug deeper, I realized a major issue: many AI-generated citations are inaccurate or completely fabricated. What seemed like a major advancement turned out to be another layer of complexity in using AI responsibly.

Why AI Citations Can’t Always Be Trusted

AI models, including ChatGPT and Perplexity AI, generally do not have direct access to PubMed, Cochrane Library, or other scientific databases—unless specific plugins or integrations are used. However, even when AI tools claim to pull from PubMed, it’s crucial to verify citations, as they may be outdated, fabricated, or incorrectly sourced.

Here’s what often happens:

  • AI generates a citation that sounds real but isn’t actually a published study.
  • It mixes elements of multiple sources into one incorrect reference.
  • It provides links to papers that don’t actually exist in the journal it names.

For a nutritionist relying on peer-reviewed research, this can be dangerous if not caught.

How to Spot & Fix AI Citation Errors

  • Ask AI for direct sources – Instead of trusting its first response, follow up with: “Where exactly did you get this information?”
  • Verify in trusted databases – Cross-check citations in PubMed, Cochrane Library, IFM, Cleveland Clinic, or other sources you trust before using them in any professional work (a small lookup sketch follows this list).
  • You are the most important authority – AI can assist, but it should never replace your own expertise in evaluating evidence.
  • Direct AI to use specific sources – Instruct it by saying, “Only summarize findings from PubMed or Cochrane Library.” (More to come.)
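
If you (or a tech-savvy colleague) want to automate part of the cross-check step above, here is a minimal sketch that asks PubMed whether an article with a given exact title exists, using NCBI's public E-utilities search endpoint. The function name and the example title are placeholders of my own, not anything the AI tools provide; treat this as a starting point under those assumptions, not a finished verification tool.

```python
# Minimal sketch: check whether a cited title appears in PubMed at all,
# using NCBI's public E-utilities search endpoint (esearch.fcgi).
import requests

EUTILS_SEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_has_title(title: str) -> bool:
    """Return True if PubMed lists at least one article with this exact title."""
    params = {
        "db": "pubmed",
        "term": f'"{title}"[Title]',  # exact-phrase search, restricted to the title field
        "retmode": "json",
    }
    response = requests.get(EUTILS_SEARCH, params=params, timeout=10)
    response.raise_for_status()
    count = int(response.json()["esearchresult"]["count"])
    return count > 0

# Example: paste in a title the chatbot cited (this one is hypothetical).
cited_title = "Effects of a Mediterranean diet on markers of inflammation"
if pubmed_has_title(cited_title):
    print("Found in PubMed - now read the abstract to confirm it supports the claim.")
else:
    print("No exact match in PubMed - the citation may be fabricated or mistitled.")
```

Keep in mind that a "found" result only tells you a paper with that title exists; it does not confirm the paper supports whatever claim the AI attached to it, so the abstract (and ideally the full text) still needs your eyes on it.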

The Takeaway

AI is powerful, but it requires human oversight—especially in fields like nutrition, where accuracy and credibility matter. Instead of trusting AI blindly, develop a system to fact-check and refine responses. More to come on how to give AI better instructions so you can get more accurate and useful responses. Stay tuned!

Anne Stephenson, MS CNS


Nutrishify Founder • Simplifying success for nutrition pros with smarter tools and actionable resources.