More or Less

Did AI researchers let AI hallucinations into scientific papers?

Feb 21, 2026
Alex Tui, CTO and co-founder of GPTZero, investigates AI-generated text and fabricated citations. He explains how AI hallucinations can introduce fake references into top conference papers. The conversation covers how these fabrications were detected, why they matter for trust and reproducibility, and surprising patterns such as biased or oddly named fake authors.
INSIGHT

AI Hallucinations Are Confident Fabrications

  • Large language models often produce fluent but fabricated facts known as hallucinations.
  • These hallucinations cause harm chiefly when users accept them without verification.
ANECDOTE

Researchers Using AI To Draft Citations

  • Alex Tui describes researchers asking AI to write paper sections and add many citations.
  • The AI then invents plausible-looking references because it lacks real references to cite.
INSIGHT

Systematic Scan Found 100+ Fake Citations

  • GPTZero ran a hallucination detector across the references of 5,000 accepted NeurIPS papers.
  • They verified citations and found at least 100 clear hallucinations across about 50 papers.