🔮 The Codex
Hallucination
When an AI model generates false or fabricated information and presents it convincingly as fact.
📖 Apprentice Explanation
Sometimes an AI makes things up and presents it as fact; this is called a hallucination. Always double-check important information from an AI, especially dates, statistics, and citations.
🧙 Archmage Notes
Hallucinations arise from the probabilistic nature of language models: they predict plausible next tokens rather than retrieve verified facts, so fluent but false output is always possible. Mitigation strategies include RAG (Retrieval-Augmented Generation), grounding responses in trusted sources, constrained decoding, and chain-of-verification prompting.
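As an illustration of the last of these, here is a minimal chain-of-verification sketch in Python. The `generate` function is a hypothetical placeholder for whatever LLM client you use, not a specific library API; the flow (draft, plan checks, verify independently, revise) is the part that matters.

```python
def generate(prompt: str) -> str:
    """Hypothetical stand-in for an LLM call; replace with your API client."""
    return f"[model output for: {prompt[:40]}...]"


def chain_of_verification(question: str) -> str:
    # 1. Draft an initial answer (this is where hallucinations can appear).
    draft = generate(f"Answer the question:\n{question}")

    # 2. Ask the model to plan short factual questions that would verify
    #    each claim in its own draft.
    plan = generate(
        "List short factual questions that would verify each claim "
        f"in this answer:\n{draft}"
    )
    check_questions = [q.strip() for q in plan.splitlines() if q.strip()]

    # 3. Answer each verification question independently of the draft,
    #    so the model is not anchored to its earlier mistakes.
    checks = [
        generate(f"Answer concisely and factually:\n{q}")
        for q in check_questions
    ]

    # 4. Produce a final answer revised against the verification results.
    return generate(
        f"Question: {question}\n"
        f"Draft answer: {draft}\n"
        f"Verification Q&A: {list(zip(check_questions, checks))}\n"
        "Rewrite the draft, correcting any claim the verification contradicts."
    )


if __name__ == "__main__":
    print(chain_of_verification("When was the Eiffel Tower completed?"))
```

The key design choice is step 3: verification questions are answered without the draft in context, which reduces the chance of the model simply repeating its original error.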
