AI Doesn’t Have a Hallucination Problem. It Has a Provenance Problem.
AI doesn’t fail because it “hallucinates”; it fails because it is fed the wrong documents. A new Tow Center study shows that even the top AI search tools misidentify sources more than 60% of the time, and the problem is far worse inside enterprises, where content is fragmented, duplicated, and outdated. The solution isn’t bigger models; it’s stronger provenance and a true document-of-record system.