Humans routinely misremember facts yet remain fairly confident that those memories are correct.
That’s a form of minor, everyday hallucination.
If you criticize and check every recalled fact thoroughly enough to eliminate that, you'll crush your ability to synthesize or compose new content.
In a human, there is a distinction between "this is information I truly think I know; my intention is to state a true fact about the world" and "this is something I don't know, so I made something up." That distinction doesn't exist in LLMs. The fact that humans can be mistaken is a completely different issue.
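A minimal sketch can illustrate that claim, assuming the Hugging Face `transformers` API with the GPT-2 checkpoint (both just stand-ins for any autoregressive LLM): inspecting the next-token distribution exposes only probabilities over tokens, with no separate channel marking whether a continuation is recalled knowledge or a fabrication.

```python
# Minimal sketch, assuming Hugging Face `transformers` and GPT-2 as a stand-in.
# The point: generation exposes only token probabilities, not a flag that
# distinguishes "stating a known fact" from "making something up".
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The capital of Australia is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # logits for the next token only

probs = torch.softmax(logits, dim=-1)
top_probs, top_ids = probs.topk(5)
for p, i in zip(top_probs, top_ids):
    # Whether the top candidate is right ("Canberra") or wrong ("Sydney"),
    # it is produced by the exact same mechanism: a probability over tokens.
    print(f"{tokenizer.decode([i.item()])!r}: {p.item():.3f}")
```

High token probability here reflects statistical fit to the prompt, not the human sense of "I actually know this," which is the distinction the comment above is pointing at.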