What is hallucination in generative AI?
A hallucination is a confident-looking model output that is unsupported, fabricated, or wrong.
- Looks fluent but is false
- Retrieval-augmented generation (RAG) can reduce it but not eliminate it
- Verification still matters
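The verification point above can be sketched in code. The snippet below is a minimal, naive illustration (not a production technique): it flags answer sentences whose content words have little overlap with any retrieved passage. The function names, the word-overlap heuristic, and the 0.5 threshold are all illustrative assumptions; real grounding checks use entailment models or citation verification.

```python
import re

def tokens(text: str) -> set[str]:
    # Lowercase alphanumeric tokens longer than 3 chars (crude content words).
    return {w for w in re.findall(r"[a-z0-9]+", text.lower()) if len(w) > 3}

def support_score(claim: str, passages: list[str]) -> float:
    """Best fraction of the claim's content words found in any single passage."""
    words = tokens(claim)
    if not words:
        return 1.0  # nothing to check
    return max(len(words & tokens(p)) / len(words) for p in passages)

def flag_unsupported(sentences: list[str], passages: list[str],
                     threshold: float = 0.5) -> list[str]:
    # Return sentences with too little lexical support in the passages.
    return [s for s in sentences if support_score(s, passages) < threshold]

passages = ["The Eiffel Tower is in Paris and was completed in 1889."]
answer = ["The Eiffel Tower is in Paris.",
          "It was designed by Leonardo da Vinci."]
print(flag_unsupported(answer, passages))
# → ["It was designed by Leonardo da Vinci."]
```

Word overlap catches only the crudest fabrications; a fluent hallucination that reuses the passage's vocabulary would pass this check, which is why stronger verification still matters.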