cs.AI, cs.CL

Where Fake Citations Are Made: Tracing Field-Level Hallucination to Specific Neurons in LLMs

arXiv:2604.18880v1 Announce Type: cross
Abstract: LLMs frequently generate fictitious yet convincing citations, often expressing high confidence even when the underlying reference is wrong. We study this failure across 9 models and 108,000 generated…