cs.AI, cs.LG, stat.ML

The Structural Origin of Attention Sink: Variance Discrepancy, Super Neurons, and Dimension Disparity

arXiv:2605.06611v1 Announce Type: cross
Abstract: Despite the prevalence of the attention sink phenomenon in Large Language Models (LLMs), where initial tokens capture a disproportionate share of attention scores, its structural origins remain elusive. This…
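The sink pattern the abstract describes can be illustrated with a toy scaled dot-product attention computation. This is our own sketch, not the paper's setup: the sequence length, head dimension, and the trick of aligning the first token's key with the query are all illustrative assumptions made to force a sink-like logit.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 64   # head dimension (illustrative)
T = 16   # sequence length (illustrative)

# Random query and keys for one attention head.
K = rng.normal(size=(T, d))
q = rng.normal(size=(d,))

# Toy stand-in for whatever mechanism produces the sink: give the
# initial token a key strongly aligned with the query, so its logit
# dominates after the softmax.
K[0] = K[0] + 2.0 * q

scores = K @ q / np.sqrt(d)        # scaled dot-product logits
attn = np.exp(scores - scores.max())
attn /= attn.sum()                 # softmax over key positions

print(f"attention on token 0: {attn[0]:.3f}")
print(f"uniform baseline:     {1 / T:.3f}")
```

Here "attention sink" simply means `attn[0]` far exceeds the uniform baseline `1/T`; the paper investigates why real LLMs end up in this regime structurally.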