The Diffusion-Attention Connection
arXiv:2604.09560v1 Announce Type: new
Abstract: Transformers, diffusion maps, and magnetic Laplacians are usually treated as separate tools; we show they are all different regimes of a single Markov geometry built from pre-softmax query-key scores. We define a QK "bidivergence" whose exponentiated and normalized forms yield attention, diffusion maps, and magnetic diffusion. Using products of experts and Schr\"odinger bridges, we connect and organize these regimes into equilibrium, nonequilibrium steady-state, and driven dynamics.
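The central claim, that attention and diffusion maps arise from different normalizations of the same exponentiated score kernel, can be sketched numerically. This is an illustrative toy, not the paper's construction: the specific "bidivergence" is not reproduced here, and the variable names are my own. It contrasts the row-stochastic (softmax) normalization used in attention with the degree normalization used in diffusion maps, applied to a symmetric kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 4
Q = rng.normal(size=(n, d))
K = rng.normal(size=(n, d))

# Pre-softmax query-key scores (toy stand-in for the paper's QK kernel)
S = Q @ K.T / np.sqrt(d)
W = np.exp(S)  # exponentiated scores

# Attention regime: row-stochastic normalization (softmax over keys)
P_attn = W / W.sum(axis=1, keepdims=True)

# Diffusion-map regime: take a symmetric kernel (here, Q against itself)
# and normalize each row by its degree, giving a reversible Markov operator
W_sym = np.exp(Q @ Q.T / np.sqrt(d))
deg = W_sym.sum(axis=1)
P_diff = W_sym / deg[:, None]

# Both are row-stochastic Markov transition matrices over the same points
print(np.allclose(P_attn.sum(axis=1), 1.0))  # True
print(np.allclose(P_diff.sum(axis=1), 1.0))  # True
```

The magnetic case would additionally attach complex phases to the kernel before normalization, which is omitted here.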