Geometric Entropy and Retrieval Phase Transitions in Continuous Thermal Dense Associative Memory
arXiv:2604.07401v2 Announce Type: replace-cross
Abstract: We study the thermodynamic memory capacity of modern Hopfield networks (Dense Associative Memory models) with continuous states under geometric constraints, extending classical analyses of pairwise associative memory. We derive thermodynamic phase boundaries for Dense Associative Memory networks with exponential capacity $M = e^{\alpha N}$, comparing Gaussian (LSE) and Epanechnikov (LSR) kernels. For continuous neurons constrained to an $N$-sphere, the geometric entropy depends solely on the spherical geometry, not on the kernel. In the sharp-kernel regime, the maximum theoretical capacity $\alpha = 0.5$ is achieved at zero temperature; at loads below this threshold, a critical line separates the retrieval phase from the non-retrieval phase. The two kernels differ qualitatively in their phase-boundary structure: for LSE, a critical line exists at all loads $\alpha > 0$, whereas for LSR, the kernel's finite support introduces a threshold load $\alpha_{\text{th}}$ below which no spurious patterns contribute to the noise floor and no critical line exists -- retrieval is perfect at any temperature. These results advance the theory of high-capacity associative memory and clarify fundamental limits on retrieval robustness in modern attention-like memory architectures.
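The kernel contrast at the heart of the abstract can be sketched numerically. The following is a minimal toy, not the paper's implementation: one retrieval step of a continuous associative memory with patterns on the unit sphere, comparing a Gaussian (LSE) softmax update against an Epanechnikov-style kernel with finite support. The inverse temperature `beta` and bandwidth `h` are illustrative parameter choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 64, 32  # neurons, stored patterns

# Store M random patterns on the unit (N-1)-sphere
Xi = rng.standard_normal((M, N))
Xi /= np.linalg.norm(Xi, axis=1, keepdims=True)

def lse_update(x, Xi, beta=16.0):
    """One retrieval step with the Gaussian (LSE) kernel:
    a softmax-weighted average of all stored patterns."""
    s = beta * (Xi @ x)
    w = np.exp(s - s.max())  # stabilized softmax weights
    w /= w.sum()
    y = w @ Xi
    return y / np.linalg.norm(y)  # project back onto the sphere

def lsr_update(x, Xi, h=0.6):
    """One retrieval step with an Epanechnikov-style kernel of finite
    support: only patterns within squared distance h**2 of the cue
    contribute, so distant (spurious) patterns add exactly zero noise."""
    d2 = np.sum((Xi - x) ** 2, axis=1)
    w = np.maximum(0.0, 1.0 - d2 / h**2)  # truncated-quadratic weights
    if w.sum() == 0.0:
        return x  # cue lies outside every kernel's support: no update
    y = (w / w.sum()) @ Xi
    return y / np.linalg.norm(y)

# Noisy cue near stored pattern 0
cue = Xi[0] + 0.05 * rng.standard_normal(N)
cue /= np.linalg.norm(cue)
x_lse = lse_update(cue, Xi)
x_lsr = lsr_update(cue, Xi)
```

With the finite-support kernel, every pattern other than the target falls outside the support of the cue, so the update lands exactly on the stored pattern; the LSE update instead mixes in an exponentially small but nonzero contribution from all other patterns, which is the mechanism behind the qualitative difference in phase boundaries described above.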