Self-Attention and Beyond the Infinite: Towards Linear Transformers with Infinite Self-Attention
arXiv:2603.00175v5 Announce Type: replace
Abstract: The quadratic cost of softmax attention limits Transformer scalability in high-resolution vision. We introduce Infinite Self-Attention (InfSA), a spectral reformulation that treats each attention lay…
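The truncated abstract does not spell out how InfSA is constructed, so the sketch below is not the paper's method. It only illustrates the quadratic-vs-linear contrast the abstract opens with, using generic kernelized linear attention (a feature-map trick in the style of prior linear-attention work). The function names `softmax_attention`, `linear_attention`, and the feature map `phi` are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard softmax attention: materializes an (n, n) score matrix,
    # so time and memory grow quadratically in sequence length n.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Generic kernelized linear attention (assumed feature map phi, kept
    # positive to avoid a zero normalizer). Computing phi(Q) @ (phi(K).T @ V)
    # right-to-left costs O(n d^2) time and O(d^2) memory: no (n, n)
    # attention matrix is ever formed.
    Qp, Kp = phi(Q), phi(K)
    kv = Kp.T @ V                    # (d, d) summary of keys and values
    z = Qp @ Kp.sum(axis=0)          # per-query normalizer, shape (n,)
    return (Qp @ kv) / z[:, None]

# Tiny usage check on random data: both variants map (n, d) -> (n, d).
rng = np.random.default_rng(0)
n, d = 1024, 64
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
print(softmax_attention(Q, K, V).shape, linear_attention(Q, K, V).shape)
```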