cs.LG, q-bio.QM

Dual Triangle Attention: Effective Bidirectional Attention Without Positional Embeddings

arXiv:2604.18603v1 Announce Type: cross
Abstract: Bidirectional transformers are the foundation of many sequence modeling tasks across natural, biological, and chemical language domains, but they are permutation-invariant without explicit positional embeddings…
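The permutation-invariance claim can be illustrated with a minimal NumPy sketch. This shows only plain single-head self-attention (not the paper's dual triangle mechanism): with no positional signal, attention is permutation-equivariant, so permuting the input tokens simply permutes the outputs, and any pooled representation is permutation-invariant. All names here (`self_attention`, the weight matrices) are illustrative, not from the paper.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    """Single-head bidirectional self-attention with no positional signal."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(5, d))                # 5 tokens, no positional embeddings
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))

out = self_attention(x, wq, wk, wv)
perm = rng.permutation(5)
out_perm = self_attention(x[perm], wq, wk, wv)

# Permuting the tokens permutes the outputs identically (equivariance),
# so order information is invisible without positional embeddings.
print(np.allclose(out[perm], out_perm))    # True
```

This is the behavior the paper's dual triangle attention is designed to break without resorting to explicit positional embeddings.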