Routing-Free Mixture-of-Experts
arXiv:2604.00801v1 Announce Type: cross
Abstract: Standard Mixture-of-Experts (MoE) models rely on centralized routing mechanisms that introduce rigid inductive biases. We propose Routing-Free MoE, which eliminates any hard-coded centralized designs in…
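For context on the baseline the abstract contrasts against, a minimal sketch of the standard centralized top-k routing in an MoE layer (all names, shapes, and the use of NumPy are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def topk_moe(x, W_gate, experts, k=2):
    """Centralized top-k MoE routing: one gating network scores every
    expert and dispatches each token to its k highest-scoring experts,
    mixing their outputs with renormalized gate weights."""
    logits = x @ W_gate                        # (tokens, num_experts)
    top = np.argsort(logits, axis=-1)[:, -k:]  # indices of top-k experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        idx = top[t]
        w = np.exp(logits[t, idx])
        w /= w.sum()                           # softmax over the selected experts
        for j, g in zip(idx, w):
            out[t] += g * experts[j](x[t])     # weighted sum of expert outputs
    return out

# Toy usage: 5 tokens of dimension 4, 3 linear experts.
rng = np.random.default_rng(0)
d, n_exp = 4, 3
experts = [lambda v, M=rng.standard_normal((d, d)): v @ M for _ in range(n_exp)]
x = rng.standard_normal((5, d))
W_gate = rng.standard_normal((d, n_exp))
y = topk_moe(x, W_gate, experts, k=2)
print(y.shape)  # (5, 4)
```

The single gating matrix `W_gate` is the hard-coded centralized design the abstract argues against: every token's expert assignment flows through one shared scorer.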