Enhancing Mixture-of-Experts Specialization via Cluster-Aware Upcycling
arXiv:2604.13508v2 Announce Type: replace
Abstract: Sparse Upcycling provides an efficient way to initialize a Mixture-of-Experts (MoE) model from pretrained dense weights instead of training from scratch. However, since all experts start from identical…
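For context, the sketch below illustrates the standard sparse-upcycling initialization the abstract builds on: the pretrained dense FFN is replicated into every expert and a fresh router is added, so all experts begin as identical copies. This is a minimal, assumption-laden illustration in PyTorch, not the paper's cluster-aware method; the class names (`DenseFFN`, `UpcycledMoE`) and hyperparameters are hypothetical.

```python
# Minimal sketch of standard sparse upcycling: replicate a pretrained
# dense FFN into N experts and attach a randomly initialized router.
# Names and shapes are illustrative assumptions, not the paper's API.
import copy
import torch
import torch.nn as nn

class DenseFFN(nn.Module):
    """Stand-in for a pretrained dense feed-forward block."""
    def __init__(self, d_model=512, d_ff=2048):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))

    def forward(self, x):
        return self.net(x)

class UpcycledMoE(nn.Module):
    """Top-1 MoE whose experts all start as copies of one dense FFN."""
    def __init__(self, dense_ffn: DenseFFN, num_experts=8, d_model=512):
        super().__init__()
        # All experts begin from identical weights -- the limitation the
        # abstract points at: specialization must emerge during training.
        self.experts = nn.ModuleList(
            copy.deepcopy(dense_ffn) for _ in range(num_experts))
        self.router = nn.Linear(d_model, num_experts)  # fresh random init

    def forward(self, x):                       # x: (num_tokens, d_model)
        logits = self.router(x)                 # (num_tokens, num_experts)
        weight, idx = logits.softmax(-1).max(-1)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = idx == e                     # tokens routed to expert e
            if mask.any():
                out[mask] = weight[mask, None] * expert(x[mask])
        return out
```

Because every expert is a byte-identical clone at initialization, the router initially has no basis for differentiated routing; the paper's cluster-aware upcycling (per its title) targets exactly this symmetry, though the details lie beyond the truncated abstract.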