cs.LG

Learning How Much to Think: Difficulty-Aware Dynamic MoEs for Graph Node Classification

arXiv:2604.11473v1 Announce Type: new
Abstract: Mixture-of-Experts (MoE) architectures offer a scalable path for Graph Neural Networks (GNNs) in node classification tasks but typically rely on static and rigid routing strategies that enforce a uniform…
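The abstract contrasts static routing, which gives every node the same expert budget, with difficulty-aware dynamic routing. The paper's actual mechanism is not shown in this excerpt, so the following is only an illustrative sketch under assumptions: difficulty is proxied by the entropy of the gate distribution (a common but here hypothetical choice), and each node is routed to between `k_min` and `k_max` experts in proportion to that difficulty.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_moe(h, W_gate, experts, k_min=1, k_max=4):
    """Route each node to a variable number of experts.

    Assumption (not from the paper): per-node difficulty is the
    normalized entropy of the gating distribution, and the number
    of active experts k scales linearly with that difficulty.
    """
    gates = softmax(h @ W_gate)                        # (N, E) gate probs
    entropy = -(gates * np.log(gates + 1e-9)).sum(-1)  # per-node entropy
    difficulty = entropy / np.log(gates.shape[1])      # normalized to [0, 1]
    k = np.clip(
        np.round(k_min + difficulty * (k_max - k_min)).astype(int),
        k_min, k_max,
    )
    out = np.zeros_like(h)
    for i in range(h.shape[0]):
        top = np.argsort(gates[i])[::-1][: k[i]]       # top-k(i) experts
        w = gates[i, top] / gates[i, top].sum()        # renormalize weights
        out[i] = sum(wj * experts[e](h[i]) for wj, e in zip(w, top))
    return out, k

# Toy usage: 8 nodes, 16-dim features, 4 linear experts.
N, D, E = 8, 16, 4
h = rng.normal(size=(N, D))
W_gate = rng.normal(size=(D, E))
experts = [(lambda We: (lambda x: x @ We))(rng.normal(size=(D, D)))
           for _ in range(E)]
out, k = dynamic_moe(h, W_gate, experts)
```

Here "easy" nodes (a confident, low-entropy gate) get one expert, while ambiguous nodes receive a wider mixture, so compute is spent where classification is hardest.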