Self-Routing: Parameter-Free Expert Routing from Hidden States
arXiv:2604.00421v1 Announce Type: new

Abstract: Mixture-of-Experts (MoE) layers increase model capacity by activating only a small subset of experts per token, and typically rely on …
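For context on the background sentence above, here is a minimal sketch of a conventional learned top-k MoE router, the kind of mechanism MoE layers "typically rely on". This is illustrative only, not the paper's parameter-free method (the abstract is truncated before describing it); all names, shapes, and the choice of k are assumptions.

```python
import numpy as np

def top_k_route(hidden, router_weights, k=2):
    """Conventional learned gating: score experts with a linear layer,
    keep the top-k per token, and normalize their weights with softmax."""
    logits = hidden @ router_weights                 # (tokens, num_experts)
    topk = np.argsort(logits, axis=-1)[:, -k:]       # indices of k best experts
    sel = np.take_along_axis(logits, topk, axis=-1)  # their logits
    w = np.exp(sel - sel.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)               # softmax over selected only
    return topk, w

# Illustrative shapes: 4 tokens, hidden size 8, 16 experts.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 16))
experts, weights = top_k_route(hidden, W, k=2)
```

Each token is dispatched to its k selected experts and the outputs are combined with the normalized weights; the learned matrix `W` is exactly the routing parameter overhead that a parameter-free scheme would remove.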
Jama Hussein Mohamud, Drew Wagner, Mirco Ravanelli