Do Domain-specific Experts exist in MoE-based LLMs?
Giang Do, Hung Le, Truyen Tran
arXiv:2604.05267v1 Announce Type: new Abstract: In the era of Large Language Models (LLMs), the Mixture of Experts (MoE) architecture has emerged as an effective approach …