$\mathcal{X}$-KD: General Experiential Knowledge Distillation for Large Language Models
arXiv:2602.12674v1

Yuang Cai, Yuyu Yuan

Abstract: Knowledge Distillation (KD) for Large Language Models (LLMs) has become increasingly important as models grow in size and complexity. While …