Benchmarking Distilled Language Models: Performance and Efficiency in Resource-Constrained Settings
arXiv:2602.20164v1 Announce Type: new Abstract: Knowledge distillation offers a transformative pathway to developing powerful yet efficient small language models (SLMs) suitable for resource-constrained environments. In …
Sachin Gopal Wani, Eric Page, Ajay Dholakia, David Ellison