SEval-NAS: A Search-Agnostic Evaluation for Neural Architecture Search
arXiv:2603.00099v1 Announce Type: new Abstract: Neural architecture search (NAS) automates the discovery of neural networks that meet specified criteria, yet its evaluation procedures are often hardcoded, limiting the ability to introduce new metrics. This issue is especially pronounced in hardware-aware NAS, where objectives depend on target devices such as edge hardware. To address this limitation, we propose SEval-NAS, a metric-evaluation mechanism that converts architectures to strings, embeds them as vectors, and predicts performance metrics. Using NATS-Bench and HW-NAS-Bench, we evaluated accuracy, latency, and memory. Kendall's $\tau$ correlations showed stronger latency and memory predictions than accuracy, indicating the suitability of SEval-NAS as a hardware cost predictor. We further integrated SEval-NAS into FreeREA to evaluate metrics not originally included. The method successfully ranked FreeREA-generated architectures, maintained search time, and required minimal algorithmic changes. Our implementation is available at: https://github.com/Analytics-Everywhere-Lab/neural-architecture-search
Executive Summary
This paper proposes SEval-NAS, a metric-evaluation mechanism for neural architecture search (NAS). SEval-NAS converts architectures to strings, embeds them as vectors, and predicts performance metrics such as accuracy, latency, and memory. The authors validate SEval-NAS on NATS-Bench and HW-NAS-Bench, where its rank correlations for latency and memory suggest it is well suited as a hardware cost predictor. They further integrate SEval-NAS into the FreeREA search algorithm to evaluate metrics FreeREA does not natively support, ranking the generated architectures while maintaining search time. The implementation is available on GitHub. Overall, SEval-NAS offers a flexible evaluation mechanism for NAS, addressing the limitation of hardcoded evaluation procedures.
Key Points
- ▸ SEval-NAS proposes a metric-evaluation mechanism for NAS, enabling flexible and adaptable evaluation procedures.
- ▸ SEval-NAS encodes architectures as strings, embeds them as vectors, and predicts performance metrics from those embeddings.
- ▸ SEval-NAS predicts latency and memory more reliably than accuracy (by Kendall's $\tau$), making it well suited as a hardware cost predictor.
- ▸ SEval-NAS is integrated into the FreeREA search algorithm with minimal algorithmic changes.
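The string-to-vector pipeline above can be sketched in miniature. The snippet below is an illustrative approximation, not the authors' implementation: it embeds a NATS-Bench-style architecture string as a bag-of-operations count vector and predicts a metric by nearest-neighbour lookup against a toy table (the latency numbers are made up for illustration; the real system uses learned embeddings and a trained predictor).

```python
# Illustrative sketch of the SEval-NAS idea (names and numbers are
# hypothetical, not the paper's API): arch string -> vector -> metric.
import re

# Operation vocabulary of the NATS-Bench topology search space.
OPS = ["none", "skip_connect", "nor_conv_1x1", "nor_conv_3x3", "avg_pool_3x3"]

def embed(arch: str) -> list[int]:
    """Count each operation in a string like
    '|nor_conv_3x3~0|+|skip_connect~0|nor_conv_1x1~1|'."""
    tokens = re.findall(r"\|([^|~+]+)~", arch)
    return [tokens.count(op) for op in OPS]

def predict(arch: str, table: dict[str, float]) -> float:
    """Predict by copying the metric of the labeled architecture whose
    embedding is closest (L1 distance) to the query's embedding."""
    q = embed(arch)
    def dist(a: str) -> int:
        return sum(abs(x - y) for x, y in zip(q, embed(a)))
    return table[min(table, key=dist)]

# Toy latency table (fabricated values, for illustration only).
latency = {
    "|nor_conv_3x3~0|+|nor_conv_3x3~0|nor_conv_3x3~1|": 5.0,
    "|skip_connect~0|+|none~0|skip_connect~1|": 1.0,
}
print(predict("|nor_conv_3x3~0|+|nor_conv_3x3~0|nor_conv_1x1~1|", latency))  # 5.0
```

The key design point the sketch preserves is that any metric with labeled examples can be plugged in as a lookup table, which is what makes the evaluation search-agnostic.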
Merits
Flexibility and Adaptability
SEval-NAS offers a flexible evaluation framework that can adapt to new metrics and objectives, addressing the limitation of hardcoded evaluation procedures.
Strong Predictive Performance
SEval-NAS achieves higher Kendall's $\tau$ rank correlations for latency and memory than for accuracy, supporting its use as a hardware cost predictor.
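Kendall's $\tau$, the metric used to report predictive performance here, measures how well a predictor preserves the *ranking* of architectures rather than their absolute values. A minimal pairwise implementation (the example values are fabricated for illustration):

```python
# Kendall's tau between measured and predicted values: the fraction of
# concordant minus discordant pairs, over all n*(n-1)/2 pairs. O(n^2),
# fine for small n; use scipy.stats.kendalltau for real workloads.
def kendall_tau(xs, ys):
    n = len(xs)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

measured  = [1.2, 3.4, 2.1, 5.0]   # e.g. measured latencies (ms), made up
predicted = [1.0, 3.0, 2.5, 4.8]   # e.g. predictor outputs, made up
print(kendall_tau(measured, predicted))  # 1.0: rankings agree exactly
```

A $\tau$ of 1 means the predictor orders architectures exactly as the hardware does, which is all a search algorithm needs to pick winners; absolute prediction error matters less.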
Ease of Integration
SEval-NAS can be integrated into existing NAS frameworks with minimal algorithmic changes, making it a practical solution for researchers and practitioners.
Demerits
Limited Evaluation Metrics
The reported evaluation covers only accuracy, latency, and memory; the method's effectiveness on other performance metrics remains untested.
Dependence on String Embeddings
SEval-NAS relies on string embeddings and vector representation, which may not be effective for all types of neural architectures or evaluation metrics.
Expert Commentary
SEval-NAS is a useful contribution to neural architecture search, addressing the limitation of hardcoded evaluation procedures. Predicting performance metrics from string embeddings of architectures is a simple and portable idea, and the reported rank correlations support its use as a hardware cost predictor. However, further work is needed to evaluate SEval-NAS on a broader range of metrics, search spaces, and target devices, and string-based embeddings may not capture enough structural information for every architecture family or metric. Nevertheless, SEval-NAS offers a promising, low-friction way to extend existing search algorithms with new evaluation criteria, and the GitHub implementation is a valuable resource for the research community.
Recommendations
- ✓ Researchers and practitioners should explore the application of SEval-NAS in various domains, including computer vision, natural language processing, and speech recognition.
- ✓ Future research should focus on evaluating the effectiveness of SEval-NAS for a broader range of performance metrics and neural architectures, and addressing the limitations of string embeddings and vector representation.