
Quantization-Aware Neural Architecture Search with Hyperparameter Optimization for Industrial Predictive Maintenance Applications


Abstract:

Optimizing the efficiency of neural networks is crucial for ubiquitous machine learning on the edge. However, it requires specialized expertise to account for the wide variety of applications, edge devices, and deployment scenarios. An attractive approach to mitigate this bottleneck is Neural Architecture Search (NAS), as it allows for optimizing networks for both efficiency and task performance. This work shows that including hyperparameter optimization for training-related parameters alongside NAS enables substantial improvements in efficiency and task performance on a predictive maintenance task. Furthermore, this work extends the combination of NAS and hyperparameter optimization with INT8 quantization to enhance efficiency further. Our combined approach, which we refer to as Quantization-Aware NAS (QA-NAS), allows for further improvements in efficiency on the predictive maintenance task. Consequently, our work shows that QA-NAS is a promising research direction for optimizing neural networks for deployment on resource-constrained edge devices in industrial applications.
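The INT8 quantization the abstract refers to maps floating-point weights onto 8-bit integers, shrinking model size and enabling faster integer arithmetic on edge hardware. As a minimal sketch of the idea (not the paper's actual method; the symmetric per-tensor scheme and function names below are illustrative assumptions), weights can be fake-quantized to INT8 and dequantized back to measure the quantization error:

```python
import numpy as np

def quantize_int8(w):
    # Symmetric per-tensor INT8 quantization: the scale maps
    # the largest |w| onto 127, values are rounded into the
    # signed 8-bit range and stored as int8.
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights from the int8 tensor.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# Rounding error per weight is bounded by scale / 2.
err = np.max(np.abs(w - w_hat))
```

In a quantization-aware search, an error (or accuracy) measurement like this would feed back into the NAS objective, so that candidate architectures are scored under the same INT8 constraints they will face at deployment.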
Date of Conference: 17-19 April 2023
Date Added to IEEE Xplore: 02 June 2023
Print on Demand (PoD) ISBN: 979-8-3503-9624-9

Conference Location: Antwerp, Belgium