
A progressive genetic-based neural architecture search

Yi-Cheng Chen (Information Management, National Central University, Taoyuan, Taiwan)

Industrial Management & Data Systems

ISSN: 0263-5577

Article publication date: 28 February 2022

Issue publication date: 15 March 2022


Abstract

Purpose

Recently, deep learning has received increasing attention owing to the widespread practicability of neural network computation. The purpose of this paper is to develop an effective algorithm that automatically discovers the optimal neural network architecture for several real applications.

Design/methodology/approach

The author proposes a novel algorithm, namely, progressive genetic-based neural architecture search (PG-NAS), as a solution to efficiently find the optimal neural network structure for given data. PG-NAS also employs several operations to effectively shrink the search space, reducing computation cost and improving validation accuracy.
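
As an illustration only, the following is a minimal sketch of a progressive genetic search over network structures, in the general spirit described above. It is not the author's PG-NAS implementation: the layer choices, depth limits, genetic operators and toy fitness function are all assumptions introduced for the example.

```python
import random

# Hypothetical search space: each architecture is a list of layer "genes".
# Layer choices and depth limits are illustrative, not taken from the paper.
LAYER_CHOICES = ["conv3x3", "conv5x5", "maxpool", "identity"]
MAX_DEPTH = 8


def random_architecture(depth):
    """Sample a random architecture of the given depth."""
    return [random.choice(LAYER_CHOICES) for _ in range(depth)]


def mutate(arch):
    """Replace one randomly chosen layer gene with another choice."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice(LAYER_CHOICES)
    return child


def crossover(a, b):
    """One-point crossover between two parent architectures."""
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]


def evaluate(arch):
    """Stand-in fitness; a real system would train and validate the network."""
    # Toy heuristic: prefer moderate depth and diverse layer types.
    return len(set(arch)) / len(LAYER_CHOICES) - abs(len(arch) - 5) * 0.05


def progressive_genetic_search(generations=10, population_size=16):
    # Start with shallow architectures and progressively allow deeper ones,
    # which keeps the candidate set compact in early generations.
    depth = 3
    population = [random_architecture(depth) for _ in range(population_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: population_size // 2]          # selection
        children = []
        while len(children) < population_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))      # variation
        population = parents + children
        depth = min(depth + 1, MAX_DEPTH)                 # progressive growth
        # Newly allowed depth is explored by extending a few top candidates.
        for arch in population[:4]:
            if len(arch) < depth:
                arch.append(random.choice(LAYER_CHOICES))
    return max(population, key=evaluate)


if __name__ == "__main__":
    best = progressive_genetic_search()
    print("Best architecture found:", best)
```

In a real NAS setting, the toy fitness function would be replaced by trained-and-validated accuracy, which is exactly the expensive step the paper's predictor network is designed to avoid.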

Findings

The proposed PG-NAS can be applied to several tasks to discover the optimal network structure. It reduces the need for manual configuration when implementing artificial intelligence (AI) models; hence, PG-NAS requires less human intervention than traditional machine learning. Average and top-1 metrics, such as error, loss and accuracy, are used to compare the neural architectures discovered by the proposed model against all baselines. The experimental results on several real datasets show that the proposed PG-NAS model consistently outperforms the state-of-the-art models in all metrics.

Originality/value

Generally, the size and complexity of the search space for the neural network dominate the computation time and resource consumption. In this study, PG-NAS utilizes genetic operations to generate a compact candidate set effectively, i.e. fewer combinations need to be generated when constructing the candidate set. Moreover, the proposed selector in PG-NAS significantly prunes non-promising network structures. In addition, deriving the accuracy of each combination in the candidate set is a performance bottleneck; the author therefore develops a predictor network to estimate accuracy efficiently and avoid this time-consuming derivation. The learning of the prediction process is also adjusted dynamically, so the adaptive predictor can capture the patterns of the training data effectively and efficiently. Furthermore, the proposed PG-NAS algorithm is applied to several real datasets to demonstrate its practicability and scalability.
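
To make the surrogate-prediction idea concrete, here is a minimal sketch of an accuracy predictor that estimates a candidate's validation accuracy from an encoding of its architecture and is refit as new measurements arrive. The one-hot encoding, ridge-regression surrogate and all names below are assumptions for illustration; the paper's predictor is itself a neural network with dynamically adjusted learning.

```python
import numpy as np

# Hypothetical one-hot encoding of layer genes; choices mirror the earlier sketch.
LAYER_CHOICES = ["conv3x3", "conv5x5", "maxpool", "identity"]
MAX_DEPTH = 8


def encode(arch):
    """Flatten an architecture into a fixed-length one-hot feature vector."""
    vec = np.zeros(MAX_DEPTH * len(LAYER_CHOICES))
    for i, layer in enumerate(arch[:MAX_DEPTH]):
        vec[i * len(LAYER_CHOICES) + LAYER_CHOICES.index(layer)] = 1.0
    return vec


class AccuracyPredictor:
    """Ridge-regression surrogate mapping encodings to estimated accuracy.

    The model is refit whenever a new (architecture, measured accuracy) pair
    arrives, a simple stand-in for the adaptive updating described above.
    """

    def __init__(self, reg=1e-2):
        self.reg = reg
        self.X, self.y = [], []
        self.w = None

    def update(self, arch, measured_accuracy):
        self.X.append(encode(arch))
        self.y.append(measured_accuracy)
        X = np.vstack(self.X)
        y = np.asarray(self.y)
        # Closed-form ridge solution: w = (X^T X + reg * I)^-1 X^T y
        d = X.shape[1]
        self.w = np.linalg.solve(X.T @ X + self.reg * np.eye(d), X.T @ y)

    def predict(self, arch):
        if self.w is None:
            return 0.0  # no evidence yet
        return float(encode(arch) @ self.w)


# Usage: only the top candidates ranked by the predictor would be trained in
# full, so the rest of the candidate set is pruned cheaply.
predictor = AccuracyPredictor()
predictor.update(["conv3x3", "maxpool", "conv5x5"], 0.82)
predictor.update(["conv5x5", "conv5x5", "identity"], 0.74)
print(predictor.predict(["conv3x3", "conv3x3", "maxpool"]))
```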

Acknowledgements

Author Yi-Cheng Chen was supported by the Ministry of Science and Technology, Project No. 108-2221-E-008-063-MY3.

Citation

Chen, Y.-C. (2022), "A progressive genetic-based neural architecture search", Industrial Management & Data Systems, Vol. 122 No. 3, pp. 645-665. https://doi.org/10.1108/IMDS-05-2021-0323

Publisher

Emerald Publishing Limited

Copyright © 2021, Emerald Publishing Limited
