Research article
DOI: 10.1145/3659154.3659160

Composing adaptive score functions using genetic programming for neural architecture search

Published: 26 December 2024

Abstract

In neural architecture search (NAS), most of the computational budget is spent evaluating candidate architectures. This study presents a novel predictor whose core idea is to use genetic programming (GP) to automatically compose adaptive score functions that predict the performance of candidate architectures. By training symbolic regressors on test accuracies and various architecture metrics, the predictor learns to distinguish promising architectures from less favorable ones. The method generalizes across different datasets and increases the rank correlation between an architecture's score and its accuracy. To evaluate the proposed algorithm, we compare it with other state-of-the-art training-free NAS score functions. The simulation results demonstrate that the proposed algorithm finds better architectures than the training-free NAS score functions compared in this study.
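The loop the abstract describes can be summarized as: compute cheap, training-free metrics for each candidate architecture, evolve a symbolic expression over those metrics with GP, and judge the evolved expression by how well its output ranks architectures by true accuracy. Below is a minimal sketch of that loop using the gplearn library and synthetic data; the metric names, the synthetic accuracies, and all hyperparameter values are illustrative assumptions, not the authors' actual setup.

```python
# Minimal sketch of GP-composed score functions for NAS, via gplearn's
# symbolic regressor. All inputs are synthetic stand-ins: in the paper's
# setting, X would hold training-free metrics of candidate architectures
# (e.g., from NAS-Bench-101/201) and y their measured test accuracies.
import numpy as np
from scipy.stats import kendalltau
from gplearn.genetic import SymbolicRegressor  # pip install gplearn

rng = np.random.default_rng(0)

# Hypothetical training-free metrics for 200 candidates; the three columns
# might be, say, parameter count, a synflow-style score, and a NASWOT-style
# score (placeholder features, not the authors' actual metric set).
X = rng.random((200, 3))
# Synthetic "accuracies" correlated with the metrics, standing in for the
# benchmark ground truth used to train the symbolic regressor.
y = 0.6 * X[:, 1] + 0.3 * np.log1p(X[:, 2]) + 0.05 * rng.random(200)

# Evolve a closed-form score function over the metrics.
gp = SymbolicRegressor(
    population_size=500,
    generations=20,
    function_set=("add", "sub", "mul", "div", "log"),
    parsimony_coefficient=0.01,  # penalize bloated expressions
    random_state=0,
)
gp.fit(X, y)

# The paper's evaluation criterion: rank correlation between the evolved
# score and accuracy; higher tau means the score orders candidates well.
tau, _ = kendalltau(gp.predict(X), y)
print("evolved score function:", gp._program)
print(f"Kendall tau (score vs. accuracy): {tau:.3f}")
```

Note that this sketch computes the rank correlation on the training architectures for brevity; the generalization claim in the abstract only makes sense when the evolved score is evaluated on architectures and datasets held out from training.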

Published In

ICEA '23: Proceedings of the 2023 International Conference on Intelligent Computing and Its Emerging Applications
December 2023
175 pages
ISBN: 9798400709050
DOI: 10.1145/3659154

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Neural architecture search
  2. Genetic programming
  3. Training-free NAS

Qualifiers

  • Research-article

Funding Sources

  • National Science and Technology Council (NSTC) of Taiwan

Conference

ICEA 2023

