Augmenting Novelty Search with a Surrogate Model to Engineer Meta-diversity in Ensembles of Classifiers

  • Conference paper
  • In: Applications of Evolutionary Computation (EvoApplications 2022)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13224)

Abstract

Neuroevolution combined with Novelty Search to promote behavioural diversity can construct high-performing ensembles for classification. However, using gradient descent to train each evolved architecture during the search can be computationally prohibitive. Here we propose a method to overcome this limitation: a surrogate model estimates the behavioural distance between two neural network architectures, which is required to calculate the sparseness term in Novelty Search. We demonstrate a tenfold speedup over previous work and significantly improve on previously reported results on three benchmark computer-vision datasets: CIFAR-10, CIFAR-100, and SVHN. This improvement results from the expanded architecture search space that the surrogate makes feasible. Our method represents an improved paradigm for horizontally scaling learning algorithms, making an explicit search for diversity considerably more tractable for the same bounded resources.
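In Novelty Search, the sparseness of a candidate x is conventionally the mean behavioural distance to its k nearest neighbours in the archive, rho(x) = (1/k) * sum_{i=1..k} dist(x, mu_i). The sketch below illustrates how a surrogate could stand in for that distance computation, which would otherwise require training both architectures by gradient descent before their behaviours can be compared. The SurrogateDistance class, its predict method, the flat genome encoding, and the random-forest regressor are illustrative assumptions for this sketch, not the authors' implementation.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    class SurrogateDistance:
        """Hypothetical regressor predicting the behavioural distance
        between two architectures directly from their genome encodings,
        sidestepping gradient-descent training. A random forest is used
        here purely as a placeholder regressor."""

        def __init__(self):
            self.model = RandomForestRegressor(n_estimators=100)

        def fit(self, genome_pairs, true_distances):
            # genome_pairs: (n, 2*d) array of concatenated genome pairs.
            # true_distances: behavioural distances measured on a small
            # sample of networks that were actually trained.
            self.model.fit(genome_pairs, true_distances)

        def predict(self, genome_a, genome_b):
            features = np.concatenate([genome_a, genome_b])[None, :]
            return float(self.model.predict(features)[0])

    def sparseness(candidate, archive, surrogate, k=15):
        """Standard Novelty Search sparseness: mean surrogate-estimated
        distance from `candidate` to its k nearest neighbours in `archive`."""
        if not archive:
            return float("inf")  # an empty archive makes any candidate novel
        dists = sorted(surrogate.predict(candidate, member) for member in archive)
        nearest = dists[:min(k, len(dists))]
        return sum(nearest) / len(nearest)

Because each pairwise distance becomes a single regressor call rather than two training runs, evaluating sparseness over a large archive is cheap, which is what permits the expanded architecture search space described in the abstract.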



Author information

Corresponding author

Correspondence to Rui P. Cardoso.

Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Cardoso, R.P., Hart, E., Kurka, D.B., Pitt, J. (2022). Augmenting Novelty Search with a Surrogate Model to Engineer Meta-diversity in Ensembles of Classifiers. In: Jiménez Laredo, J.L., Hidalgo, J.I., Babaagba, K.O. (eds) Applications of Evolutionary Computation. EvoApplications 2022. Lecture Notes in Computer Science, vol 13224. Springer, Cham. https://doi.org/10.1007/978-3-031-02462-7_27

  • DOI: https://doi.org/10.1007/978-3-031-02462-7_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-02461-0

  • Online ISBN: 978-3-031-02462-7

  • eBook Packages: Computer Science, Computer Science (R0)
