EvoMLP: A Framework for Evolving Multilayer Perceptrons

  • Conference paper
  • In: Advances in Computational Intelligence (IWANN 2021)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12862)

Abstract

Designing neural networks for classification or regression can be considered a search problem and, as such, can be approached with different optimization procedures, all of which face several design challenges. The first and most important is to constrain the search space so that proper solutions can be found in a reasonable amount of time; the second is to take into account that, depending on how the optimization procedure is formulated, the fitness score it uses can carry a certain degree of uncertainty. Creating a framework for evolving neural networks for classification therefore implies taking a series of decisions that range from the purely technical to the algorithmic, at the level of both the neural networks themselves and the optimization framework chosen. This is the focus of this paper, where we introduce DeepGProp, a framework for the genetic optimization of multilayer perceptrons that efficiently explores the space of neural networks with different numbers of layers and layer sizes.
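As a rough illustration of the kind of search the abstract describes, the sketch below evolves only the hidden-layer architecture of an MLP with a toy genetic algorithm. It is not the DeepGProp implementation: the genome encoding, the truncation-selection loop, the toy dataset, and the use of scikit-learn's MLPClassifier as the trained network are all simplifying assumptions made here for illustration.

```python
# A minimal, illustrative sketch: a genome is a list of hidden-layer
# sizes, and fitness is the validation accuracy of an MLP trained with
# that architecture. NOT the DeepGProp implementation; scikit-learn's
# MLPClassifier stands in for the trained network.
import random

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = random.Random(42)
X, y = make_classification(n_samples=400, n_features=20, random_state=42)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3,
                                            random_state=42)

def fitness(genome):
    """Validation accuracy of an MLP with the genome's hidden layers.

    Training is stochastic, so this score is noisy -- exactly the
    fitness uncertainty the abstract points out.
    """
    clf = MLPClassifier(hidden_layer_sizes=tuple(genome), max_iter=300,
                        random_state=rng.randrange(10**6))
    clf.fit(X_tr, y_tr)
    return clf.score(X_val, y_val)

def mutate(genome):
    """Resize, add, or drop a hidden layer."""
    g = list(genome)
    op = rng.choice(["resize", "add", "drop"])
    if op == "resize":
        i = rng.randrange(len(g))
        g[i] = max(2, g[i] + rng.choice([-4, 4]))
    elif op == "add":
        g.insert(rng.randrange(len(g) + 1), rng.randint(2, 32))
    elif len(g) > 1:  # drop a layer, but always keep at least one
        g.pop(rng.randrange(len(g)))
    return g

# Truncation selection over a small population of architectures.
population = [[rng.randint(2, 32)] for _ in range(8)]
for generation in range(5):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:4]
    population = parents + [mutate(rng.choice(parents)) for _ in range(4)]

print("best hidden layers:", max(population, key=fitness))
```

Note that every fitness evaluation retrains a network, so evaluation dominates the cost of the whole loop; this is why constraining the search space, as the abstract argues, matters so much in this setting.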


Acknowledgments

Supported in part by projects DeepBio (TIN2017-85727-C4-2-P) and PID2020-115570GB-C22.

Author information

Correspondence to J. J. Merelo.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Liñán-Villafranca, L., García-Valdez, M., Merelo, J.J., Castillo-Valdivieso, P. (2021). EvoMLP: A Framework for Evolving Multilayer Perceptrons. In: Rojas, I., Joya, G., Català, A. (eds) Advances in Computational Intelligence. IWANN 2021. Lecture Notes in Computer Science, vol 12862. Springer, Cham. https://doi.org/10.1007/978-3-030-85099-9_27

  • DOI: https://doi.org/10.1007/978-3-030-85099-9_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85098-2

  • Online ISBN: 978-3-030-85099-9

  • eBook Packages: Computer Science; Computer Science (R0)
