Abstract
Differential Evolution for Neural Networks (DENN) is an optimizer for neural network weights based on Differential Evolution. Although DENN has shown good performance on medium-sized networks, the number of weights is a clear limitation of the approach. The aim of this work is to determine whether coevolutionary strategies implemented on top of DENN can help during the optimization phase. Moreover, we study two classical problems connected to the application of evolutionary computation, namely stagnation and the lack of population diversity, and the use of a crowding strategy to address them. The system has been tested on classical benchmark classification problems, and the experimental results are presented and discussed.
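To make the idea in the abstract concrete, the following Python sketch applies differential evolution (DE/rand/1/bin) to the flattened weight vector of a small feed-forward classifier and uses crowding replacement, where the trial vector competes with its nearest population member, to preserve diversity. This is only an illustration under assumed settings (toy dataset, network size, population size, F, CR); it is not the authors' DENN implementation and it omits the cooperative coevolutionary decomposition studied in the paper.

# Minimal sketch of crowding-based Differential Evolution on the weights of a
# small feed-forward classifier. Illustrative only; all names, sizes and
# hyperparameters below are assumptions, not the authors' DENN code.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class dataset (two Gaussian blobs).
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

IN, HID, OUT = 2, 8, 2
DIM = IN * HID + HID + HID * OUT + OUT  # total number of weights

def unpack(w):
    """Split a flat weight vector into layer matrices and biases."""
    i = 0
    W1 = w[i:i + IN * HID].reshape(IN, HID); i += IN * HID
    b1 = w[i:i + HID]; i += HID
    W2 = w[i:i + HID * OUT].reshape(HID, OUT); i += HID * OUT
    b2 = w[i:i + OUT]
    return W1, b1, W2, b2

def loss(w):
    """Cross-entropy of the network encoded by w on the toy dataset."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits); p /= p.sum(axis=1, keepdims=True)
    return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

NP, F, CR, GENS = 30, 0.5, 0.9, 200  # assumed DE settings
pop = rng.normal(0, 0.5, (NP, DIM))
fit = np.array([loss(ind) for ind in pop])

for g in range(GENS):
    for i in range(NP):
        # DE/rand/1 mutation and binomial crossover.
        a, b, c = rng.choice([k for k in range(NP) if k != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        mask = rng.random(DIM) < CR
        mask[rng.integers(DIM)] = True
        trial = np.where(mask, mutant, pop[i])
        # Crowding replacement: the trial competes with its nearest neighbour,
        # which helps maintain population diversity and limit stagnation.
        j = int(np.argmin(np.linalg.norm(pop - trial, axis=1)))
        tf = loss(trial)
        if tf < fit[j]:
            pop[j], fit[j] = trial, tf

print("best cross-entropy:", fit.min())

In a cooperative coevolutionary variant of the kind analysed in the paper, the weight vector would instead be split into subcomponents (for example per layer or per neuron), each evolved by its own DE population and evaluated together with representatives of the other subpopulations.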
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Baioletti, M., Di Bari, G., Poggioni, V. (2020). An Analysis of Cooperative Coevolutionary Differential Evolution as Neural Networks Optimizer. In: Cicirelli, F., Guerrieri, A., Pizzuti, C., Socievole, A., Spezzano, G., Vinci, A. (eds) Artificial Life and Evolutionary Computation. WIVACE 2019. Communications in Computer and Information Science, vol 1200. Springer, Cham. https://doi.org/10.1007/978-3-030-45016-8_10
DOI: https://doi.org/10.1007/978-3-030-45016-8_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-45015-1
Online ISBN: 978-3-030-45016-8
eBook Packages: Computer Science, Computer Science (R0)