Comparing Evolutionary Artificial Neural Networks from Second and Third Generations for Solving Supervised Classification Problems

  • Chapter
  • In: Intuitionistic and Type-2 Fuzzy Logic Enhancements in Neural and Optimization Algorithms: Theory and Applications

Abstract

As nature-inspired computational systems, Artificial Neural Networks (ANNs) are commonly classified into generations according to the features and capabilities of their neuron models. As generations develop, newer ANN models exhibit more plausible properties than their predecessors, whether through a closer resemblance to biological neurons or through enhanced problem-solving capabilities. Evolutionary Artificial Neural Networks (EANNs) constitute a design paradigm in which Evolutionary Algorithms (EAs) determine inherent aspects of the networks, such as topology or parameterization, while partially or entirely removing the need for expert knowledge. In this paper, the performance of evolutionarily designed ANNs from the second and third generations is compared. An EA-based technique known as Grammatical Evolution (GE) is used to automatically design ANNs for solving supervised classification problems. The evolutionary process of GE determines partially-connected three-layered feedforward topologies and synaptic connections for both types of ANNs considered, so that an explicit training stage is not required. The proposed framework was tested on several well-known benchmark datasets and provided relevant and consistent results; the accuracies exhibited by third-generation ANNs matched or exceeded those of second-generation ANNs. Furthermore, the produced networks achieved a considerable reduction in the number of synapses compared with equivalent fully-connected topologies, as well as a lower usage of features from the input vector.
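To illustrate the kind of genotype-to-phenotype mapping that Grammatical Evolution performs, the sketch below derives a small partially-connected feedforward topology from a linear genotype of integer codons, where each codon selects a grammar production modulo the number of available options. This is a minimal illustration under assumed names and a toy grammar (GRAMMAR, ge_map, the codon range, the layer sizes and the discrete weight set are all assumptions); it is not the grammar or the network encoding used in the chapter.

```python
# Minimal sketch of a grammatical-evolution (GE) genotype-to-phenotype mapping
# that derives a partially-connected, three-layered feedforward topology.
# Illustrative only: grammar, codon scheme and sizes are assumptions,
# not the encoding used in the chapter.
import random

# Toy BNF-style grammar: a network is one or more synapses; each synapse links
# an input neuron to a hidden neuron, or a hidden neuron to an output neuron,
# with a weight drawn from a small discrete set.
GRAMMAR = {
    "<network>": [["<synapse>"], ["<synapse>", "<network>"]],
    "<synapse>": [["in", "<idx>", "hid", "<idx>", "<weight>"],
                  ["hid", "<idx>", "out", "<idx>", "<weight>"]],
    "<idx>":     [["0"], ["1"], ["2"]],
    "<weight>":  [["-1.0"], ["-0.5"], ["0.5"], ["1.0"]],
}

def ge_map(genotype, start="<network>", max_wraps=2):
    """Map a list of integer codons onto a derivation of the grammar.
    Each codon selects a production rule via (codon % number_of_options)."""
    symbols = [start]          # leftmost-first expansion stack
    output, i, wraps = [], 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym not in GRAMMAR:             # terminal symbol: emit it
            output.append(sym)
            continue
        if i >= len(genotype):             # wrap the genotype if it runs out
            if wraps >= max_wraps:
                return None                # invalid individual
            i, wraps = 0, wraps + 1
        options = GRAMMAR[sym]
        choice = options[genotype[i] % len(options)]
        i += 1
        symbols = list(choice) + symbols   # expand the leftmost nonterminal
    return output

genotype = [random.randint(0, 255) for _ in range(20)]
phenotype = ge_map(genotype)
if phenotype is None:
    print("invalid individual (codons exhausted)")
else:
    # group the flat terminal stream into 5-token synapse descriptions
    print([phenotype[k:k + 5] for k in range(0, len(phenotype), 5)])
```

In the setting described by the chapter, each mapped network would then be scored by a fitness function (for instance, classification accuracy on a training set) that drives the evolutionary search; that part of the loop is omitted from this sketch.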



Acknowledgements

The authors wish to thank the National Technology of Mexico and the University of Guanajuato. G. López-Vázquez and A. Rojas-Domínguez thank the National Council of Science and Technology of Mexico (CONACYT) for the support provided through the Scholarship for Postgraduate Studies (701071) and the Research Grant (CÁTEDRAS-2598), respectively. This work was supported by the CONACYT Project FC2016-1961 “Neurociencia Computacional: de la teoría al desarrollo de sistemas neuromórficos”.

Author information


Corresponding author

Correspondence to Manuel Ornelas-Rodríguez.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

López-Vázquez, G. et al. (2020). Comparing Evolutionary Artificial Neural Networks from Second and Third Generations for Solving Supervised Classification Problems. In: Castillo, O., Melin, P., Kacprzyk, J. (eds) Intuitionistic and Type-2 Fuzzy Logic Enhancements in Neural and Optimization Algorithms: Theory and Applications. Studies in Computational Intelligence, vol 862. Springer, Cham. https://doi.org/10.1007/978-3-030-35445-9_42
