
An evolutionary algorithm for designing feedforward neural networks

  • Conference paper
  • In: Evolutionary Programming VII (EP 1998)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1447)


Abstract

This paper presents a new approach to evolutionary artificial neural networks based on the integration of feedforward neural networks, messy genetic algorithms (GAs), and singular value decomposition (SVD). An evolving neural network is represented as a set of competing hidden nodes, each with a variable number of connections from the input layer. Hidden nodes are selected by estimating them via SVD: the resulting singular values determine the significance of each hidden node for the network's output. To represent the connectivity of hidden nodes and to process the topology of connections between the input and hidden layers, we employ the messy-GA approach, which establishes a framework for processing variable-length strings that encode this topology and allows a search for useful combinations of input variables. The proposed approach is tested on sonar data classification.
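The SVD-based node selection described above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the column-energy heuristic, the function name, and the `rel_tol` threshold are assumptions made for the example. The idea is that the singular values of the hidden-layer activation matrix indicate how many independent directions the hidden nodes actually contribute, and each node's share of that dominant-subspace energy can serve as a significance estimate.

```python
import numpy as np

def hidden_node_significance(H, rel_tol=1e-8):
    """Rank hidden nodes by an SVD-based significance estimate.

    H is the matrix of hidden-node activations (n_samples x n_hidden);
    column j holds the outputs of hidden node j over the training set.
    """
    # Decompose the activation matrix: H = U @ diag(s) @ Vt.
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Effective rank: singular values above a relative tolerance are
    # treated as significant directions in the hidden-layer output.
    r = int(np.sum(s > rel_tol * s[0]))
    # Attribute the energy of the dominant r directions to each hidden
    # node via the rows of Vt: node j's share is sum_k s_k^2 * Vt[k, j]^2.
    sig = (s[:r, None] ** 2 * Vt[:r] ** 2).sum(axis=0)
    return sig / sig.sum()  # normalized significance per hidden node

# Example: the third "node" is a linear combination of the first two,
# so only two singular values are numerically significant.
rng = np.random.default_rng(0)
a, b = rng.normal(size=(2, 50))
H = np.column_stack([a, b, a + b])
print(hidden_node_significance(H))
```

A node whose activations lie almost entirely outside the dominant singular directions receives a near-zero score and becomes a natural candidate for removal during evolution.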




Editor information

V. W. Porto, N. Saravanan, D. Waagen, A. E. Eiben


Copyright information

© 1998 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Skourikhine, A.N. (1998). An evolutionary algorithm for designing feedforward neural networks. In: Porto, V.W., Saravanan, N., Waagen, D., Eiben, A.E. (eds) Evolutionary Programming VII. EP 1998. Lecture Notes in Computer Science, vol 1447. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0040814


  • DOI: https://doi.org/10.1007/BFb0040814

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-64891-8

  • Online ISBN: 978-3-540-68515-9

