A cascade network algorithm employing Progressive RPROP

  • Neural Nets Simulation, Emulation and Implementation
  • Conference paper
Biological and Artificial Computation: From Neuroscience to Technology (IWANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1240)

Abstract

Cascade Correlation (Cascor) has proved to be a powerful method for training neural networks. Cascor, however, has been shown not to generalise well on regression and some classification problems. A new cascade network algorithm employing Progressive RPROP (Casper) is proposed. Casper, like Cascor, is a constructive learning algorithm which builds cascade networks. Instead of using weight freezing and a correlation measure to install new neurons, however, Casper uses a variation of RPROP to train the whole network. Casper is shown to produce more compact networks, which generalise better than Cascor.
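
The abstract describes the procedure only at a high level: grow a cascade network one hidden unit at a time and, after each installation, retrain all of the network's weights with an RPROP-derived rule, rather than freezing earlier weights and maximising a correlation measure as Cascor does. The Python sketch below illustrates such a constructive loop under explicit assumptions: the names (CascadeNet, rprop_update, train_casper_like), the hyper-parameter values, the stopping criterion, and the central-difference gradient used in place of backpropagation are illustrative choices, not the paper's Progressive RPROP schedule.

```python
# A minimal sketch (not the authors' implementation) of a Casper-style
# constructive loop: hidden units are added one at a time in a cascade,
# and after each addition the whole network is retrained with RPROP
# (no weight freezing, no separate correlation phase).
import numpy as np


def rprop_update(params, grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=50.0):
    """One RPROP step: per-weight step sizes grow while the gradient keeps its
    sign and shrink when it flips (iRprop- style: flipped components are skipped)."""
    sign_change = grad * prev_grad
    step = np.where(sign_change > 0.0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0.0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0.0, 0.0, grad)
    return params - np.sign(grad) * step, grad, step


def numerical_grad(loss_fn, params, eps=1e-5):
    """Central-difference gradient; a stand-in for backpropagation in this sketch."""
    grad = np.zeros_like(params)
    for i in range(params.size):
        up, down = params.copy(), params.copy()
        up[i] += eps
        down[i] -= eps
        grad[i] = (loss_fn(up) - loss_fn(down)) / (2.0 * eps)
    return grad


class CascadeNet:
    """Cascade topology: hidden unit k is fed by the inputs, the bias and hidden
    units 0..k-1; a single linear output is fed by the inputs, bias and all hidden units."""

    def __init__(self, n_in):
        self.n_in = n_in
        self.n_hidden = 0
        self.params = np.zeros(n_in + 1)  # output weights + bias only, to start

    def _split(self, params):
        """Slice the flat parameter vector into hidden-unit weights and output weights."""
        hidden, i = [], 0
        for k in range(self.n_hidden):
            fan_in = self.n_in + k + 1  # inputs + earlier hidden units + bias
            hidden.append(params[i:i + fan_in])
            i += fan_in
        return hidden, params[i:]

    def forward(self, params, x):
        hidden_w, out_w = self._split(params)
        acts = np.concatenate([x, [1.0]])  # activations seen so far, bias kept last
        for w in hidden_w:
            h = np.tanh(np.dot(w, acts))
            acts = np.concatenate([acts[:-1], [h, 1.0]])
        return np.dot(out_w, acts)

    def add_hidden_unit(self, rng):
        """Install one new cascade unit; all existing weights stay trainable."""
        new_unit = rng.normal(scale=0.1, size=self.n_in + self.n_hidden + 1)
        hidden_w, out_w = self._split(self.params)
        self.n_hidden += 1
        # New output connection (0.0) is inserted just before the output bias weight.
        self.params = np.concatenate(hidden_w + [new_unit, out_w[:-1], [0.0], out_w[-1:]])


def train_casper_like(net, X, y, rng, max_hidden=6, epochs_per_unit=300, target_mse=1e-3):
    """Casper-style loop: train the whole network with RPROP, then add a unit and repeat."""
    def loss(params):
        preds = np.array([net.forward(params, x) for x in X])
        return 0.5 * np.mean((preds - y) ** 2)

    for _ in range(max_hidden + 1):  # train the initial net, then grow up to max_hidden units
        step = np.full_like(net.params, 0.1)   # initial RPROP step size (assumed value)
        prev_grad = np.zeros_like(net.params)
        for _ in range(epochs_per_unit):
            grad = numerical_grad(loss, net.params)
            net.params, prev_grad, step = rprop_update(net.params, grad, prev_grad, step)
        if loss(net.params) < target_mse:     # illustrative stopping criterion
            break
        net.add_hidden_unit(rng)
    return net


if __name__ == "__main__":
    # Toy usage: learn XOR (purely illustrative).
    rng = np.random.default_rng(0)
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    y = np.array([0.0, 1.0, 1.0, 0.0])
    net = train_casper_like(CascadeNet(n_in=2), X, y, rng)
    print("hidden units:", net.n_hidden)
    print("outputs:", [round(float(net.forward(net.params, x)), 2) for x in X])
```

The contrast with Cascor is visible in train_casper_like: every parameter, including those of previously installed units, remains in the RPROP update after each new unit is added.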


References

  • Adams, A., and Waugh, S. (1995) Function Evaluation and the Cascade-Correlation Architecture. In Proc. 1995 IEEE Int. Conf. Neural Networks, pp. 942–946.

  • Fahlman, S.E. (1988) Faster learning variations on backpropagation: An empirical study. In Proc. 1988 Connectionist Models Summer School. San Mateo, CA: Morgan Kaufmann.

  • Fahlman, S.E., and Lebiere, C. (1990) The cascade-correlation learning architecture. In Advances in Neural Information Processing Systems 2, Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, pp. 524–532.

  • Hwang, J., Lay, S., Maechler, R., and Martin, D. (1994) Regression Modeling in Back-Propagation and Projection Pursuit Learning. IEEE Trans. Neural Networks, vol. 5, no. 3, pp. 342–353.

  • Hwang, J., You, S., Lay, S., and Jou, I. (1996) The Cascade-Correlation Learning: A Projection Pursuit Learning Perspective. IEEE Trans. Neural Networks, vol. 7, no. 2, pp. 278–289.

  • Kwok, T., and Yeung, D. (1993) Experimental Analysis of Input Weight Freezing in Constructive Neural Networks. In Proc. 1993 IEEE Int. Conf. Neural Networks, pp. 511–516.

  • Riedmiller, M., and Braun, H. (1993) A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm. In: Ruspini, H. (Ed.) Proc. of the ICNN 93, San Francisco, pp. 586–591.

  • Riedmiller, M. (1994) Rprop — Description and Implementation Details, Technical Report, University of Karlsruhe.

  • Treadgold, N.K., and Gedeon, T.D. (1996) A Simulated Annealing Enhancement to Resilient Backpropagation. In Proc. Int. Panel Conf. Soft and Intelligent Computing, Budapest, pp. 293–298.

Editor information

José Mira, Roberto Moreno-Díaz, Joan Cabestany

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Treadgold, N.K., Gedeon, T.D. (1997). A cascade network algorithm employing Progressive RPROP. In: Mira, J., Moreno-Díaz, R., Cabestany, J. (eds) Biological and Artificial Computation: From Neuroscience to Technology. IWANN 1997. Lecture Notes in Computer Science, vol 1240. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0032532

  • DOI: https://doi.org/10.1007/BFb0032532

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63047-0

  • Online ISBN: 978-3-540-69074-0

  • eBook Packages: Springer Book Archive
