Constructive Backpropagation for Recurrent Networks

Published in: Neural Processing Letters

Abstract

Choosing a network size is a difficult problem in neural network modelling. Many recent studies have addressed this problem with constructive or destructive methods that add or delete connections, neurons or layers. In this work we consider the constructive approach, which is in many cases computationally very efficient. In particular, we address the construction of recurrent networks using constructive backpropagation. The proposed scheme has two main benefits. First, fully recurrent networks with an arbitrary number of layers can be constructed efficiently. Second, after the network has been constructed, the adaptation of both its weights and its structure can continue; this includes both addition and deletion of neurons or layers in a computationally efficient manner. The investigated method is thus very flexible compared with many previous methods. In addition, our time series prediction experiments show that the proposed method is competitive with the well-known recurrent cascade-correlation method in terms of both modelling performance and training time.
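The constructive principle summarised above, growing the network one unit at a time and fitting only the newly added unit against the current residual error while previously trained weights stay fixed, can be illustrated with a toy sketch. The snippet below is not the paper's algorithm: it substitutes a closed-form least-squares fit of each new unit's output weight for the gradient-based training the paper uses, and the single-unit tanh recurrence and the sine-prediction task are illustrative assumptions. It only shows why adding units in this frozen-residual fashion can never increase training error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-step-ahead prediction task: predict x[t+1] from x[t].
t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t)
inputs, targets = x[:-1], x[1:]

def recurrent_feature(inputs, w_in, w_rec, b):
    """Run a single tanh recurrent unit over the input sequence."""
    h = np.zeros(len(inputs))
    prev = 0.0
    for i, u in enumerate(inputs):
        prev = np.tanh(w_in * u + w_rec * prev + b)
        h[i] = prev
    return h

residual = targets.copy()
mse_history = [np.mean(residual ** 2)]

# Constructive loop: add one recurrent unit at a time. Units added
# earlier are frozen; only the new unit's output weight is fitted to
# the current residual (here by 1-D least squares for simplicity,
# in place of the backpropagation training used in the paper).
for _ in range(5):
    h = recurrent_feature(inputs, *rng.normal(scale=1.0, size=3))
    out_w = np.dot(h, residual) / np.dot(h, h)  # least-squares output weight
    residual = residual - out_w * h              # remaining error to explain
    mse_history.append(np.mean(residual ** 2))
```

Because each new unit's contribution is an orthogonal projection of the residual, the training MSE in `mse_history` is non-increasing as units are added; deciding when to stop adding (and when to delete) is exactly the structure-adaptation question the paper addresses.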




Cite this article

Lehtokangas, M. Constructive Backpropagation for Recurrent Networks. Neural Processing Letters 9, 271–278 (1999). https://doi.org/10.1023/A:1018620424763
