
Convergence properties of cascade correlation in function approximation

Neural Computing & Applications

Abstract

Incremental neural networks have received growing interest in the neural computing field, particularly for their ability to reduce training time. Among the emerging algorithms of this kind, cascade correlation has become widely used. The algorithm gives satisfactory results in many applications, but the reason why remains an open problem. In this paper, we prove a theorem guaranteeing that the cascade correlation algorithm converges. Moreover, we prove that it converges at a rate of at least O(1/n_h), where n_h is the number of hidden neurons, when approximating a function that is a finite series of sigmoids. This guarantees that, in applications where the well-known backpropagation algorithm gives a good representation of the training data, cascade correlation can obtain very similar results while saving a great deal of computer time, as is experienced in practice. Computer simulations show that the implemented cascade correlation algorithm achieves this convergence speed.
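The greedy, one-unit-at-a-time construction analysed in the paper can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: candidate hidden units are trained here by a simple random search for the unit best correlated with the current residual (cascade correlation proper trains a candidate pool by gradient ascent on that correlation), and output weights are refit by least squares after each addition. The target is a finite sum of sigmoids, matching the setting of the convergence theorem; all function and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Target: a finite series of sigmoids (the setting of the theorem).
x = np.linspace(-3.0, 3.0, 200)
target = 1.5 * sigmoid(4.0 * x - 1.0) - 0.8 * sigmoid(-2.0 * x + 0.5)

def best_candidate(x, residual, n_candidates=50):
    """From a random pool, pick the hidden unit whose output correlates
    best (in absolute value) with the current residual error. This stands
    in for cascade correlation's candidate-training step."""
    best_params, best_corr = None, -1.0
    for _ in range(n_candidates):
        w, b = rng.normal(0.0, 4.0), rng.normal(0.0, 2.0)
        h = sigmoid(w * x + b)
        c = abs(np.corrcoef(h, residual)[0, 1])
        if c > best_corr:
            best_corr, best_params = c, (w, b)
    return best_params

def train(x, target, n_hidden):
    """Add hidden units one at a time; after each addition, refit all
    output weights (plus a bias) by least squares. Returns the
    mean-squared error after each unit is installed."""
    errors = []
    features = [np.ones_like(x)]           # bias column
    output = np.zeros_like(x)
    for _ in range(n_hidden):
        residual = target - output
        w, b = best_candidate(x, residual)
        features.append(sigmoid(w * x + b))
        A = np.stack(features, axis=1)
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        output = A @ coef
        errors.append(float(np.mean((target - output) ** 2)))
    return errors

errors = train(x, target, n_hidden=8)
print(errors)
```

Because each step refits the output weights over a strictly larger set of basis functions, the training MSE is non-increasing in the number of hidden units, which is the qualitative behaviour the O(1/n_h) bound quantifies.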




Cite this article

Drago, G.P., Ridella, S. Convergence properties of cascade correlation in function approximation. Neural Comput & Applic 2, 142–147 (1994). https://doi.org/10.1007/BF01415010
