Abstract
Incremental neural networks have attracted growing interest in the neural computing field, especially for their potential to reduce training time. Among the emerging algorithms, cascade correlation has become widely used. The algorithm gives satisfactory results in many applications, but the reason why remains an open problem. In this paper, we prove a theorem which guarantees that the cascade correlation algorithm converges. Moreover, we prove that it achieves a convergence rate of at least O(1/n_h), where n_h is the number of hidden neurons, when approximating a function consisting of a finite sum of sigmoids. This guarantees that, in applications where the well-known backpropagation algorithm gives a good representation of the training data, cascade correlation is able to obtain very similar results while saving substantial computer time, as experienced in practice. Computer simulations show that the implemented cascade correlation algorithm attains this convergence rate.
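The setting the abstract describes can be illustrated with a minimal sketch: a target that is a finite sum of sigmoids, approximated incrementally by adding one hidden unit at a time, each chosen from a random candidate pool by how much it reduces the residual error (the greedy, stage-wise spirit of cascade correlation, though not Fahlman and Lebiere's exact training procedure). The target coefficients, candidate pool size, and grid below are illustrative assumptions, not values from the paper.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical target: a finite sum of sigmoids, the function class
# for which the paper proves the O(1/n_h) convergence rate.
def target(x):
    return (2.0 * sigmoid(4.0 * x - 1.0)
            - 1.5 * sigmoid(-3.0 * x + 2.0)
            + 0.5 * sigmoid(6.0 * x))

xs = [i / 50.0 for i in range(-100, 101)]   # sample grid on [-2, 2]
ys = [target(x) for x in xs]

def mse(pred):
    return sum((p - y) ** 2 for p, y in zip(pred, ys)) / len(ys)

random.seed(0)
pred = [0.0] * len(xs)           # output of the growing network
errors = [mse(pred)]             # training error after each added unit

for _ in range(8):               # add hidden units one at a time
    residual = [y - p for y, p in zip(ys, pred)]
    best = None
    for _ in range(200):         # candidate pool, as in cascade correlation
        w = random.uniform(-8.0, 8.0)
        b = random.uniform(-8.0, 8.0)
        h = [sigmoid(w * x + b) for x in xs]
        # least-squares output weight for this candidate on the residual
        hh = sum(v * v for v in h)
        hr = sum(v * r for v, r in zip(h, residual))
        a = hr / hh if hh > 0 else 0.0
        gain = a * hr            # reduction in total squared error (>= 0)
        if best is None or gain > best[0]:
            best = (gain, a, h)
    _, a, h = best
    pred = [p + a * v for p, v in zip(pred, h)]
    errors.append(mse(pred))
```

Because each added unit's output weight is the least-squares optimum on the current residual, the training error is non-increasing in the number of hidden units, which is the qualitative behaviour the convergence theorem formalizes.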
Cite this article
Drago, G.P., Ridella, S. Convergence properties of cascade correlation in function approximation. Neural Comput & Applic 2, 142–147 (1994). https://doi.org/10.1007/BF01415010