Abstract
We have observed that the local minima problem in the backpropagation algorithm is usually caused by update disharmony between the weights connected to the hidden layer and those connected to the output layer. To solve this problem, we propose a modified error function with added terms. By adding one term to the conventional error function, the modified error function harmonizes the updates of these two sets of weights and thus avoids the local minima caused by such disharmony. Moreover, the new learning parameters introduced by the added term are easy to select. Simulations on the modified XOR problem have been performed to test the validity of the modified error function.
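The exact form of the added term is not given in this preview, so the sketch below is illustrative only: it trains a small sigmoid network on XOR with the conventional squared-error function plus an *assumed* penalty, P = -λ Σ h(1-h), that discourages hidden-unit saturation so the hidden-layer weights keep receiving meaningful gradient while the output-layer weights are updated. The architecture (2-4-1), λ, the learning rate, and the epoch count are all assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

# 2-4-1 network (hypothetical sizes, chosen for the sketch)
W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)
lr, lam = 0.5, 0.01  # assumed learning rate and added-term weight
losses = []

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)   # hidden activations
    y = sigmoid(h @ W2 + b2)   # network output
    # conventional squared error plus the assumed added term
    loss = 0.5 * np.sum((y - t) ** 2) - lam * np.sum(h * (1 - h))
    losses.append(loss)

    d2 = (y - t) * y * (1 - y)                       # output delta
    # hidden delta: backpropagated error plus gradient of the added
    # term, dP/dh = lam * (2h - 1), times the sigmoid derivative
    d1 = (d2 @ W2.T + lam * (2 * h - 1)) * h * (1 - h)

    W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)
```

Because the penalty's gradient flows directly into the hidden-layer deltas, the hidden weights continue to move even when the backpropagated error through `W2` is small, which is the kind of update harmonization the abstract describes.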
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Bi, W., Wang, X., Zong, Z., Tang, Z. (2004). Modified Error Function with Added Terms for the Backpropagation Algorithm. In: Yin, FL., Wang, J., Guo, C. (eds) Advances in Neural Networks – ISNN 2004. ISNN 2004. Lecture Notes in Computer Science, vol 3173. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28647-9_57
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22841-7
Online ISBN: 978-3-540-28647-9