Abstract
The Elman neural network has been widely applied in fields ranging from a temporal version of the Exclusive-OR function to the discovery of syntactic categories in natural language data. However, this type of network often suffers from the local minima problem during learning. To solve this problem, we propose an error function that harmonizes the updates of the weights connected to the hidden layer with those connected to the output layer by adding one term to the conventional error function, thereby avoiding the local minima caused by the disharmony between these two updates. We applied the method to Boolean Series Prediction Questions to demonstrate its validity. The results show that the proposed method avoids the local minima problem, greatly accelerates convergence, and achieves good results on the prediction tasks.
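The Elman architecture and the style of modified error function described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the extra `lam`-weighted term in `modified_error` is a hypothetical placeholder (the abstract does not give the exact form of the added term), and the XOR-of-previous-bits target is a stand-in for the Boolean Series Prediction Questions.

```python
import numpy as np

# Minimal Elman network sketch. The harmonizing term in modified_error
# is illustrative only -- the paper's exact added term is not specified
# in the abstract.

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 1, 8, 1
W_xh = rng.normal(0, 0.5, (n_hid, n_in))   # input  -> hidden
W_ch = rng.normal(0, 0.5, (n_hid, n_hid))  # context -> hidden (recurrent copy)
W_hy = rng.normal(0, 0.5, (n_out, n_hid))  # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(xs):
    """Run a sequence; the context layer holds the previous hidden state."""
    context = np.zeros(n_hid)
    ys, hs = [], []
    for x in xs:
        h = sigmoid(W_xh @ x + W_ch @ context)  # hidden update
        y = sigmoid(W_hy @ h)                   # output
        context = h                             # copy-back: Elman context units
        hs.append(h)
        ys.append(y)
    return np.array(ys), np.array(hs)

def modified_error(ys, ts, hs, lam=0.1):
    """Conventional sum-of-squares error plus one extra term.
    The extra term here (a hidden-saturation penalty) is a hypothetical
    example of coupling hidden- and output-layer updates."""
    sse = 0.5 * np.sum((ys - ts) ** 2)
    extra = lam * np.sum(hs * (1.0 - hs))  # placeholder harmonizing term
    return sse + extra

# Toy Boolean series: predict XOR of the current and previous input bits.
bits = rng.integers(0, 2, 20)
xs = [np.array([float(b)]) for b in bits]
ts = np.array([[float(bits[i] ^ bits[i - 1])] if i > 0 else [0.0]
               for i in range(len(bits))])

ys, hs = forward(xs)
print(modified_error(ys, ts, hs))
```

Training would then backpropagate through this combined error instead of the plain sum-of-squares term, so the gradient of the added term enters the hidden-layer weight updates.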
Copyright information
© 2007 Springer Berlin Heidelberg
Cite this paper
Zhang, Z., Tang, Z., Tang, G., Catherine, V., Wang, X., Xiong, R. (2007). An Improved Algorithm for Eleman Neural Network by Adding a Modified Error Function. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4492. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72393-6_56
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-72392-9
Online ISBN: 978-3-540-72393-6
eBook Packages: Computer Science