
An Improved Algorithm for Eleman Neural Network by Adding a Modified Error Function

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 4492))

Abstract

The Elman neural network has been widely used in fields ranging from the temporal version of the Exclusive-OR function to the discovery of syntactic categories in natural language data. However, a problem often associated with this type of network is the local minima problem, which typically arises during learning. To address it, we propose an error function that harmonizes the updates of the weights connected to the hidden layer with those connected to the output layer by adding one term to the conventional error function, thereby avoiding the local minima caused by this disharmony. We applied the method to Boolean Series Prediction Questions to demonstrate its validity. The results show that the proposed method avoids the local minima problem, substantially accelerates convergence, and yields good predictions.
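The abstract describes augmenting the conventional (sum-of-squares) error of an Elman network with an extra term that harmonizes the hidden-layer and output-layer weight updates. The paper's exact term is not reproduced on this page, so the sketch below is only illustrative: it trains a minimal Elman network on the temporal XOR task mentioned in the abstract, using classic one-step (truncated) backpropagation, and adds a *hypothetical* stand-in penalty that pulls the mean squared magnitudes of the hidden-side and output-side weights toward each other. The penalty form, `lam`, and all hyperparameters are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Temporal XOR data: the target at step t is XOR of the inputs at t-1 and t.
T = 200
x = rng.integers(0, 2, size=T).astype(float)
y = np.zeros(T)
y[1:] = np.logical_xor(x[:-1].astype(bool), x[1:].astype(bool)).astype(float)

n_in, n_hid, n_out = 1, 8, 1
W_ih = rng.normal(0.0, 0.5, (n_hid, n_in))   # input   -> hidden
W_ch = rng.normal(0.0, 0.5, (n_hid, n_hid))  # context -> hidden
W_ho = rng.normal(0.0, 0.5, (n_out, n_hid))  # hidden  -> output
lam, lr = 1e-3, 0.5                          # penalty weight, learning rate

def run_epoch():
    """One pass over the sequence with Elman-style one-step backprop."""
    global W_ih, W_ch, W_ho
    h_prev = np.zeros(n_hid)
    gW_ih = np.zeros_like(W_ih)
    gW_ch = np.zeros_like(W_ch)
    gW_ho = np.zeros_like(W_ho)
    sq_err = 0.0
    for t in range(T):
        h = sigmoid(W_ih @ x[t:t + 1] + W_ch @ h_prev)
        o = sigmoid(W_ho @ h)[0]
        e = o - y[t]
        sq_err += e * e
        d_o = e * o * (1.0 - o)                # output delta
        d_h = (W_ho[0] * d_o) * h * (1.0 - h)  # hidden delta (truncated)
        gW_ho += d_o * h[None, :]
        gW_ih += np.outer(d_h, x[t:t + 1])
        gW_ch += np.outer(d_h, h_prev)
        h_prev = h                             # context units copy the hidden state
    # Hypothetical extra error term (NOT the paper's): penalize the gap
    # between the mean squared magnitudes of hidden-side and output-side
    # weights, to illustrate augmenting the conventional error function.
    s_h = np.mean(W_ih ** 2) + np.mean(W_ch ** 2)
    s_o = np.mean(W_ho ** 2)
    diff = s_o - s_h
    gW_ho += lam * diff * 2.0 * W_ho / W_ho.size
    gW_ih -= lam * diff * 2.0 * W_ih / W_ih.size
    gW_ch -= lam * diff * 2.0 * W_ch / W_ch.size
    W_ho -= lr * gW_ho / T
    W_ih -= lr * gW_ih / T
    W_ch -= lr * gW_ch / T
    return 0.5 * sq_err / T + 0.5 * lam * diff ** 2

losses = [run_epoch() for _ in range(300)]
print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The context weights `W_ch` carry the previous hidden state back as an extra input, which is what lets a single-step-trained network pick up the one-step temporal dependency in this task.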





Editor information

Derong Liu, Shumin Fei, Zengguang Hou, Huaguang Zhang, Changyin Sun


Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Zhang, Z., Tang, Z., Tang, G., Catherine, V., Wang, X., Xiong, R. (2007). An Improved Algorithm for Eleman Neural Network by Adding a Modified Error Function. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4492. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72393-6_56


  • DOI: https://doi.org/10.1007/978-3-540-72393-6_56

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72392-9

  • Online ISBN: 978-3-540-72393-6

  • eBook Packages: Computer Science, Computer Science (R0)
