
Modified Error Function with Added Terms for the Backpropagation Algorithm

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3173)

Abstract

We observe that the local minima problem in the backpropagation algorithm is usually caused by update disharmony between the weights connected to the hidden layer and those connected to the output layer. To solve this problem, we propose a modified error function with added terms. By adding one term to the conventional error function, the modified error function harmonizes the updates of the two sets of weights and thereby avoids the local minima caused by such disharmony. Moreover, the new learning parameters introduced for the added term are easy to select. Simulations on the modified XOR problem confirm the validity of the modified error function.
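The exact form of the added term appears only in the full paper, which this page does not reproduce. The sketch below is therefore purely illustrative: it trains a small sigmoid network on plain XOR with the conventional squared error plus a hypothetical penalty term (weight lam, chosen arbitrarily) that keeps hidden units out of their saturated, vanishing-gradient regions, which is one plausible way to "harmonize" the hidden-layer and output-layer weight updates.

import numpy as np

# Illustrative sketch only: the paper's actual added term is not shown on
# this page. Here the error function is assumed to be
#     E = 0.5 * sum((y - t)^2) + lam * sum((h - 0.5)^2),
# where the second (assumed) term pushes hidden activations h toward their
# high-gradient region so hidden-layer weights keep receiving usable updates.
rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # XOR inputs
T = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_hidden, lr, lam = 4, 0.5, 0.02   # hyperparameters are illustrative
W1 = rng.normal(0.0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

for epoch in range(20000):
    H = sigmoid(X @ W1 + b1)   # hidden activations
    Y = sigmoid(H @ W2 + b2)   # network outputs

    # Gradient of the conventional squared-error term, backpropagated as usual.
    dY = (Y - T) * Y * (1.0 - Y)
    dH = (dY @ W2.T) * H * (1.0 - H)
    # Gradient of the (assumed) added term, applied directly at the hidden
    # layer: d/dnet [ lam * (h - 0.5)^2 ] = 2 * lam * (h - 0.5) * h * (1 - h).
    dH += 2.0 * lam * (H - 0.5) * H * (1.0 - H)

    W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)

print(np.round(Y, 3))   # should approach [[0], [1], [1], [0]]

With lam = 0 this reduces to ordinary backpropagation; the added term changes only how the hidden layer is driven during training, not the targets, which matches the abstract's claim that the extra parameters are easy to select.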





Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bi, W., Wang, X., Zong, Z., Tang, Z. (2004). Modified Error Function with Added Terms for the Backpropagation Algorithm. In: Yin, FL., Wang, J., Guo, C. (eds) Advances in Neural Networks – ISNN 2004. ISNN 2004. Lecture Notes in Computer Science, vol 3173. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28647-9_57


  • DOI: https://doi.org/10.1007/978-3-540-28647-9_57

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22841-7

  • Online ISBN: 978-3-540-28647-9

