Addressing the Local Minima Problem by Output Monitoring and Modification Algorithms

  • Conference paper
Advances in Neural Networks – ISNN 2012 (ISNN 2012)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7367)

Included in the following conference series: International Symposium on Neural Networks (ISNN)

Abstract

This paper proposes a new approach, output monitoring and modification (OMM), to address the local minimum problem of existing gradient-descent algorithms (such as BP, Rprop and Quickprop) for training feed-forward neural networks. OMM monitors the learning process; when the process becomes trapped in a local minimum, OMM changes some incorrect output values so that learning can escape from it. This modification can be repeated with different parameter settings until the learning process converges to the global optimum. Simulation experiments show that gradient-descent learning algorithms equipped with OMM have a much better global convergence capability than those without it, while their convergence rates remain similar. In one benchmark application, the global convergence rate rose from 1% to 100%.
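The abstract gives only the outline of OMM's monitor-and-modify loop, so a minimal sketch may help. The Python below wraps plain batch backpropagation on XOR; the stall detector (stall_window, stall_tol), the target-flipping rule (flip_fraction), and the length of a modification phase (mod_period) are illustrative assumptions, not the paper's actual rules.

```python
# Hedged sketch of Output Monitoring and Modification (OMM) around plain
# backpropagation. The monitoring and modification rules below are
# illustrative guesses, not the paper's published procedure.
import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic benchmark on which plain BP is known to stall.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_net(hidden=2):
    return {"W1": rng.normal(0.0, 0.5, (2, hidden)), "b1": np.zeros(hidden),
            "W2": rng.normal(0.0, 0.5, (hidden, 1)), "b2": np.zeros(1)}

def forward(net, X):
    h = sigmoid(X @ net["W1"] + net["b1"])
    return h, sigmoid(h @ net["W2"] + net["b2"])

def bp_step(net, X, targets, lr=0.5):
    """One batch gradient-descent step on squared error."""
    h, y = forward(net, X)
    dy = (y - targets) * y * (1 - y)        # output-layer delta
    dh = (dy @ net["W2"].T) * h * (1 - h)   # hidden-layer delta
    net["W2"] -= lr * h.T @ dy
    net["b2"] -= lr * dy.sum(axis=0)
    net["W1"] -= lr * X.T @ dh
    net["b1"] -= lr * dh.sum(axis=0)
    return 0.5 * np.sum((y - targets) ** 2)

def omm_train(net, X, T, epochs=20000, stall_window=500,
              stall_tol=1e-6, flip_fraction=0.5, mod_period=100):
    targets = T.copy()    # working targets; temporarily modified by OMM
    history = []          # recent error values for the stall monitor
    restore_at = -1       # epoch at which to restore the true targets
    for epoch in range(epochs):
        err = bp_step(net, X, targets)
        history.append(err)
        _, y = forward(net, X)
        # Success is always judged against the *true* targets.
        if np.all((y > 0.5) == (T > 0.5)):
            return epoch, True
        if epoch == restore_at:             # end of a modification phase
            targets = T.copy()
            history.clear()
        # Monitoring: no measurable progress over the last window -> stalled.
        if (restore_at < epoch and len(history) > stall_window
                and history[-stall_window] - err < stall_tol):
            # Modification: flip the targets of the worst-fitted samples to
            # reshape the error surface and push the search off the minimum.
            worst = np.argsort(-np.abs(y - T).ravel())
            k = max(1, int(flip_fraction * len(X)))
            targets = T.copy()
            targets[worst[:k]] = 1.0 - targets[worst[:k]]
            restore_at = epoch + mod_period
            history.clear()
    return epochs, False

net = init_net()
epoch, ok = omm_train(net, X, T)
print(f"converged={ok} after {epoch + 1} epochs")
```

Re-running omm_train with other flip_fraction or mod_period values would correspond to the abstract's "repeated with different parameter settings" step.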



References

  1. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning internal representations by error propagation. In: Parallel Distributed Processing: Exploration in the Microstructure of Cognition, vol. 1, MIT Press, Cambridge (1986)

  2. Van Ooyen, A., Nienhuis, B.: Improving the convergence of the back-propagation algorithm. Neural Networks 5, 465–471 (1992)

  3. Vitela, J.E., Reifman, J.: Premature Saturation in Backpropagation Networks: Mechanism and Necessary Conditions. Neural Networks 10(4), 721–735 (1997)

  4. Blum, E.K., Li, L.K.: Approximation theory and feedforward networks. Neural Networks 4, 511–515 (1991)

  5. Gori, M., Tesi, A.: On the problem of local minima in back-propagation. IEEE Trans. On Pattern Analysis and Machine Intelligence 14(1), 76–86 (1992)

  6. Fahlman, S.E.: Fast learning variations on back-propagation: An empirical study. In: Touretzky, D., Hinton, G., Sejnowski, T. (eds.) Proc. the 1988 Connectionist Models Summer School, Pittsburgh, pp. 38–51 (1989)

  7. Riedmiller, M., Braun, H.: A direct adaptive method for faster back-propagation learning: The RPROP Algorithm. In: Proc. of Int. Conf. on Neural Networks, vol. 1, pp. 586–591 (1993)

  8. Yuceturk, A.C., Herdağdelen, A., Uyanik, K.: A solution to the problem of local minima in backpropagation algorithm. In: Proc. of the Fourteenth International Symposium on Computer and Information Sciences, Kusadasi, Turkey, pp. 1081–1083 (1999)

  9. Wang, X.G., Tang, Z., Tamura, H., Ishii, M., Sun, W.D.: An improved backpropagation algorithm to avoid the local minima problem. Neurocomputing 56, 455–460 (2004)

  10. Ng, S.C., Cheung, C.C., Leung, S.H.: Magnified Gradient Function with Deterministic Weight Evolution in Adaptive Learning. IEEE Trans. on Neural Networks 15(6), 1411–1423 (2004)

  11. Frank, A., Asuncion, A.: UCI Machine Learning Repository. University of California, School of Information and Computer Science, Irvine, CA (2012), http://archive.ics.uci.edu/ml/

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ng, S.C., Cheung, C.C., Lui, A.K.F., Tse, H.T. (2012). Addressing the Local Minima Problem by Output Monitoring and Modification Algorithms. In: Wang, J., Yen, G.G., Polycarpou, M.M. (eds.) Advances in Neural Networks – ISNN 2012. ISNN 2012. Lecture Notes in Computer Science, vol. 7367. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31346-2_24

  • DOI: https://doi.org/10.1007/978-3-642-31346-2_24

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31345-5

  • Online ISBN: 978-3-642-31346-2

  • eBook Packages: Computer Science, Computer Science (R0)
