
Improvements to the Conventional Layer-by-Layer BP Algorithm

  • Conference paper
Advances in Intelligent Computing (ICIC 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3645)


Abstract

This paper points out several drawbacks of the conventional layer-by-layer BP algorithm and proposes modifications to address them. In particular, we present a new perspective on the learning rate: a heuristic rule is used to define the learning rate with which the weights are updated. In addition, to pull the algorithm out of the saturation area and prevent it from converging to a local minimum, a momentum term is introduced into the original algorithm. Finally, the effectiveness and efficiency of the proposed method are demonstrated on two benchmark examples.
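The momentum modification described above can be illustrated with a minimal sketch. Note that the paper's specific heuristic learning-rate rule is not reproduced here; the fixed `lr` and `mu` values below, and the toy quadratic loss, are illustrative assumptions only.

```python
import numpy as np

def momentum_step(w, grad, velocity, lr=0.1, mu=0.9):
    """One weight update with a momentum term added to gradient descent.

    The term mu * velocity carries over the previous update direction,
    which helps push the weights through flat (saturated) regions where
    the raw gradient is near zero, and can help escape shallow local minima.
    """
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

# Toy quadratic loss L(w) = 0.5 * ||w||^2, whose gradient is simply w.
w = np.array([2.0, -3.0])
v = np.zeros_like(w)
for _ in range(200):
    w, v = momentum_step(w, grad=w, velocity=v)
print(np.linalg.norm(w))  # norm shrinks toward zero as the iteration converges
```

In a full layer-by-layer BP implementation, the same update would be applied per layer, with `grad` computed by backpropagation and `lr` set by the heuristic rule rather than held constant.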

This work was supported by the National Natural Science Foundation of China (Nos.60472111 and 60405002), and RGC Project No.CUHK 4170/04E, RGC Project No. CUHK4205/04E and UGC Project No.AoE/E-01/99.






Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, XQ., Han, F., Lok, TM., Lyu, M.R., Huang, GB. (2005). Improvements to the Conventional Layer-by-Layer BP Algorithm. In: Huang, DS., Zhang, XP., Huang, GB. (eds) Advances in Intelligent Computing. ICIC 2005. Lecture Notes in Computer Science, vol 3645. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11538356_20


  • DOI: https://doi.org/10.1007/11538356_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28227-3

  • Online ISBN: 978-3-540-31907-8

  • eBook Packages: Computer Science (R0)
