
The Local True Weight Decay Recursive Least Square Algorithm

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4984)

Abstract

The true weight decay recursive least square (TWDRLS) algorithm is a fast online training algorithm for feedforward neural networks, but its computational and space complexities are high. This paper first presents a more compact set of TWDRLS equations, and then proposes a local version of TWDRLS that reduces both complexities. Simulations demonstrate the effectiveness of the local version, and our analysis shows that the computational and space complexities of the local TWDRLS are much smaller than those of the global TWDRLS.
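Since the abstract only names the algorithm family, a minimal illustration of the kind of recursion involved may help. The sketch below is ordinary recursive least squares with an L2 (weight decay) prior folded into the initial inverse covariance, applied to a linear-in-parameters model. It is an assumption-laden stand-in, not the paper's TWDRLS equations: TWDRLS keeps the decay term exact at every update rather than only at initialisation, and for a feedforward network the input vector h would be the gradient of the network output with respect to the weights. The class name RidgeRLS and the parameter lam are invented for this illustration.

```python
import numpy as np

class RidgeRLS:
    """Online least squares with an L2 (weight decay) prior.

    Minimal sketch only: lambda is folded into the initial inverse
    covariance P_0 = (1/lam) * I, which only approximates true weight
    decay as more samples arrive.  The paper's TWDRLS keeps the decay
    exact at every step, and its local variant maintains a small
    matrix per neuron instead of one global matrix.
    """

    def __init__(self, dim: int, lam: float = 1e-2):
        self.w = np.zeros(dim)        # weight estimate
        self.P = np.eye(dim) / lam    # inverse of (H^T H + lam*I), initially (lam*I)^-1

    def update(self, h: np.ndarray, d: float) -> float:
        """One online step for input vector h and scalar target d."""
        e = d - h @ self.w                             # a priori prediction error
        Ph = self.P @ h
        self.P -= np.outer(Ph, Ph) / (1.0 + h @ Ph)    # Sherman-Morrison rank-1 update
        self.w += (self.P @ h) * e                     # Kalman-style gain times error
        return e

# Tiny usage example: recover a noisy linear map online.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])
model = RidgeRLS(dim=3, lam=1e-2)
for _ in range(500):
    h = rng.normal(size=3)
    d = h @ true_w + 0.01 * rng.normal()
    model.update(h, d)
print(model.w)   # close to true_w
```

Each such update costs O(W^2) time and storage in the number of weights W, which is exactly the cost the local TWDRLS attacks: decomposing the network into per-neuron subproblems replaces one W-by-W matrix with many small per-neuron matrices.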

Author information

Authors: Leung, C.S., Wong, K.W., and Xu, Y.

Editor information

Masumi Ishikawa, Kenji Doya, Hiroyuki Miyamoto, Takeshi Yamakawa

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Leung, C.S., Wong, K.W., Xu, Y. (2008). The Local True Weight Decay Recursive Least Square Algorithm. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds) Neural Information Processing. ICONIP 2007. Lecture Notes in Computer Science, vol 4984. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69158-7_48

  • DOI: https://doi.org/10.1007/978-3-540-69158-7_48

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69154-9

  • Online ISBN: 978-3-540-69158-7

  • eBook Packages: Computer Science, Computer Science (R0)
