
An Adaptive Recursive Least Square Algorithm for Feed Forward Neural Network and Its Application

  • Conference paper
Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence (ICIC 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4682)

Included in the following conference series: International Conference on Intelligent Computing (ICIC)

Abstract

In high-dimensional data fitting, inserting new training samples and removing outdated ones is a difficult task for a feed forward neural network (FFNN). This paper therefore studies dynamic learning algorithms based on adaptive recursive regression (AR) and presents an advanced adaptive recursive (AAR) least square algorithm that handles the insertion of new samples and the removal of old samples efficiently. The AAR algorithm is applied to train the FFNN, enabling it to carry out three processes simultaneously: dynamic learning of new samples, removal of outdated samples, and synchronized computation of the neural network (NN). This efficiently solves the problem of dynamically training an FFNN. The resulting FFNN algorithm is then used to compute residual oil distribution.
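The abstract does not give the AAR update equations, but the general idea it describes can be illustrated. Below is a minimal sketch, not the authors' AAR algorithm: a sliding-window recursive least-squares estimator in Python (as might be used for the output-layer weights of an FFNN), where inserting a sample is a Sherman-Morrison rank-one update of the inverse Gram matrix and removing an old sample is the corresponding rank-one downdate. The class name, the fixed-window removal policy, and all parameters are illustrative assumptions.

# Minimal sketch (illustrative, not the paper's AAR algorithm): sliding-window
# recursive least squares with sample insertion and removal via rank-one
# Sherman-Morrison updates/downdates of the inverse Gram matrix.
import numpy as np

class SlidingWindowRLS:
    def __init__(self, n_features, delta=1e3):
        # P approximates (X^T X)^{-1}; delta * I is the usual RLS initialization.
        self.P = delta * np.eye(n_features)
        self.w = np.zeros(n_features)

    def insert(self, x, y):
        # Add one sample (x, y): rank-one update.
        Px = self.P @ x
        k = Px / (1.0 + x @ Px)          # gain vector
        self.w = self.w + k * (y - x @ self.w)
        self.P = self.P - np.outer(k, Px)

    def remove(self, x, y):
        # Discard an old sample (x, y): rank-one downdate (note the sign flip).
        Px = self.P @ x
        k = Px / (1.0 - x @ Px)
        self.w = self.w - k * (y - x @ self.w)
        self.P = self.P + np.outer(k, Px)

# Usage: stream samples through a fixed-length window of the most recent data.
rng = np.random.default_rng(0)
rls, window = SlidingWindowRLS(n_features=3), []
true_w = np.array([1.0, -2.0, 0.5])
for _ in range(200):
    x = rng.normal(size=3)
    y = x @ true_w + 0.01 * rng.normal()
    rls.insert(x, y)
    window.append((x, y))
    if len(window) > 50:
        rls.remove(*window.pop(0))       # drop the oldest sample
print(rls.w)                             # approaches [1.0, -2.0, 0.5]

Both operations cost O(n^2) per sample instead of the O(n^3) of refitting from scratch, which is what makes simultaneous insertion of new samples and removal of outdated ones practical in a dynamic training loop.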




Editor information

De-Shuang Huang, Laurent Heutte, Marco Loog


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Qing, Xh., Xu, Jy., Guo, Fh., Feng, Am., Nin, W., Tao, Hx. (2007). An Adaptive Recursive Least Square Algorithm for Feed Forward Neural Network and Its Application. In: Huang, DS., Heutte, L., Loog, M. (eds) Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence. ICIC 2007. Lecture Notes in Computer Science (LNAI), vol 4682. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74205-0_35

Download citation

  • DOI: https://doi.org/10.1007/978-3-540-74205-0_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74201-2

  • Online ISBN: 978-3-540-74205-0

  • eBook Packages: Computer Science, Computer Science (R0)
