Abstract
In high-dimensional data fitting, inserting new training samples into, and removing outdated samples from, a feed-forward neural network (FFNN) is a difficult task. This paper therefore studies dynamic learning algorithms based on adaptive recursive (AR) regression and presents an advanced adaptive recursive (AAR) least square algorithm that handles the insertion of new samples and the removal of old samples efficiently. Applied to FFNN training, the AAR algorithm enables the network to carry out three processes simultaneously: dynamic learning of new samples, removal of outdated samples, and synchronized computation of the neural network (NN). It thus provides an efficient solution to the problem of dynamically training an FFNN. The resulting algorithm is applied to the computation of residual oil distribution.
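The full AAR algorithm is not shown in this preview, but the sample insertion and removal it describes belongs to the family of rank-one recursive least-squares updates and downdates (Sherman-Morrison). The sketch below illustrates that general technique only; the class name, the ridge parameter `lam`, and all method names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

class RecursiveLS:
    """Regularized linear least squares with O(d^2) sample insertion
    and removal via Sherman-Morrison rank-one updates. A minimal
    sketch of the update family the AAR algorithm builds on."""

    def __init__(self, dim, lam=1e-3):
        # P approximates (X^T X + lam*I)^{-1}; lam keeps P well defined
        # before any samples arrive.
        self.P = np.eye(dim) / lam
        self.w = np.zeros(dim)  # current least-squares weights

    def insert(self, x, y):
        """Fold a new sample (x, y) into the solution (update)."""
        Px = self.P @ x
        k = Px / (1.0 + x @ Px)          # gain vector
        self.w += k * (y - x @ self.w)   # correct weights by residual
        self.P -= np.outer(k, Px)        # rank-one update of inverse

    def remove(self, x, y):
        """Delete a previously inserted sample (x, y) (downdate)."""
        Px = self.P @ x
        k = Px / (1.0 - x @ Px)          # downdate gain
        self.w -= k * (y - x @ self.w)   # undo that sample's influence
        self.P += np.outer(k, Px)        # rank-one downdate of inverse
```

Both operations cost O(d^2) per sample instead of the O(d^3) of re-solving the normal equations from scratch, which is what makes simultaneous insertion, removal, and synchronized recomputation feasible.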
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Qing, Xh., Xu, Jy., Guo, Fh., Feng, Am., Nin, W., Tao, Hx. (2007). An Adaptive Recursive Least Square Algorithm for Feed Forward Neural Network and Its Application. In: Huang, DS., Heutte, L., Loog, M. (eds) Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence. ICIC 2007. Lecture Notes in Computer Science, vol 4682. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74205-0_35
DOI: https://doi.org/10.1007/978-3-540-74205-0_35
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-74201-2
Online ISBN: 978-3-540-74205-0