Abstract
Huang et al. (2004) proposed an online sequential ELM (OS-ELM) that enables the extreme learning machine (ELM) to learn from data one-by-one as well as chunk-by-chunk. OS-ELM is based on a recursive least-squares (RLS)-type algorithm with a constant forgetting factor: the parameters of the hidden nodes are selected randomly, and the output weights are updated from the sequentially arriving data. With a constant forgetting factor, however, OS-ELM cannot provide satisfactory performance in time-varying or nonstationary environments. We therefore propose an OS-ELM algorithm with an adaptive forgetting factor that maintains good performance in such environments. The proposed algorithm has two advantages: (1) the adaptive forgetting factor adds only O(N) complexity per update, where N is the number of hidden neurons, and (2) its performance is comparable to that of the conventional OS-ELM tuned with an optimal constant forgetting factor.
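The sample-by-sample OS-ELM update described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the network size, forgetting factor value, and target signal are hypothetical, and a constant forgetting factor is used here (the paper's O(N) adaptive rule is its contribution and is not reproduced).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and forgetting factor (not from the paper)
N = 20          # number of hidden neurons
d = 3           # input dimension
lam = 0.98      # forgetting factor (constant here; the paper adapts it online)

# ELM hidden layer: input weights and biases are drawn randomly and then fixed
W = rng.standard_normal((N, d))
b = rng.standard_normal(N)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# RLS state: output weights beta and inverse-correlation matrix P
beta = np.zeros(N)
P = np.eye(N) * 1e3   # large initial value => weak prior on beta

def os_elm_step(x, t, beta, P, lam):
    """One one-by-one OS-ELM update with forgetting factor lam (RLS form)."""
    h = sigmoid(W @ x + b)                 # hidden-layer activations
    e = t - h @ beta                       # a priori prediction error
    g = P @ h / (lam + h @ P @ h)          # RLS gain vector
    beta = beta + g * e                    # output-weight update
    P = (P - np.outer(g, h @ P)) / lam     # inverse-correlation update
    return beta, P, e

# Track a slowly drifting target to mimic a nonstationary system
for k in range(500):
    x = rng.standard_normal(d)
    t = np.sin(x[0] + 0.01 * k)            # time-varying input-output mapping
    beta, P, e = os_elm_step(x, t, beta, P, lam)
```

A smaller lam discounts old samples faster and tracks drift better at the cost of noisier estimates; the paper's adaptive scheme tunes this trade-off online with only O(N) extra operations per sample.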
References
Huang G-B, Zhu Q-Y, Siew C-K (2004) Extreme learning machine: a new learning scheme of feedforward neural networks. In: Proceedings of the international joint conference on neural networks (IJCNN 2004), Budapest, 25–29 July 2004, pp 985–990
Park J, Sandberg IW (1991) Universal approximation using radial-basis-function networks. Neural Comput 3:246–257
Barron AR (1993) Universal approximation bounds for superpositions of a sigmoid function. IEEE Trans Inf Theory 39:930–945
Leshno M, Lin VY, Pinkus A, Schocken S (1993) Multilayer feedforward networks with a nonpolynomial activation function can approximate any function. Neural Netw 6:861–867
Huang G-B, Wang DH, Lan Y (2011) Extreme learning machines: a survey. Int J Mach Learn Cybern 2:107–122
Lowe D (1989) Adaptive radial basis function nonlinearities and the problem of generalisation. In: Proceedings of the first IEE international conference on artificial neural networks, London, pp 171–175
Igelnik B, Pao Y-H (1995) Stochastic choice of basis functions in adaptive function approximation and the functional-link net. IEEE Trans Neural Netw 6:1320–1329
Baum E (1988) On the capabilities of multilayer perceptrons. J Complex 4:193–215
Ferrari S, Stengel RF (2005) Smooth function approximation using neural networks. IEEE Trans Neural Netw 16:24–38
Huang G-B, Li M-B, Chen L, Siew C-K (2008) Incremental extreme learning machine with fully complex hidden nodes. Neurocomputing 71:576–583
ELM web portal. http://www.ntu.edu.sg/home/egbhuang
Lim J, Jeon J, Lee S (2006) Recursive complex extreme learning machine with widely linear processing for nonlinear channel equalizer. LNCS 3973:128–134
Liang N-Y, Huang G-B, Saratchandran P, Sundararajan N (2006) A fast and accurate online sequential learning algorithm for feedforward networks. IEEE Trans Neural Netw 17:1411–1423
Huang G-B, Zhu Q-Y, Mao KZ, Siew C-K, Saratchandran P, Sundararajan N (2006) Can threshold networks be trained directly? IEEE Trans Circuits Syst II Exp Briefs 53:187–191
Huang G-B, Zhu Q-Y, Siew C-K (2006) Extreme learning machine: theory and applications. Neurocomputing 70:489–501
Huang G-B, Chen L, Siew C-K (2006) Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw 17:879–892
Haykin S (2002) Adaptive filter theory, 4th edn. Prentice Hall, NJ
Song S, Sung KM (2007) Reduced complexity self-tuning adaptive algorithms in application to channel estimation. IEEE Trans Commun 55:1448–1452
Lee S, Lim J, Sung K-M (2009) A low-complexity AFF-RLS algorithm using a normalization technique. IEICE Electron Exp 6:1774–1780
Niedzwiecki M (2000) Identification of time-varying processes. Wiley, West Sussex
Paleologu C, Benesty J, Ciochina S (2008) A robust variable forgetting factor recursive least-squares algorithm for system identification. IEEE Signal Process Lett 15:597–600
Tuan P, Lee S, Hou W (1997) An efficient on-line thermal input estimation method using Kalman filter and recursive least square algorithm. Inverse Probl Eng 5:309–333
Kim H-S, Lim J-S, Baek S, Sung K-M (2001) Robust Kalman filtering with variable forgetting factor against impulsive noise. IEICE Trans Fundam E84-A:363–366
Yang B (1995) Projection approximation subspace tracking. IEEE Trans Signal Process 43:95–107
Lee K, Gan W, Kuo S (2009) Subband adaptive filtering theory and implementation. Wiley, West Sussex
Huang G-B, Chen L (2007) Convex incremental extreme learning machine. Neurocomputing 70:3056–3062
Han K, Lee S, Lim J, Sung K (2004) Channel estimation for OFDM with fast fading channels by modified Kalman filter. IEEE Trans Consumer Electron 50:443–449
Rappaport T (1996) Wireless communications principles and practice. Prentice Hall, NJ
Adali T, Liu X (1997) Canonical piecewise linear network for nonlinear filtering and its application to blind equalization. Signal Process 61:145–155
Holland PW, Welsch RE (1977) Robust regression using iteratively reweighted least-squares. Commun Stat Theory Methods A 6:813–827
Acknowledgments
This work was supported by a grant funded by the National Research Foundation of Korea (NRF).
Cite this article
Lim, J.-S., Lee, S. & Pang, H.-S. Low complexity adaptive forgetting factor for online sequential extreme learning machine (OS-ELM) for application to nonstationary system estimations. Neural Comput & Applic 22, 569–576 (2013). https://doi.org/10.1007/s00521-012-0873-x