Introducing evolving Takagi–Sugeno method based on local least squares support vector machine models

  • Original Paper
  • Published in: Evolving Systems

Abstract

In this study, an efficient local online identification method based on an evolving Takagi–Sugeno least squares support vector machine (eTS-LS-SVM) is introduced for nonlinear time series prediction. As an innovation, nonlinear models, i.e. local LS-SVM models, are used as the consequent parts of the fuzzy rules, instead of the linear models used in conventional evolving TS fuzzy models. At each step, the proposed learning approach comprises two phases. First, the fuzzy rules (rule premises) are created and updated adaptively through a sequential clustering technique to obtain the structure of the TS model. Then, the parameters of each local LS-SVM model (rule consequent) are recursively updated by a newly derived recursive algorithm (a local decremental and incremental procedure) that minimizes the local modelling error and tracks the process dynamics. In addition, a learning algorithm based on a recursive gradient method adaptively updates the meta-parameters of the LS-SVM models. Comparison of the suggested method with several previous approaches on the online prediction of nonlinear time series shows that the introduced identification algorithm performs well in terms of learning and generalization abilities while exhibiting lower redundancy.
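The model structure described in the abstract can be sketched in a few lines: rule premises given by normalized Gaussian memberships, and a kernel LS-SVM regressor as each rule's consequent. The following is a minimal batch illustration only, not the paper's evolving algorithm (no sequential clustering, pruning, or recursive updates), and all class and parameter names are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian kernel matrix between row-vector sample sets A (n,d) and B (m,d)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

class LocalLSSVM:
    """One rule consequent: an LS-SVM regressor fitted on the rule's data."""
    def __init__(self, gamma=10.0, sigma=1.0):
        self.gamma, self.sigma = gamma, sigma

    def fit(self, X, y):
        n = len(X)
        K = rbf_kernel(X, X, self.sigma)
        # Standard LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self

    def predict(self, Xq):
        return rbf_kernel(Xq, self.X, self.sigma) @ self.alpha + self.b

class TSLSSVM:
    """TS model: Gaussian memberships (premises) weight local LS-SVM outputs (consequents)."""
    def __init__(self, centers, sigma_m=1.0, **svm_kw):
        self.centers = np.asarray(centers)
        self.sigma_m = sigma_m
        self.svm_kw = svm_kw
        self.models = []

    def _memberships(self, X):
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        mu = np.exp(-d2 / (2.0 * self.sigma_m ** 2))
        return mu / mu.sum(axis=1, keepdims=True)  # normalized firing degrees

    def fit(self, X, y):
        mu = self._memberships(X)
        for i in range(len(self.centers)):
            idx = mu.argmax(axis=1) == i   # assign each sample to its dominant rule
            self.models.append(LocalLSSVM(**self.svm_kw).fit(X[idx], y[idx]))
        return self

    def predict(self, X):
        mu = self._memberships(X)
        local = np.column_stack([m.predict(X) for m in self.models])
        return (mu * local).sum(axis=1)    # membership-weighted blend of local outputs
```

Here each training sample is assigned to its dominant rule, and predictions blend the local outputs with the normalized memberships, mirroring TS fuzzy inference with nonlinear consequents.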


References

  • Abonyi J, Babuska R (2000) Local and global identification and interpretation of parameters in Takagi–Sugeno fuzzy models. In: Proceedings IEEE international conference on fuzzy systems, pp 835–840

  • An S, Liu W, Venkatesh S (2007) Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression. Pattern Recognit 40(8):2154–2162

  • Angelov P, Filev D (2004) An approach to online identification of Takagi–Sugeno fuzzy models. IEEE Trans Syst Man Cybern B 34(1):484–498

  • Angelov PP, Filev D (2006) Simple_eTS: a simplified method for learning evolving Takagi–Sugeno fuzzy models. In: Proceedings of the 11th IEEE international conference on fuzzy systems, pp 1068–1073

  • Angelov PP, Zhou X (2006) Evolving fuzzy systems from data streams in real-time. In: IEEE symposium on evolving fuzzy systems, Ambleside, Lake District, UK, pp 29–35

  • Angelov PP, Filev D, Kasabov NK (2008) Evolving fuzzy systems—preface to the special section. IEEE Trans Fuzzy Syst 16(6):1390–1392

  • Cheng WY, Juang CF (2011) An incremental support vector machine-trained TS-type fuzzy system for online classification problems. Fuzzy Sets Syst 163(1):24–44

  • Chi HM, Ersoy KO (2003) Recursive update algorithm for least squares support vector machines. Neural Process Lett 17:165–173

  • Diehl CP, Cauwenberghs G (2003) SVM incremental learning, adaptation and optimization. In: Proceedings of the international joint conference on neural networks, Boston, vol 4, pp 2685–2690

  • Dovžan D, Škrjanc I (2011) Recursive clustering based on a Gustafson–Kessel algorithm. Evol Syst 2:15–24

  • Duda RO, Hart PE, Stork DG (2001) Pattern classification, 2nd edn. Wiley, New York

  • Engel Y, Mannor S, Meir R (2004) The kernel recursive least-squares algorithm. IEEE Trans Signal Process 52(8):2275–2285

  • Kalhor A, Araabi BN, Lucas C (2009) Online identification of a neuro-fuzzy model through indirect fuzzy clustering of data space. In: IEEE international conference on fuzzy systems, Korea

  • Kalhor A, Araabi BN, Lucas C (2010) An online predictor model as adaptive habitually linear and transiently nonlinear model. Evol Syst 1:29–41

  • Kasabov NK, Song Q (2002) DENFIS: dynamic evolving neural-fuzzy inference system and its application for time-series prediction. IEEE Trans Fuzzy Syst 10(2):144–154

  • Kim CH, Kim MS, Lee JJ (2007) Incremental hyperplane-based fuzzy clustering for system modeling. In: Proceedings of 33rd conference of IEEE industrial electronics society, Taipei, Taiwan

  • de Kruif B, de Vries T (2003) Pruning error minimization in least squares support vector machines. IEEE Trans Neural Netw 14(3):696–702

  • Suykens JAK, Lukas L, Vandewalle J (2000) Sparse least squares support vector machine classifiers. In: Neural processing letters, pp 293–300

  • Li L, Yu H, Liu J, Zhang S (2010) Local weighted LS-SVM online modeling and the application in continuous processes. In: Wang F, Deng H, Gao Y, sheng Lei J (eds) Artificial intelligence and computational intelligence. Lecture notes in computer science, vol 6320. Springer, Berlin, pp 209–217

  • Li LJ, Su HY, Chu J (2007) Generalized predictive control with online least squares support vector machines. Acta Autom Sin 33(11):1182–1188

  • Lin CJ, Chen CH, Lin CT (2011) An efficient evolutionary algorithm for fuzzy inference systems. Evol Syst 2:83–99

  • Liu W, Park I, Wang Y, Príncipe JC (2009) Extended kernel recursive least squares algorithm. IEEE Trans Signal Process 57(10):3801–3814

  • Liu Y, Wang H, Li P (2007) Local least squares support vector regression with application to online modeling for batch processes. J Chem Ind Eng 58:2846–2851

  • Liu Y, Wang H, Yu J, Li P (2010) Selective recursive kernel learning for online identification of nonlinear systems with NARX form. J Process Control 20(2):181–194

  • Lughofer E (2008) FLEXFIS: a robust incremental learning approach for evolving Takagi–Sugeno fuzzy models. IEEE Trans Fuzzy Syst 16(6):1393–1410

  • Lughofer E, Klement E (2005) FLEXFIS: a variant for incremental learning of Takagi–Sugeno fuzzy systems. In: Proceedings of FUZZ-IEEE, Reno, Nevada, USA, pp 915–920

  • Lughofer E, Bouchot JL, Shaker A (2011) On-line elimination of local redundancies in evolving fuzzy systems. Evol Syst. doi:10.1007/s12530-011-9032-3

  • Maia C, Goncalves M (2009) A methodology for short-term electric load forecasting based on specialized recursive digital filters. Comput Ind Eng 57(3):724–731

  • Martinez B, Herrera F, Fernandez J, Marichal E (2008) An incremental clustering method and its application in online fuzzy modeling. Stud Fuzziness Soft Comput 224:163–178

  • Mirmomeni M, Lucas C, Araabi B, Moshiri B, Bidar M (2011) Online multi-step ahead prediction of time-varying solar and geomagnetic activity indices via adaptive neurofuzzy modeling and recursive spectral analysis. Sol Phys 272:189–213

  • Ngia LS, Sjöberg J, Viberg M (1998) Adaptive neural nets filter using a recursive Levenberg–Marquardt search direction. In: Proceedings of the 32nd Asilomar conference on signals, systems and computers, pp 697–701

  • Pandian SC, Duraiswamy K, Rajan CCA, Kanagaraj N (2006) Fuzzy approach for short term load forecasting. Electr Power Syst Res 76(6–7):541–548

  • Pouzols F, Lendasse A (2010) Evolving fuzzy optimally pruned extreme learning machine for regression problems. Evol Syst 1:43–58

  • Ramos JV, Pereira C, Dourado A (2010) The building of interpretable systems in real-time. In: Angelov PP, Filev D, Kasabov N (eds) Evolving intelligent systems: methodology and applications. Wiley, New York, pp 127–150

  • Smola AJ, Schölkopf B (2004) A tutorial on support vector regression. Stat Comput 14:199–222

  • Soleimani-B H, Lucas C, Araabi BN (2010) Recursive Gath-Geva clustering as a basis for evolving neuro-fuzzy modeling. Evol Syst 1:59–71

  • Suykens JAK, De Brabanter J, Lukas L, Vandewalle J (2002) Weighted least squares support vector machines: robustness and sparse approximation. Neurocomputing 48(1–4):85–105

  • Suykens JAK, Vandewalle J (1999) Least squares support vector machine classifiers. Neural Process Lett 9:293–300

  • Tang HS, Xue ST, Chen R, Sato T (2006) Online weighted LS-SVM for hysteretic structural system identification. Eng Struct 28(12):1728–1735

  • Vapnik V (1995) The nature of statistical learning theory. Springer, New York

  • Yamauchi K (2010) Incremental model selection and ensemble prediction under virtual concept drifting environments. Evol Syst 6230:570–582

  • Zhao Y, Sun J (2009) Recursive reduced least squares support vector regression. Pattern Recognit 42:837–842

Acknowledgments

The authors would like to express their gratitude to Mr. Mojtaba Kharrasi for his help in proofreading and editing the text.

Author information

Correspondence to Mohammad Komijani.

Appendix

Proof

First consider \(Y_{k}^i = [\,0 \quad y_k^i(1) \;\cdots\; y_k^i(N-1)\,]^T\), where \(N-1\) denotes the moving-window size of the ith local model after the pruning stage and \(y_{k+1}\) is the newly added output. Then (36) can be derived as follows:

$$\begin{aligned}
\Uptheta_{k+1}^{i}&=P_{k+1}^{i}\left[\begin{array}{l} Y_{k}^i\\ y_{k+1} \end{array}\right]\\
&= \left[{\begin{array}{ll} {P_{pr}^{i}-\eta_{k+1}^{i}Z_{k+1}^{i} (Z_{k+1}^{i})^{T}} & {\eta_{k+1}^{i}Z_{k+1}^{i}}\\ {\zeta_{k+1}^{i}\psi_{k+1}^{i}\left[{\eta_{k+1}^{i} Z_{k+1}^{i} (Z_{k+1}^{i})^{T}-P_{pr}^{i}}\right]}&{-\eta_{k+1}^{i}} \end{array}}\right]\left[{\begin{array}{l} {Y_{k}^{i}}\\ {y_{k+1}} \end{array}}\right]\\
&=\left[{\begin{array}{l} P_{pr}^{i} Y_{k}^{i}-\eta_{k+1}^{i} Z_{k+1}^{i} (Z_{k+1}^{i})^{T} Y_{k}^{i}+\eta_{k+1}^{i} Z_{k+1}^{i} y_{k+1}\\ \zeta_{k+1}^{i} \psi_{k+1}^{i}\eta_{k+1}^{i} Z_{k+1}^{i} (Z_{k+1}^{i})^{T} Y_{k}^{i}-\zeta_{k+1}^{i} (Z_{k+1}^{i})^{T} Y_{k}^{i}- \eta_{k+1}^{i} y_{k+1} \end{array}} \right]\\
&=\left[{\begin{array}{l} \Uptheta_{pr}^{i}+\eta_{k+1}^{i} Z_{k+1}^{i} \left(y_{k+1}-(Z_{k+1}^{i})^{T} Y_{k}^{i}\right)\\ \left(\zeta_{k+1}^{i}\eta_{k+1}^{i}({\eta_{k+1}^{i}})^{-1}-\zeta_{k+1}^{i}\right)(Z_{k+1}^{i})^{T} Y_{k}^{i} -\eta_{k+1}^{i} \left(y_{k+1}-(Z_{k+1}^{i})^{T} Y_{k}^{i}\right) \end{array}}\right]\\
&=\left[{\begin{array}{l}\Uptheta_{pr}^{i}+\eta_{k+1}^{i} Z_{k+1}^{i} e_{k+1}^{i}\\ \left(\zeta_{k+1}^{i}-\zeta_{k+1}^{i}\right)(Z_{k+1}^{i})^{T} Y_{k}^{i}-\eta_{k+1}^{i}e_{k+1}^{i} \end{array}}\right]\\
&=\left[{\begin{array}{l} \Uptheta_{pr}^{i}+\eta_{k+1}^{i}Z_{k+1}^{i} e_{k+1}^{i}\\ {-\eta_{k+1}^{i} e_{k+1}^{i}} \end{array}} \right]
\end{aligned}$$
(49)

where \(e_{k+1}^i\) is the prediction error, computed as the difference between the desired signal and the output of the ith local model after the pruning stage (Dovžan and Škrjanc 2011).
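The recursion in (49) is, at its core, a block-matrix (Schur complement) inverse update: bordering the old kernel system with one new sample lets the new parameter vector be formed from the old inverse without refactoring the whole system. The sketch below checks this numerically for a simplified LS-SVM-style system, omitting the bias row and the paper's decremental/pruning step; the symbols here (eta as the inverse Schur complement, z as the new kernel column) are illustrative stand-ins, not necessarily the paper's exact definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, sigma=1.0):
    # Gaussian kernel matrix between 1-D input vectors a and b
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * sigma ** 2))

gamma = 10.0
x = rng.uniform(-1, 1, 8)            # inputs in the current window
y = np.sin(x)                        # corresponding outputs Y_k
x_new, y_new = 0.3, np.sin(0.3)      # newly arrived sample

A = rbf(x, x) + np.eye(len(x)) / gamma   # K + I/gamma for the old window
P = np.linalg.inv(A)                     # P_k (old inverse)
theta = P @ y                            # Theta_k = P_k Y_k

# Incremental step: border A with the new kernel column z and diagonal c,
# and build P_{k+1} from P_k via the block-inverse (Schur complement) identity.
z = rbf(x, np.array([x_new]))[:, 0]
c = 1.0 + 1.0 / gamma                    # k(x_new, x_new) + 1/gamma
Pz = P @ z
eta = 1.0 / (c - z @ Pz)                 # inverse Schur complement
P_next = np.block([
    [P + eta * np.outer(Pz, Pz), -eta * Pz[:, None]],
    [-eta * Pz[None, :],          np.array([[eta]])],
])
theta_next = P_next @ np.append(y, y_new)    # Theta_{k+1} from the update

# Batch recomputation of the bordered system, for comparison
x_all = np.append(x, x_new)
A_all = rbf(x_all, x_all) + np.eye(len(x_all)) / gamma
theta_batch = np.linalg.solve(A_all, np.append(y, y_new))

assert np.allclose(theta_next, theta_batch)
```

The updated solution agrees with a full batch solve, which is what makes the recursive procedure attractive for online identification: each arriving sample costs only matrix-vector products instead of a fresh factorization.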

Cite this article

Komijani, M., Lucas, C., Araabi, B.N. et al. Introducing evolving Takagi–Sugeno method based on local least squares support vector machine models. Evolving Systems 3, 81–93 (2012). https://doi.org/10.1007/s12530-011-9043-0
