
Ensemble OS-ELM based on combination weight for data stream classification

Published in Applied Intelligence

Abstract

For online classification, designing a self-adapting model is a challenging task. To make the model adapt easily to fast-changing data streams, a novel ensemble OS-ELM is put forward. Unlike traditional ensemble methods, the proposed approach provides a new self-adaptive weight-update algorithm. In the online learning stage, both the current prediction accuracy and the history record are considered. Based on the suffer loss and the norm of the output-layer weight vector, an aggregation model from game theory is adopted to calculate the combination weights. This strategy fully accounts for the differences among individual learners and helps the ensemble reduce the fitting error on each sequence fragment. In addition, an alternative hidden-layer output matrix can be calculated from the current fragment, yielding a stable network architecture for the next chunk. Interactive parameter optimization is thus avoided, making the automatic model suitable for online learning. Numerical experiments are conducted on eight different UCI datasets. The results demonstrate that the proposed algorithm not only has better generalization performance but also provides a faster learning procedure.
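The abstract does not spell out the paper's exact update equations, but the two ingredients it names are familiar: the standard OS-ELM recursive least-squares update over data chunks, and a combination weight per base learner derived from its suffer loss and output-layer norm. The sketch below shows the well-known OS-ELM recursion together with a *hypothetical* exponential-weighting rule; the `combination_weights` function and its `eta`/`mu` parameters are illustrative assumptions, not the paper's game-theoretic aggregation model.

```python
import numpy as np

rng = np.random.default_rng(0)

class OSELM:
    """Minimal online sequential ELM (recursive least-squares form)."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W = rng.standard_normal((n_in, n_hidden))  # random input weights (fixed)
        self.b = rng.standard_normal(n_hidden)          # random hidden biases (fixed)
        self.P = None                                   # inverse correlation matrix
        self.beta = None                                # output-layer weights

    def _hidden(self, X):
        # Hidden-layer output matrix H for a data chunk
        return np.tanh(X @ self.W + self.b)

    def init_fit(self, X, T):
        # Initial batch: beta0 = (H^T H)^-1 H^T T (small ridge term for stability)
        H = self._hidden(X)
        self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ T

    def partial_fit(self, X, T):
        # Sequential chunk update: rank-k correction of P, then of beta
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta

def combination_weights(losses, norms, eta=1.0, mu=0.1):
    """Hypothetical aggregation: downweight learners with large suffer loss
    or large output-layer norm, then normalize to a probability simplex."""
    scores = -eta * np.asarray(losses, float) - mu * np.asarray(norms, float)
    w = np.exp(scores - scores.max())  # shift for numerical stability
    return w / w.sum()
```

In use, each base OS-ELM would call `partial_fit` on the incoming chunk, report its chunk loss and `np.linalg.norm(beta)`, and the ensemble prediction would be the weight-averaged output of the learners.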



Acknowledgements

The work was supported by the National Key Research Project of China under Grant No. 2016YFB1001304, the National Natural Science Foundation of China under Grant 61572229, the JLUSTIRT High-level Innovation Team, and the Fundamental Research Funds for Central Universities under Grant No. 2017TD-19. The authors gratefully acknowledge financial support from the Research Centre for Intelligent Signal Identification and Equipment, Jilin Province.

Author information

Corresponding author

Correspondence to Xiaoying Sun.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Yu, H., Sun, X. & Wang, J. Ensemble OS-ELM based on combination weight for data stream classification. Appl Intell 49, 2382–2390 (2019). https://doi.org/10.1007/s10489-018-01403-2
