
Application of Biologically Inspired Methods to Improve Adaptive Ensemble Learning

  • Conference paper
  • In: Advances in Nature and Biologically Inspired Computing

Abstract

Ensemble learning is a machine learning approach in which diverse models are combined to solve a particular computational intelligence problem. An analogy to this approach can be found in human behavior (e.g., consulting several experts before making an important decision). Ensemble learning is commonly used to improve the performance of classification or prediction models, and the whole process depends strongly on how the weights of the base methods are determined. In this paper we investigate different weighting schemes for predictive base models, including the biologically inspired genetic algorithm (GA) and particle swarm optimization (PSO), in the domain of electricity consumption. We were particularly interested in their ability to improve the performance of ensemble learning in the presence of the different types of concept drift that naturally occur in electricity load measurements. PSO proved to have the best ability to adapt to sudden changes.
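The weighting scheme sketched in the abstract can be illustrated with a minimal particle swarm that optimizes the convex combination weights of base-model predictions. This is a rough sketch under assumed choices (function name, MSE objective, hyperparameters, and simplex projection are illustrative), not the authors' implementation, which used the R ecosystem:

```python
import numpy as np

def pso_ensemble_weights(base_preds, y_true, n_particles=20, n_iter=100,
                         w=0.7, c1=1.5, c2=1.5, seed=0):
    """Find combination weights for base-model predictions by minimizing
    MSE with a basic (global-best) particle swarm.

    base_preds : array of shape (n_models, n_samples), one row per base model
    y_true     : array of shape (n_samples,), observed target values
    """
    rng = np.random.default_rng(seed)
    n_models = base_preds.shape[0]

    def loss(weights):
        # Project onto the probability simplex so the ensemble is a
        # convex combination of the base predictions.
        weights = np.abs(weights)
        weights = weights / weights.sum()
        return np.mean((weights @ base_preds - y_true) ** 2)

    # Initialize particle positions and velocities.
    pos = rng.random((n_particles, n_models))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    gbest_val = pbest_val.min()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, n_models))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if vals.min() < gbest_val:
            gbest = pos[vals.argmin()].copy()
            gbest_val = vals.min()

    gbest = np.abs(gbest)
    return gbest / gbest.sum()
```

Re-running this optimization on a sliding window of recent load measurements is one simple way such a scheme can track concept drift: as a base model's recent error grows, its weight shrinks.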



Acknowledgement

This work was partially supported by the Research and Development Operational Programme for the project “International Centre of Excellence for Research of Intelligent and Secure Information-Communication Technologies and Systems”, ITMS 26240120039, co-funded by the ERDF, and by the Scientific Grant Agency of the Slovak Republic, grants No. VG 1/0752/14 and VG 1/1221/12.

Author information

Correspondence to Gabriela Grmanová.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Grmanová, G. et al. (2016). Application of Biologically Inspired Methods to Improve Adaptive Ensemble Learning. In: Pillay, N., Engelbrecht, A., Abraham, A., du Plessis, M., Snášel, V., Muda, A. (eds) Advances in Nature and Biologically Inspired Computing. Advances in Intelligent Systems and Computing, vol 419. Springer, Cham. https://doi.org/10.1007/978-3-319-27400-3_21


  • DOI: https://doi.org/10.1007/978-3-319-27400-3_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-27399-0

  • Online ISBN: 978-3-319-27400-3

  • eBook Packages: Computer Science (R0)
