Abstract
Ensemble learning is a machine learning approach in which diverse models are combined to solve a particular computational intelligence problem. An analogy can be found in human behavior (e.g., consulting several experts before making an important decision). Ensemble learning is commonly used to improve the performance of classification or prediction models, and its success depends strongly on how the weights of the base methods are determined. In this paper we investigate different weighting schemes for predictive base models, including the biologically inspired genetic algorithm (GA) and particle swarm optimization (PSO), in the domain of electricity consumption. We were particularly interested in their ability to improve the performance of ensemble learning in the presence of the different types of concept drift that naturally occur in electricity load measurements. PSO proved best able to adapt to sudden changes.
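As a minimal illustration of one such weighting scheme, the sketch below uses PSO to search for ensemble weights that minimize the mean squared error of a weighted combination of base model forecasts. This is not the authors' implementation; the function names, the PSO constants, and the normalization of weights to sum to one are assumptions made for the example.

```python
# Illustrative sketch: PSO search for ensemble weights.
# Each particle is a weight vector over the base models; fitness is the
# MSE of the weighted ensemble forecast against the target series.
import random

def ensemble_mse(weights, base_preds, target):
    """MSE of the weighted combination of base model predictions."""
    total = sum(weights)
    w = [x / total for x in weights]          # normalize weights to sum to 1
    n = len(target)
    err = 0.0
    for i in range(n):
        combined = sum(w[m] * base_preds[m][i] for m in range(len(w)))
        err += (combined - target[i]) ** 2
    return err / n

def pso_weights(base_preds, target, n_particles=20, iters=100, seed=0):
    """Standard PSO over non-negative ensemble weights (assumed constants)."""
    rng = random.Random(seed)
    dim = len(base_preds)
    pos = [[rng.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [ensemble_mse(p, base_preds, target) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    inertia, c1, c2 = 0.7, 1.5, 1.5           # common PSO parameter choices
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (inertia * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                # keep positions in a valid (positive) range
                pos[i][d] = min(1.0, max(1e-6, pos[i][d] + vel[i][d]))
            f = ensemble_mse(pos[i], base_preds, target)
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    total = sum(gbest)
    return [x / total for x in gbest]         # final normalized weights
```

With two base forecasters whose average reproduces the target, the search converges toward equal weights; a GA-based scheme would replace the velocity update with selection, crossover, and mutation over the same weight vectors.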
Acknowledgement
This work was partially supported by the Research and Development Operational Programme for the project "International Centre of Excellence for Research of Intelligent and Secure Information-Communication Technologies and Systems", ITMS 26240120039, co-funded by the ERDF, and by the Scientific Grant Agency of the Slovak Republic, grants No. VG 1/0752/14 and VG 1/1221/12.
© 2016 Springer International Publishing Switzerland
Cite this paper
Grmanová, G. et al. (2016). Application of Biologically Inspired Methods to Improve Adaptive Ensemble Learning. In: Pillay, N., Engelbrecht, A., Abraham, A., du Plessis, M., Snášel, V., Muda, A. (eds) Advances in Nature and Biologically Inspired Computing. Advances in Intelligent Systems and Computing, vol 419. Springer, Cham. https://doi.org/10.1007/978-3-319-27400-3_21
DOI: https://doi.org/10.1007/978-3-319-27400-3_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-27399-0
Online ISBN: 978-3-319-27400-3