A PSO Boosted Ensemble of Extreme Learning Machines for Time Series Forecasting

  • Conference paper
  • First Online:
International Joint Conference SOCO’18-CISIS’18-ICEUTE’18 (SOCO’18-CISIS’18-ICEUTE’18 2018)

Abstract

This work presents a first approach to using Particle Swarm Optimization (PSO) to optimize an ensemble model built from Extreme Learning Machines (ELMs). The paper focuses on obtaining the parameters of a weighted-averaging combination method for the ensemble, with ELMs as the base models. The main contribution is the use of the PSO heuristic to search for the optimal parameters of the weighted-averaging method. The experiments show that PSO is well suited to computing the ensemble parameters, yielding an average error improvement of 68% over an individual model. Further comparisons against basic ensemble combining methods were also carried out and met expectations.
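The abstract describes a weighted-averaging ensemble of Extreme Learning Machines whose combination weights are searched by PSO. The following Python sketch illustrates that general idea only; the ELM architecture, the toy time series, the PSO constants, and the weight normalisation are all illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: ELM ensemble + PSO-optimised weighted averaging.
import numpy as np

rng = np.random.default_rng(0)

def make_lagged(series, lags):
    """Build (X, y) pairs that use `lags` past values to predict the next one."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    return X, y

class ELM:
    """Single-hidden-layer ELM: random input weights, least-squares output weights."""
    def __init__(self, n_hidden, n_inputs):
        self.W = rng.normal(size=(n_inputs, n_hidden))
        self.b = rng.normal(size=n_hidden)
        self.beta = None
    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)
    def fit(self, X, y):
        self.beta = np.linalg.pinv(self._hidden(X)) @ y
        return self
    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy univariate series (assumption: any time series would do here).
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.1 * rng.normal(size=t.size)
X, y = make_lagged(series, lags=8)
split = int(0.7 * len(y))
Xtr, ytr, Xval, yval = X[:split], y[:split], X[split:], y[split:]

# Ensemble of ELMs that differ only in their random hidden layers.
ensemble = [ELM(n_hidden=30, n_inputs=Xtr.shape[1]).fit(Xtr, ytr) for _ in range(10)]
P = np.column_stack([m.predict(Xval) for m in ensemble])   # (n_val, n_models)

def ensemble_mse(w):
    """Validation MSE of the weighted average; weights normalised to sum to 1."""
    w = np.abs(w)
    w = w / (w.sum() + 1e-12)
    return np.mean((P @ w - yval) ** 2)

# Plain global-best PSO over the combiner weights (inertia + cognitive + social terms).
n_particles, iters = 30, 100
pos = rng.uniform(0, 1, size=(n_particles, len(ensemble)))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([ensemble_mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()
w_in, c1, c2 = 0.72, 1.49, 1.49            # common PSO constants (an assumption)
for _ in range(iters):
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([ensemble_mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("single ELM MSE  :", np.mean((P[:, 0] - yval) ** 2))
print("PSO-weighted MSE:", ensemble_mse(gbest))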

Acknowledgement

This work was carried out within the framework of project IT874-13, granted by the Basque Regional Government. The authors would like to thank the company IK4-IDEKO, which has supported this work.

Author information

Corresponding author

Correspondence to Alain Porto.

Copyright information

© 2019 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Porto, A., Irigoyen, E., Larrea, M. (2019). A PSO Boosted Ensemble of Extreme Learning Machines for Time Series Forecasting. In: Graña, M., et al. International Joint Conference SOCO’18-CISIS’18-ICEUTE’18. SOCO’18-CISIS’18-ICEUTE’18 2018. Advances in Intelligent Systems and Computing, vol 771. Springer, Cham. https://doi.org/10.1007/978-3-319-94120-2_31
