
A mutual association based nonlinear ensemble mechanism for time series forecasting

Published in Applied Intelligence

Abstract

Forecasting a time series with reasonable accuracy is an important but quite difficult task that has attracted considerable research attention for many years. It is widely accepted that combining forecasts from multiple models significantly improves forecasting precision and often produces better forecasts than any constituent model. The existing literature abounds with linear methods of combining forecasts, but nonlinear approaches have so far received very limited research attention. This paper proposes a novel nonlinear forecast combination mechanism in which the combined model is constructed from the individual forecasts and the mutual dependencies between pairs of forecasts. The individual forecasts are produced by three well-recognized models, whereas five correlation measures are investigated for estimating the mutual association between two different forecasts. Empirical analysis with six real-world time series demonstrates that the proposed ensemble substantially reduces forecasting errors and outperforms each component model, as well as other conventional linear combination methods, in terms of out-of-sample forecasting accuracy.
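As a rough illustration of the idea, the following Python sketch combines three hypothetical component forecasts with rolling pairwise Pearson correlations between them and feeds everything into a small neural network combiner. This is a minimal sketch under assumed details, not the paper's actual formulation: the choice of correlation measure, window length, combiner architecture, and all names in the code are illustrative.

```python
# Illustrative sketch only: the paper's actual combination scheme is not
# reproduced here. We assume the ensemble feeds the individual forecasts
# together with pairwise association measures into a nonlinear combiner;
# all names below are hypothetical.
import numpy as np
from itertools import combinations
from scipy.stats import pearsonr
from sklearn.neural_network import MLPRegressor

def pairwise_association_features(forecasts, window=12):
    """Rolling pairwise correlations between each pair of forecast series.

    forecasts: array of shape (n_samples, n_models).
    Returns an array of shape (n_samples, n_pairs), each column holding the
    correlation of one model pair over the trailing `window` points.
    """
    n, m = forecasts.shape
    pairs = list(combinations(range(m), 2))
    feats = np.zeros((n, len(pairs)))
    for t in range(n):
        lo = max(0, t - window + 1)
        for k, (i, j) in enumerate(pairs):
            if t - lo + 1 >= 3:  # need a few points for a stable estimate
                feats[t, k] = pearsonr(forecasts[lo:t + 1, i],
                                       forecasts[lo:t + 1, j])[0]
    return np.nan_to_num(feats)

# Toy data: a noisy sine series and three imperfect "model" forecasts.
rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 8 * np.pi, 300)) + 0.1 * rng.standard_normal(300)
forecasts = np.column_stack([y + 0.2 * rng.standard_normal(300)
                             for _ in range(3)])

# Stack individual forecasts with their mutual-association features.
X = np.hstack([forecasts, pairwise_association_features(forecasts)])

# Nonlinear combiner trained in-sample, evaluated out-of-sample.
split = 240
combiner = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
combiner.fit(X[:split], y[:split])
y_hat = combiner.predict(X[split:])
print("out-of-sample MSE:", np.mean((y[split:] - y_hat) ** 2))
```

The paper itself investigates five association measures and three specific component models; the sketch only conveys how individual forecasts and their pairwise associations can be stacked into a nonlinear combiner.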



Acknowledgments

The author is grateful to the editor as well as the anonymous reviewers, whose useful suggestions have significantly helped in improving the quality of the present work. The author also expresses his gratitude to the Council of Scientific and Industrial Research (CSIR), India, for the partial financial support received to carry out this research.

Author information

Corresponding author

Correspondence to Ratnadip Adhikari.


About this article


Cite this article

Adhikari, R. A mutual association based nonlinear ensemble mechanism for time series forecasting. Appl Intell 43, 233–250 (2015). https://doi.org/10.1007/s10489-014-0641-y

