AR–ARCH Type Artificial Neural Network for Forecasting

Neural Processing Letters

Abstract

Real-world time series, such as econometric series, are rarely linear and often exhibit volatility. Autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH (GARCH) models have long been used to forecast financial time series, but they are specialized models and cannot be applied to every series. ARCH–GARCH models are usually applied to financial series because such series show volatility clustering and leptokurtosis, which lead to heteroscedasticity, and these models handle such problems well. However, a series can be modelled with ARCH–GARCH only after it has been tested and found to contain an ARCH effect. In recent years, therefore, artificial neural networks have been widely used by researchers in many fields for both linear and nonlinear time series; in particular, multiplicative neuron model-based artificial neural networks have produced successful forecasting results, and hybrid neural network methods are known to be useful forecasting techniques. In this study, a new hybrid forecasting method with a multiplicative neural network structure, the AR–ARCH–ANN model, is proposed. The proposed method is recurrent and can model volatility through its autoregressive conditional heteroscedasticity structure. Particle swarm optimization is used to train the network, which increases the chance of avoiding local minimum traps during training. Daily Istanbul Stock Exchange data sets from 2011 to 2013 and some time series used in the 2016 International Time Series Forecasting Competition are employed to evaluate the forecasting performance of AR–ARCH–ANN. The results produced by the proposed method are compared with those of other methods, and the proposed method performs better.
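
The core building block named in the abstract is a multiplicative neuron model trained by particle swarm optimization. The sketch below is a minimal illustration of that idea only, not the paper's full recurrent AR–ARCH–ANN: it fits a single multiplicative neuron to lagged observations of a toy series and trains its weights and biases with a basic PSO. The lag count, swarm settings, and toy data are assumptions chosen for the example.

```python
# Illustrative sketch (not the authors' exact AR-ARCH-ANN): a single
# multiplicative neuron forecaster trained with a basic particle swarm
# optimizer. All constants here (P_LAGS, N_PARTICLES, ...) are assumptions.
import numpy as np

P_LAGS = 3          # number of autoregressive lags fed to the neuron
N_PARTICLES = 30    # swarm size
N_ITER = 200        # PSO iterations

def multiplicative_neuron(lags, weights, biases):
    """Single multiplicative neuron: product of (w_i*x_i + b_i) terms
    passed through a logistic activation."""
    net = np.prod(weights * lags + biases)
    return 1.0 / (1.0 + np.exp(-net))

def forecast_series(y, params):
    """One-step-ahead in-sample forecasts for a series scaled into (0, 1)."""
    w, b = params[:P_LAGS], params[P_LAGS:]
    preds = [multiplicative_neuron(y[t - P_LAGS:t], w, b)
             for t in range(P_LAGS, len(y))]
    return np.array(preds)

def rmse(y, params):
    return np.sqrt(np.mean((y[P_LAGS:] - forecast_series(y, params)) ** 2))

def pso_train(y, dim, seed=0):
    """Minimal PSO: track personal/global bests, update velocities and positions."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, (N_PARTICLES, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([rmse(y, p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(N_ITER):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        cost = np.array([rmse(y, p) for p in pos])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest

if __name__ == "__main__":
    # Toy data: a noisy random-walk series scaled into (0, 1), standing in
    # for a real data set such as daily stock index closings.
    rng = np.random.default_rng(42)
    raw = np.cumsum(rng.normal(0, 1, 300)) + 100
    y = (raw - raw.min()) / (raw.max() - raw.min()) * 0.8 + 0.1
    params = pso_train(y, dim=2 * P_LAGS)
    print("in-sample RMSE:", rmse(y, params))
```

Replacing the toy series with a real data set and extending the model with an autoregressive conditional heteroscedasticity term for the variance would be the natural next steps toward the kind of recurrent structure the abstract describes.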

Author information

Corresponding author

Correspondence to Ali Zafer Dalar.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (DOCX 98 kb)

About this article

Cite this article

Corba, B.S., Egrioglu, E. & Dalar, A.Z. AR–ARCH Type Artificial Neural Network for Forecasting. Neural Process Lett 51, 819–836 (2020). https://doi.org/10.1007/s11063-019-10117-6
