A comparative study for forecasting using neural networks vs genetically identified Box&Jenkins models

Neural Computing & Applications

Abstract

This paper discusses the results and conclusions of an extensive comparative study of the forecasting performance of two different techniques: a genetic expert system, in which a genetic algorithm carries out the identification stage of the three-phase Box&Jenkins univariate methodology, and a connectionist approach. At the heart of the former, an expert system governs the cyclical identification-estimation-diagnostic checking process, ending with the predictions given by the SARIMA model that best fits the data. We present the connectionist approach as a technically equivalent process; since no conclusive algorithm yet exists for identifying both the optimal model and the optimal architecture for a given problem, the three most commonly used models and 20 different architectures for each model are examined. If a comparison is to give a straight answer as to whether or not a connectionist approach outperforms the univariate Box&Jenkins methodology, the natural benchmark is the set of time series analysed in 'Time Series Analysis: Forecasting and Control' by G. E. Box and G. M. Jenkins. Series BJA through BJG yield more than 1200 measures with which to evaluate and compare predictive power across different models, architectures, prediction horizons and pre-processing transformations.
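
As a concrete illustration of the identification-estimation cycle described above, the sketch below performs a brute-force SARIMA order search scored by Akaike's AIC. This is not the authors' genetic expert system: the exhaustive search merely stands in for the genetic identification stage, the statsmodels SARIMAX class is an assumed implementation, and the seasonal period s = 12, the order ranges and the series variable y are hypothetical choices made only for illustration.

    import itertools
    import warnings

    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    def identify_sarima(y, s=12):
        """AIC-guided exhaustive search over small SARIMA orders (a stand-in
        for the genetic identification stage described in the paper)."""
        best_aic, best_order, best_seasonal = np.inf, None, None
        for p, d, q, P, D, Q in itertools.product(range(3), range(2), range(3),
                                                  range(2), range(2), range(2)):
            try:
                with warnings.catch_warnings():
                    warnings.simplefilter("ignore")
                    fit = SARIMAX(y, order=(p, d, q),
                                  seasonal_order=(P, D, Q, s)).fit(disp=False)
            except (ValueError, np.linalg.LinAlgError):
                continue  # skip candidates that cannot be estimated
            if fit.aic < best_aic:
                best_aic = fit.aic
                best_order, best_seasonal = (p, d, q), (P, D, Q, s)
        return best_order, best_seasonal

    # Usage with a hypothetical univariate series y (e.g. one of the BJ series):
    # order, seasonal = identify_sarima(y, s=12)
    # model = SARIMAX(y, order=order, seasonal_order=seasonal).fit(disp=False)
    # predictions = model.forecast(steps=12)  # multi-step-ahead forecasts

On the connectionist side the analogous step would be training a feedforward network on lagged values of the series; in the absence of such an automatic identification rule, the paper examines three network models and 20 architectures per model.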


References

  1. Yule GU. On a method of investigating periodicities in disturbed series. Phil Trans 1927; A226: 267

  2. Box GEP, Jenkins GM. Time Series Analysis: Forecasting and Control. Holden-Day, San Francisco, 1976

  3. Valls M. Identificació Automàtica de Sèries Temporals [Automatic Identification of Time Series], PhD Thesis, UPC Barcelona, 1983

  4. Akaike H. A new look at the statistical model identification. IEEE Trans Auto Control 1974; AC-19: 716–723

  5. Burg JP. Maximum Entropy Spectral Analysis, PhD Thesis, Stanford University, 1975

  6. Goldberg DE. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, CA, 1989

  7. Tsay RS, Tiao GC. Consistent estimates of autoregressive parameters and extended sample autocorrelation function for stationary and nonstationary ARMA models. J Am Statist Assoc 1984; 79: 84–96

  8. Pankratz A. Forecasting with Univariate Box-Jenkins Models. John Wiley & Sons, New York, 1983

  9. Granger C, Anderson T. An Introduction to Bilinear Time Series Models. Vandenhoeck and Ruprecht, Göttingen, 1978

  10. Tong H. Threshold Models in Non-linear Time Series Analysis: Lecture Notes in Statistics. Springer-Verlag, Berlin, 1983

  11. Hertz J, Krogh A, Palmer R. Introduction to the Theory of Neural Computation. Addison-Wesley, CA, 1991

  12. Freeman J, Skapura D. Neural Networks: Algorithms, Applications, and Programming Techniques. Addison-Wesley, CA, 1991

  13. McClelland J, Rumelhart D. Parallel Distributed Processing. MIT Press, MA, 1988

  14. Refenes A. Constructive learning and its application to currency exchange rate forecasting, 1990

  15. Gallant S. Three constructive algorithms for neural learning. 8th Ann Conf Cog Sci Soc, 1986

  16. Pelillo M, Fanelli M. A method of pruning layered feedforward NNs. Proc IWANN '93, 1993

  17. Karnin E. A simple procedure for pruning BP trained NNs. IEEE Trans Neural Networks 1990; 1: 325–333

  18. Eberhart R, Dobbins R. Neural Network PC Tools: A Practical Guide. Academic Press, CA, 1990

  19. Cybenko G. Continuous valued neural networks with two hidden layers are sufficient. Technical report, Tufts University, MA, 1988

  20. Cybenko G. Approximation by superpositions of a sigmoidal function. Math Control Signals Syst 1989; 2: 303–314

  21. Tang Z, Almeida C, Fishwick P. Time series forecasting using NNs vs. B&J methodology. Simulation 1991

  22. Weigend A, Huberman B, et al. Predicting the future: a connectionist approach. Int J Neural Syst 1990; 1: 193–209

  23. Varfis A, Versino C. NNs for economic time series forecasting. Proc ICANN'90, 1990

  24. Groot C, Würtz D. Analysis of univariate time series with connectionist nets. Proc. ICANN'90, 1990

  25. Baestens D, Bergh W, et al. Estimating tax inflows at a public institution. Proc. NN Capital Markets, 1993

  26. Tong H, Lim K. Threshold autoregression, limit cycles and cyclical data. J Roy Statist Soc 1980; B42: 245

  27. Sharda R, Patil R. Connectionist approach to time series prediction: an empirical test. J Intell Manuf 1992; 3: 317–323

Cite this article

Blake, J., Francino, P., Catot, J.M. et al. A comparative study for forecasting using neural networks vs genetically identified Box&Jenkins models. Neural Comput & Applic 3, 139–148 (1995). https://doi.org/10.1007/BF01414075
