
Forecasting the Economy with Neural Nets: A Survey of Challenges and Solutions


Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1524)

Abstract

Macroeconomic forecasting is a very difficult task due to the lack of an accurate, convincing model of the economy. The most accurate models for economic forecasting, “black box” time series models, assume little about the structure of the economy. Constructing reliable time series models is challenging due to short data series, high noise levels, nonstationarities, and nonlinear effects. This chapter describes these challenges and presents some neural network solutions to them. Important issues include balancing the bias/variance tradeoff and the noise/nonstationarity tradeoff. A brief survey of methods includes hyperparameter selection (regularization parameter and training window length), input variable selection and pruning, network architecture selection and pruning, new smoothing regularizers, committee forecasts and model visualization. Separate sections present more in-depth descriptions of smoothing regularizers, architecture selection via the generalized prediction error (GPE) and nonlinear cross-validation (NCV), input selection via sensitivity based pruning (SBP), and model interpretation and visualization. Throughout, empirical results are presented for forecasting the U.S. Index of Industrial Production. These demonstrate that, relative to conventional linear time series and regression methods, superior performance can be obtained using state-of-the-art neural network models.
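
The sensitivity-based pruning (SBP) mentioned in the abstract ranks each candidate input by how much the model's error grows when that input's variation is removed. As a rough illustration only, here is a minimal Python sketch under stated assumptions: `model` is any fitted regressor exposing a `.predict(X)` method, and `X`, `y` are NumPy training arrays. The replace-with-mean sensitivity measure used below is a common formulation of the idea and may differ in detail from the procedure described in the chapter.

```python
# Minimal sketch of sensitivity-based pruning (SBP) for input selection.
# Assumptions (not from the source text): `model` is a fitted regressor with a
# .predict(X) method; X is an (N, d) array of inputs and y an (N,) target array.

import numpy as np

def input_sensitivities(model, X, y):
    """Return, for each input variable, the increase in mean squared error
    when that variable is clamped to its training-set mean
    (larger value = the model relies on it more)."""
    base_mse = np.mean((model.predict(X) - y) ** 2)
    sensitivities = np.empty(X.shape[1])
    for i in range(X.shape[1]):
        X_clamped = X.copy()
        X_clamped[:, i] = X[:, i].mean()      # remove variable i's variation
        mse_i = np.mean((model.predict(X_clamped) - y) ** 2)
        sensitivities[i] = mse_i - base_mse   # sensitivity of input i
    return sensitivities
```

In practice one would prune the input with the smallest sensitivity, refit the network, and repeat while an estimate of prediction risk (for example, validation or cross-validation error) continues to improve.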

References

  1. H. Akaike. Statistical predictor identification. Ann. Inst. Statist. Math., 22:203–217, 1970.
  2. T. Ash. Dynamic node creation in backpropagation neural networks. Connection Science, 1(4):365–375, 1989.
  3. A. Barron. Predicted squared error: a criterion for automatic model selection. In S. Farlow, editor, Self-Organizing Methods in Modeling. Marcel Dekker, New York, 1984.
  4. R. Battiti. Using mutual information for selecting features in supervised neural net learning. IEEE Trans. on Neural Networks, 5(4):537–550, July 1994.
  5. B. Bonnlander. Nonparametric Selection of Input Variables for Connectionist Learning. PhD thesis, Department of Computer Science, University of Colorado, 1996.
  6. R.T. Clemen. Combining forecasts: A review and annotated bibliography. International Journal of Forecasting, 5:559–583, 1989.
  7. P. Craven and G. Wahba. Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation. Numer. Math., 31:377–403, 1979.
  8. R.L. Eubank. Spline Smoothing and Nonparametric Regression. Marcel Dekker, Inc., 1988.
  9. S. Geman, E. Bienenstock, and R. Doursat. Neural networks and the bias/variance dilemma. Neural Computation, 4(1):1–58, 1992.
  10. F. Girosi, M. Jones, and T. Poggio. Regularization theory and neural network architectures. Neural Computation, 7:219–269, 1995.
  11. G. Golub, M. Heath, and G. Wahba. Generalized cross-validation as a method for choosing a good ridge parameter. Technometrics, 21:215–224, 1979.
  12. C.W.J. Granger and P. Newbold. Forecasting Economic Time Series. Academic Press, San Diego, California, 2nd edition, 1986.
  13. C.W.J. Granger and T. Teräsvirta. Modelling Nonlinear Economic Relationships. Oxford University Press, 1993.
  14. J.D. Hamilton. Time Series Analysis. Princeton University Press, 1994.
  15. B. Hassibi and D.G. Stork. Second order derivatives for network pruning: Optimal brain surgeon. In S.J. Hanson, J.D. Cowan, and C.L. Giles, editors, Advances in Neural Information Processing Systems 5, pages 164–171. Morgan Kaufmann Publishers, San Mateo, CA, 1993.
  16. T.J. Hastie and R.J. Tibshirani. Generalized Additive Models, volume 43 of Monographs on Statistics and Applied Probability. Chapman and Hall, 1990.
  17. A.E. Hoerl and R.W. Kennard. Ridge regression: applications to nonorthogonal problems. Technometrics, 12:69–82, 1970.
  18. A.E. Hoerl and R.W. Kennard. Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 12:55–67, 1970.
  19. Y. LeCun, J.S. Denker, and S.A. Solla. Optimal brain damage. In D.S. Touretzky, editor, Advances in Neural Information Processing Systems 2. Morgan Kaufmann Publishers, 1990.
  20. A.U. Levin, T.K. Leen, and J.E. Moody. Fast pruning using principal components. In J. Cowan, G. Tesauro, and J. Alspector, editors, Advances in Neural Information Processing Systems 6. Morgan Kaufmann Publishers, San Francisco, CA, 1994.
  21. Y. Liao and J.E. Moody. A neural network visualization and sensitivity analysis toolkit. In Shun-ichi Amari, Lei Xu, Lai-Wan Chan, Irwin King, and Kwong-Sak Leung, editors, Proceedings of the International Conference on Neural Information Processing, pages 1069–1074. Springer-Verlag Singapore Pte. Ltd., 1996.
  22. R.B. Litterman. Forecasting with Bayesian vector autoregressions: five years of experience. Journal of Business and Economic Statistics, 4(1):25–38, 1986.
  23. J. Moody. Challenges of Economic Forecasting: Noise, Nonstationarity, and Nonlinearity. Invited talk presented at Machines that Learn, Snowbird, Utah, April 1994.
  24. J. Moody. The effective number of parameters: an analysis of generalization and regularization in nonlinear learning systems. In J.E. Moody, S.J. Hanson, and R.P. Lippmann, editors, Advances in Neural Information Processing Systems 4, pages 847–854. Morgan Kaufmann Publishers, San Mateo, CA, 1992.
  25. J. Moody. Prediction risk and neural network architecture selection. In V. Cherkassky, J.H. Friedman, and H. Wechsler, editors, From Statistics to Neural Networks: Theory and Pattern Recognition Applications. Springer-Verlag, 1994.
  26. J. Moody. Macroeconomic forecasting: challenges and neural network solutions. In Proceedings of the International Symposium on Artificial Neural Networks, Hsinchu, Taiwan, 1995. Invited keynote address.
  27. J. Moody, A. Levin, and S. Rehfuss. Predicting the U.S. index of industrial production. Neural Network World, 3(6):791–794, 1993. Special issue: Proceedings of Parallel Applications in Statistics and Economics '93.
  28. J. Moody, S. Rehfuss, and M. Saffell. Macroeconomic forecasting with neural networks. Manuscript in preparation, 1999.
  29. J. Moody and T. Rögnvaldsson. Smoothing regularizers for projective basis function networks. In Advances in Neural Information Processing Systems 9 (Proceedings of NIPS*96). MIT Press, Cambridge, 1997.
  30. J. Moody and J. Utans. Architecture selection strategies for neural networks: Application to corporate bond rating prediction. In A.N. Refenes, editor, Neural Networks in the Capital Markets. John Wiley & Sons, 1994.
  31. J.E. Moody. Note on generalization, regularization and architecture selection in nonlinear learning systems. In B.H. Juang, S.Y. Kung, and C.A. Kamm, editors, Neural Networks for Signal Processing, pages 1–10. IEEE Signal Processing Society, 1991.
  32. J.E. Moody and J. Utans. Principled architecture selection for neural networks: Application to corporate bond rating prediction. In J.E. Moody, S.J. Hanson, and R.P. Lippmann, editors, Advances in Neural Information Processing Systems 4, pages 683–690. Morgan Kaufmann Publishers, San Mateo, CA, 1992.
  33. M.C. Mozer and P. Smolensky. Skeletonization: A technique for trimming the fat from a network via relevance assessment. In D.S. Touretzky, editor, Advances in Neural Information Processing Systems 1. Morgan Kaufmann Publishers, San Mateo, CA, 1990.
  34. N. Murata, S. Yoshizawa, and S. Amari. Network information criterion: determining the number of hidden units for an artificial neural network model. IEEE Transactions on Neural Networks, 5(6):865–872, 1994.
  35. M. Natter, C. Haefke, T. Soni, and H. Otruba. Macroeconomic forecasting using neural networks. In Neural Networks in the Capital Markets 1994, 1994.
  36. H. Pi and C. Peterson. Finding the embedding dimension and variable dependencies in time series. Neural Computation, pages 509–520, 1994.
  37. D. Plaut, S. Nowlan, and G. Hinton. Experiments on learning by back propagation. Technical Report CMU-CS-86-126, Dept. of Computer Science, Carnegie-Mellon University, Pittsburgh, Pennsylvania, 1986.
  38. S. Rehfuss. Macroeconomic forecasting with neural networks. Unpublished simulations, 1994.
  39. H. Rehkugler and H.G. Zimmermann, editors. Neuronale Netze in der Ökonomie. Verlag Vahlen, 1994.
  40. N.R. Swanson and H. White. A model selection approach to real-time macroeconomic forecasting using linear models and artificial neural networks. Discussion paper, Department of Economics, Pennsylvania State University, 1995.
  41. J. Utans and J. Moody. Selecting neural network architectures via the prediction risk: Application to corporate bond rating prediction. In Proceedings of the First International Conference on Artificial Intelligence Applications on Wall Street. IEEE Computer Society Press, Los Alamitos, CA, 1991.
  42. J. Utans, J. Moody, and S. Rehfuss. Selecting input variables via sensitivity analysis: Application to predicting the U.S. business cycle. In Proceedings of Computational Intelligence in Financial Engineering. IEEE Press, 1995.
  43. G. Wahba. Spline Models for Observational Data. CBMS-NSF Regional Conference Series in Applied Mathematics, 1990.
  44. R.L. Winkler and S. Makridakis. The combination of forecasts. Journal of the Royal Statistical Society, 146, 1983.
  45. L. Wu and J. Moody. A smoothing regularizer for feedforward and recurrent networks. Neural Computation, 8(2), 1996.
  46. H. Yang and J. Moody. Input variable selection based on joint mutual information. Technical report, Department of Computer Science, Oregon Graduate Institute, 1998.

Copyright information

© 1998 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Moody, J. (1998). Forecasting the Economy with Neural Nets: A Survey of Challenges and Solutions. In: Orr, G.B., Müller, KR. (eds) Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, vol 1524. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-49430-8_17

  • DOI: https://doi.org/10.1007/3-540-49430-8_17

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-65311-0

  • Online ISBN: 978-3-540-49430-0

  • eBook Packages: Springer Book Archive
