Minimum Message Length Order Selection and Parameter Estimation of Moving Average Models


Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7070)

Abstract

This paper presents a novel approach to estimating a moving average model of unknown order from an observed time series, based on the minimum message length (MML) principle. The nature of the exact Fisher information matrix for moving average models leads to problems when it is used in the standard Wallace–Freeman message length approximation, and this is overcome by utilising the asymptotic form of the information matrix. By exploiting the link between partial autocorrelations and invertible moving average coefficients, an efficient procedure for finding the MML moving average coefficient estimates is derived. The MML estimating equations are shown to be free of solutions at the boundary of the invertibility region that cause the troublesome “pile-up” effect in maximum likelihood estimation. Simulations demonstrate the excellent performance of the MML criteria relative to standard moving average inference procedures in terms of both parameter estimation and order selection, particularly for small sample sizes.
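The reparameterisation mentioned in the abstract rests on a standard recursion (due to Barndorff-Nielsen and Schou) that maps any vector of partial autocorrelations in (-1, 1) onto the stationarity region; negating the resulting coefficients gives an invertible moving average polynomial. A minimal sketch in Python, assuming this standard recursion — the function name is illustrative and not taken from the paper:

```python
import numpy as np

def partials_to_ar_coeffs(r):
    """Barndorff-Nielsen--Schou recursion: map partial autocorrelations
    r_1, ..., r_p (each in (-1, 1)) to the coefficients of a stationary
    AR(p) model.  Negating the output yields an invertible MA polynomial,
    which is the kind of unconstrained search space the abstract alludes to."""
    a = np.empty(0)
    for rk in np.asarray(r, dtype=float):
        # Step k: a_i <- a_i - r_k * a_{k-i} for i < k, then append a_k = r_k.
        a = np.append(a - rk * a[::-1], rk)
    return a
```

Because the recursion is a bijection between the open cube (-1, 1)^p and the stationarity (equivalently, invertibility) region, an optimiser can search over unconstrained partials rather than directly over constrained coefficients.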





Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Schmidt, D.F. (2013). Minimum Message Length Order Selection and Parameter Estimation of Moving Average Models. In: Dowe, D.L. (ed.) Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence. Lecture Notes in Computer Science, vol 7070. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-44958-1_26

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-44958-1_26

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-44957-4

  • Online ISBN: 978-3-642-44958-1

  • eBook Packages: Computer Science (R0)
