Abstract
This paper presents a novel approach to estimating a moving average model of unknown order from an observed time series, based on the minimum message length (MML) principle. The exact Fisher information matrix for moving average models causes problems when used in the standard Wallace–Freeman message length approximation; this is overcome by utilising the asymptotic form of the information matrix. By exploiting the link between partial autocorrelations and invertible moving average coefficients, an efficient procedure for finding the MML moving average coefficient estimates is derived. The MML estimating equations are shown to be free of solutions on the boundary of the invertibility region, the solutions responsible for the troublesome “pile-up” effect in maximum likelihood estimation. Simulations demonstrate the excellent performance of the MML criteria in comparison to standard moving average inference procedures, in terms of both parameter estimation and order selection, particularly for small sample sizes.
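For reference, the Wallace–Freeman approximation named in the abstract is the standard MML87 message length (Wallace and Freeman, 1987). For data y and a k-dimensional parameter vector θ with prior π(θ) and Fisher information matrix J(θ), it takes the form:

```latex
I(\mathbf{y}, \boldsymbol{\theta}) =
    -\log f(\mathbf{y} \mid \boldsymbol{\theta})       % negative log-likelihood
    - \log \pi(\boldsymbol{\theta})                    % prior code length
    + \tfrac{1}{2} \log \det J(\boldsymbol{\theta})    % Fisher information term
    + \tfrac{k}{2} \log \kappa_k + \tfrac{k}{2}        % lattice quantisation terms
```

where κ_k is the k-dimensional optimal lattice quantisation constant, and the MML estimate minimises I(y, θ) over θ. The difficulty the abstract refers to arises in the log-determinant term: the exact J(θ) of a moving average model is problematic there, which is why the asymptotic form of the information matrix is substituted.

The link between partial autocorrelations and invertible moving average coefficients can be made concrete with a Levinson-type recursion in the spirit of the Barndorff-Nielsen and Schou (1973) parameterisation cited in the references. The following is a minimal illustrative sketch, not the paper's actual algorithm; the helper name pacf_to_ma and the example values are assumptions made here. The recursion maps the open box (−1, 1)^q one-to-one onto the invertible MA(q) coefficient vectors, so a search over the box never touches the invertibility boundary where maximum likelihood exhibits pile-up.

```python
import numpy as np

def pacf_to_ma(r):
    """Map r_1, ..., r_q, each in (-1, 1), onto the coefficients
    (theta_1, ..., theta_q) of an invertible MA(q) model
        y_t = e_t + theta_1 e_{t-1} + ... + theta_q e_{t-q}.
    Levinson-type recursion (AR analogue in Barndorff-Nielsen and
    Schou, 1973); hypothetical helper, not the paper's code.
    """
    theta = np.zeros(0)
    for rk in np.asarray(r, dtype=float):
        k = theta.size + 1
        new = np.empty(k)
        # theta_j^(k) = theta_j^(k-1) + r_k * theta_{k-j}^(k-1)
        new[:k - 1] = theta + rk * theta[::-1]
        new[k - 1] = rk  # theta_k^(k) = r_k
        theta = new
    return theta

# Any r in (-1, 1)^q yields an invertible model: all roots of
# theta(z) = 1 + theta_1 z + ... + theta_q z^q lie outside the
# unit circle, so boundary solutions are excluded by construction.
theta = pacf_to_ma([0.5, -0.3])
roots = np.roots(np.append(theta[::-1], 1.0))
assert np.all(np.abs(roots) > 1.0)
```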
References
Akaike, H.: A new look at the statistical model identification. IEEE Transactions on Automatic Control 19(6), 716–723 (1974)
Schwarz, G.: Estimating the dimension of a model. The Annals of Statistics 6(2), 461–464 (1978)
Wallace, C.S.: Statistical and Inductive Inference by Minimum Message Length, 1st edn. Information Science and Statistics. Springer (2005)
Sak, M., Dowe, D.L., Ray, S.: Minimum message length moving average time series data mining. In: Proceedings of the ICSC Congress on Computational Intelligence Methods and Applications (CIMA 2005), Istanbul, Turkey (2005)
Schmidt, D.F.: Minimum Message Length Inference of Autoregressive Moving Average Models. PhD thesis, Clayton School of Information Technology, Monash University (2008)
Solomonoff, R.J.: A formal theory of inductive inference. Information and Control 7(1), 1–22; 7(2), 224–254 (1964)
Farr, G.E., Wallace, C.S.: The complexity of strict minimum message length inference. Computer Journal 45(3), 285–292 (2002)
Wallace, C.S., Freeman, P.R.: Estimation and inference by compact coding. Journal of the Royal Statistical Society (Series B) 49(3), 240–252 (1987)
Porat, B., Friedlander, B.: Computation of the exact information matrix of Gaussian time series with stationary random components. IEEE Transactions on Acoustics, Speech and Signal Processing 34(1), 118–130 (1986)
Gardner, G., Harvey, A.C., Phillips, G.D.A.: Algorithm AS 154: An algorithm for exact maximum likelihood estimation of autoregressive-moving average models by means of Kalman filtering. Applied Statistics 29(3), 311–322 (1980)
Fitzgibbon, L.J., Dowe, D.L., Vahid, F.: Minimum message length autoregressive model order selection. In: Proceedings of the International Conference on Intelligent Sensing and Information Processing (ICISIP), pp. 439–444 (2004)
Piccolo, D.: The size of the stationarity and invertibility region of an autoregressive moving average process. Journal of Time Series Analysis 3(4), 245–247 (1982)
Makalic, E., Schmidt, D.F.: Fast computation of the Kullback-Leibler divergence and exact Fisher information for the first-order moving average model. IEEE Signal Processing Letters 17(4), 391–393 (2010)
Whittle, P.: The analysis of multiple stationary time series. Journal of the Royal Statistical Society, Series B (Methodological) 15(1), 125–139 (1953)
Barndorff-Nielsen, O., Schou, G.: On the parametrization of autoregressive models by partial autocorrelations. Journal of Multivariate Analysis 3, 408–419 (1973)
Haughton, D.M.A.: On the choice of a model to fit data from an exponential family. The Annals of Statistics 16(1), 342–355 (1988)
Rissanen, J., Caines, P.E.: The strong consistency of maximum likelihood estimators for ARMA processes. The Annals of Statistics 7(2), 297–315 (1979)
Davidson, J.E.H.: Problems with the estimation of moving average processes. Journal of Econometrics 16(3), 295–310 (1981)
Kullback, S., Leibler, R.A.: On information and sufficiency. The Annals of Mathematical Statistics 22(1), 79–86 (1951)
Durbin, J.: Efficient estimation of parameters in moving-average models. Biometrika 46(3/4), 306–316 (1959)
Broersen, P.M.T.: Autoregressive model orders for Durbin’s MA and ARMA estimators. IEEE Transactions on Signal Processing 48(8), 2454–2457 (2000)
Jones, M.C.: Randomly choosing parameters from the stationarity and invertibility regions of autoregressive-moving average models. Applied Statistics 36(2), 134–138 (1987)
Hurvich, C.M., Tsai, C.L.: Regression and time series model selection in small samples. Biometrika 76(2), 297–307 (1989)
Cavanaugh, J.E.: A large-sample model selection criterion based on Kullback’s symmetric divergence. Statistics & Probability Letters 42(4), 333–343 (1999)
Seghouane, A.K., Bekara, M.: A small sample model selection criterion based on Kullback’s symmetric divergence. IEEE Transactions on Signal Processing 52(12), 3314–3323 (2004)
Broersen, P.M.T.: Automatic spectral analysis with time series models. IEEE Transactions on Instrumentation and Measurement 51(2), 211–216 (2002)
Cite this chapter
Schmidt, D.F. (2013). Minimum Message Length Order Selection and Parameter Estimation of Moving Average Models. In: Dowe, D.L. (ed.) Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence. Lecture Notes in Computer Science, vol. 7070. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-44958-1_26