Abstract
Typically, the first step in carrying out predictions is to develop an inductive model. In many instances a single "best" model is used, selected according to some relevant criterion that may ignore part of the model uncertainty. The Bayesian approach to model selection instead uses a weighted average over a class of models, thereby accounting for some of the uncertainty associated with selecting a single best model. This approach is referred to in the literature as Bayesian Model Averaging (BMA). It turns out that BMA has significant overlap with the theory of Algorithmic Probability (ALP) developed by R. J. Solomonoff in the early 1960s. The purpose of this article is first to highlight this connection by applying ALP to a set of nested stationary autoregressive time series models, and to give an algorithm for computing "relative weights" of models. This reveals, albeit empirically, a model weight that can be compared with the Schwarz Bayesian information criterion (BIC, also called SIC). We then develop an elementary Monte Carlo algorithm for evaluating multidimensional integrals over stability domains, and use it to compute what we call the "trimmed weights".
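The Monte Carlo integration over stability domains mentioned above can be illustrated on the simplest nontrivial case, AR(2), where the stationarity region in the coefficient plane is the triangle given by φ₂ + φ₁ < 1, φ₂ − φ₁ < 1, |φ₂| < 1, of exact volume 4 (cf. Piccolo 1982; Fam 1989). The sketch below is not the authors' algorithm, only a minimal rejection-sampling estimate of that volume, checking stationarity through the characteristic roots:

```python
import cmath
import random

def is_stationary_ar2(phi1, phi2):
    """Stationarity check for an AR(2) model via its characteristic roots.

    The characteristic polynomial is z^2 - phi1*z - phi2; the process is
    stationary iff both roots lie strictly inside the unit circle.
    """
    disc = cmath.sqrt(phi1 * phi1 + 4.0 * phi2)
    r1 = (phi1 + disc) / 2.0
    r2 = (phi1 - disc) / 2.0
    return abs(r1) < 1.0 and abs(r2) < 1.0

def stationary_volume_ar2(n_samples=200_000, seed=1):
    """Monte Carlo estimate of the volume of the AR(2) stationarity region.

    Sample uniformly from the bounding box [-2, 2] x [-1, 1] (area 8) and
    scale the hit fraction by the box area.  The exact answer is 4.
    """
    rng = random.Random(seed)
    hits = sum(
        is_stationary_ar2(rng.uniform(-2.0, 2.0), rng.uniform(-1.0, 1.0))
        for _ in range(n_samples)
    )
    return 8.0 * hits / n_samples
```

For higher orders p the same rejection scheme applies with a polynomial root check, though the bounding box grows and the acceptance rate shrinks rapidly, which is one reason specialized parametrizations (e.g. via partial autocorrelations, Barndorff-Nielsen and Schou 1973; Monahan 1984) are attractive.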
References
Akaike, H.: A new look at statistical model identification. IEEE Trans. Aut. Control 19, 716–723 (1974)
Akaike, H.: Information measures and model selection. Bull. of Int. Stat. Inst. 50, 277–290 (1983)
Burnham, K.P., Anderson, D.R.: Model selection and multimodel inference: a practical information-theoretical approach, 2nd edn. Springer, N.Y. (2002)
Barndorff-Nielsen, O., Schou, G.: On the parametrization of autoregressive models by partial autocorrelations. J. Multivar. Anal. 3, 408–419 (1973)
Chen, C., Davis, R.A., Brockwell, P.J., Bai, Z.D.: Order determination for autoregressive processes using resampling methods. Stat. Sinica 3, 481–500 (1993)
Fam, A.T.: The volume of the coefficient space stability domain of monic polynomials. In: IEEE Int. Symp. Circuits and Systems, Portland, Oregon, vol. 2, pp. 1780–1783 (1989)
Fitzgibbon, L.J., Dowe, D., Vahid, F.: Minimum message length autoregressive model order selection. In: Palanaswami, M., Chandra Sekhar, C., Kumar Venayagamoorthy, G., Mohan, S., Ghantasala, M.K. (eds.) International Conference on Intelligent Sensing and Information Processing (ICISIP), Chennai, India, January 4-7, pp. 439–444 (2004)
Hutter, M.: Algorithmic information theory. Scholarpedia 2(3), 2519 (2007)
Hoeting, J.A., Madigan, D., Raftery, A.E., Volinsky, C.T.: Bayesian Model Averaging: A Tutorial. Statistical Science 14, 382–401 (1999)
Hurvich, C.M., Tsai, C.-L.: Regression and time series model selection in small samples. Biometrika 76, 297–307 (1989)
Jones, M.C.: Randomly choosing parameters from the stationary and invertibility regions of autoregressive-moving average models. J. Roy. Stat. Soc., Series C (Appl. Stat.) 36, 134–138 (1987)
Knuth, D.: The art of computer programming, Volume 2: Seminumerical Algorithms, 3rd edn. Addison-Wesley (1997)
Kass, R.E., Raftery, A.E.: Bayes Factors. J. Amer. Stat. Assoc. 90, 773–795 (1995)
Liang, F., Barron, A.: Minimax strategies for predictive density estimation, data compression, and model selection. IEEE Trans. Info. Th. 50, 2708–2726 (2004)
Li, M., Vitányi, P.: An introduction to Kolmogorov complexity and its applications. Springer, N.Y. (1997)
Makhoul, J.: Linear prediction: a tutorial review. Proc. IEEE 63, 561–580 (1975)
Monahan, J.F.: A note on enforcing stationarity in autoregressive-moving average models. Biometrika 71, 403–404 (1984)
Nikolaev, Y.P.: The multidimensional asymptotic stability domain of linear discrete systems: Its symmetry and other properties. Aut. and Rem. Control 62, 109–120 (2001)
Piccolo, D.: The size of the stationarity and invertibility region of an autoregressive-moving average process. J. of Time Series Analysis 3, 245–247 (1982)
Rissanen, J.: Modeling by the shortest data description. Automatica 14, 465–471 (1978)
Schwarz, G.: Estimating the dimension of a model. Ann. of Stat. 6, 461–464 (1978)
Shlien, S.: A geometric description of stable linear predictive coding digital filters. IEEE Trans. Info. Th. 31, 545–548 (1985)
Solomonoff, R.J.: A preliminary report on a general theory of inductive inference (1960)
Solomonoff, R.J.: A formal theory of inductive inference. Inform. and Control 7, Part I, 1–22; Part II, 224–254 (1964)
Solomonoff, R.J.: The discovery of algorithmic probability. J. Comp. & Sys. Sci. 55, 73–88 (1997)
Solomonoff, R.J.: Algorithmic Probability: Theory and Applications, Revision of article. In: Emmert-Streib, F., Dehmer, M. (eds.) Information Theory and Statistical Learning, pp. 1–23. Springer Science+Business Media, N.Y. (2009)
Wallace, C.S., Boulton, D.M.: An information measure for classification. Comput. J. 11, 185–194 (1968)
Wallace, C.S., Dowe, D.L.: Minimum Message Length and Kolmogorov Complexity. Computer J. 42, 270–283 (1999)
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this chapter
Solomonoff, R.J., Saleeby, E.G. (2013). On the Application of Algorithmic Probability to Autoregressive Models. In: Dowe, D.L. (eds) Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence. Lecture Notes in Computer Science, vol 7070. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-44958-1_29
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-44957-4
Online ISBN: 978-3-642-44958-1
eBook Packages: Computer Science, Computer Science (R0)