
Akaike’s Information Criterion: Background, Derivation, Properties, and Refinements

Reference work entry in the International Encyclopedia of Statistical Science

Introduction

The Akaike Information Criterion, AIC, was introduced by Hirotugu Akaike in his seminal 1973 paper “Information Theory and an Extension of the Maximum Likelihood Principle.” AIC was the first model selection criterion to gain widespread attention in the statistical community. Today, AIC remains the most widely known and used model selection tool among practitioners.

The traditional maximum likelihood paradigm, as applied to statistical modeling, provides a mechanism for estimating the unknown parameters of a model having a specified dimension and structure. Akaike extended this paradigm by considering a framework in which the model dimension is also unknown, and must therefore be determined from the data. Thus, Akaike proposed a framework wherein both model estimation and selection could be simultaneously accomplished.

For a parametric candidate model of interest, the likelihood function reflects the conformity of the model to the observed data. As the complexity...
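The trade-off sketched above is what the criterion formalizes: AIC = −2 log L̂ + 2k, where L̂ is the maximized likelihood of a candidate model and k its number of estimated parameters, and the candidate with the smallest AIC is selected. A minimal illustration in Python (the data and the two Gaussian candidate models below are hypothetical, chosen only to show the mechanics; they do not come from the entry):

```python
import math

def gaussian_loglik(data, mu, sigma):
    # Log-likelihood of i.i.d. N(mu, sigma^2) observations.
    n = len(data)
    return (-0.5 * n * math.log(2 * math.pi * sigma ** 2)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2))

def aic(loglik, k):
    # AIC = -2 * (maximized log-likelihood) + 2 * (number of parameters).
    return -2.0 * loglik + 2.0 * k

# Illustrative data (hypothetical sample).
data = [2.1, 1.9, 2.4, 2.2, 1.8, 2.0, 2.3, 2.1]
n = len(data)

# Candidate 1: mean fixed at 0, only sigma estimated by ML (k = 1).
sigma0 = math.sqrt(sum(x ** 2 for x in data) / n)
aic0 = aic(gaussian_loglik(data, 0.0, sigma0), k=1)

# Candidate 2: both mu and sigma estimated by ML (k = 2).
mu1 = sum(data) / n
sigma1 = math.sqrt(sum((x - mu1) ** 2 for x in data) / n)
aic1 = aic(gaussian_loglik(data, mu1, sigma1), k=2)

# Estimation and selection together: fit each candidate, pick the minimum AIC.
best = min([("fixed mean", aic0), ("free mean", aic1)], key=lambda t: t[1])
print(best[0])  # → free mean
```

The extra parameter in the second model is charged a penalty of 2, but its far better fit to the data outweighs that cost, so it is selected.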


References and Further Readings

  • Akaike H (1973) Information theory and an extension of the maximum likelihood principle. In: Petrov BN, Csáki F (eds) Proceedings of the 2nd international symposium on information theory. Akadémiai Kiadó, Budapest, pp 267–281

  • Akaike H (1974) A new look at the statistical model identification. IEEE T Automat Contr AC-19:716–723

  • Bengtsson T, Cavanaugh JE (2006) An improved Akaike information criterion for state-space model selection. Comput Stat Data An 50:2635–2654

  • Bozdogan H (1987) Model selection and Akaike’s information criterion (AIC): the general theory and its analytical extensions. Psychometrika 52:345–370

  • Cavanaugh JE, Shumway RH (1997) A bootstrap variant of AIC for state-space model selection. Stat Sinica 7:473–496

  • Davies SL, Neath AA, Cavanaugh JE (2005) Cross validation model selection criteria for linear regression based on the Kullback-Leibler discrepancy. Stat Methodol 2:249–266

  • Hurvich CM, Shumway RH, Tsai CL (1990) Improved estimators of Kullback-Leibler information for autoregressive model selection in small samples. Biometrika 77:709–719

  • Hurvich CM, Tsai CL (1989) Regression and time series model selection in small samples. Biometrika 76:297–307

  • Ishiguro M, Sakamoto Y, Kitagawa G (1997) Bootstrapping log likelihood and EIC, an extension of AIC. Ann I Stat Math 49:411–434

  • Konishi S, Kitagawa G (1996) Generalised information criteria in model selection. Biometrika 83:875–890

  • Kullback S (1968) Information theory and statistics. Dover, New York

  • Kullback S, Leibler RA (1951) On information and sufficiency. Ann Math Stat 22:76–86

  • Linhart H, Zucchini W (1986) Model selection. Wiley, New York

  • Pan W (2001) Akaike’s information criterion in generalized estimating equations. Biometrics 57:120–125

  • Shibata R (1980) Asymptotically efficient selection of the order of the model for estimating parameters of a linear process. Ann Stat 8:147–164

  • Shibata R (1981) An optimal selection of regression variables. Biometrika 68:45–54

  • Shibata R (1997) Bootstrap estimate of Kullback-Leibler information for model selection. Stat Sinica 7:375–394

  • Stone M (1977) An asymptotic equivalence of choice of model by cross-validation and Akaike’s criterion. J R Stat Soc B 39:44–47

  • Sugiura N (1978) Further analysis of the data by Akaike’s information criterion and the finite corrections. Commun Stat A7:13–26

  • Takeuchi K (1976) Distribution of information statistics and criteria for adequacy of models. Mathematical Sciences 153:12–18 (in Japanese)


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg


Cite this entry

Cavanaugh, J.E., Neath, A.A. (2011). Akaike’s Information Criterion: Background, Derivation, Properties, and Refinements. In: Lovric, M. (eds) International Encyclopedia of Statistical Science. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04898-2_111
