Akaike information criterion
Bayesian information criterion
Model selection
Information criterion
Interpretation (philosophy)
Minimum description length
Bayesian probability
Statistical hypothesis testing
Statistical model
Selection (genetic algorithm)
Mathematics
Dimension (graph theory)
Computer science
Goodness of fit
Data mining
Econometrics
Statistics
Artificial intelligence
Programming language
Pure mathematics
Authors
Joseph E. Cavanaugh,Andrew A. Neath
Abstract
The Akaike information criterion (AIC) is one of the most ubiquitous tools in statistical modeling. The first model selection criterion to gain widespread acceptance, AIC was introduced in 1973 by Hirotugu Akaike as an extension to the maximum likelihood principle. Maximum likelihood is conventionally applied to estimate the parameters of a model once the structure and dimension of the model have been formulated. Akaike's seminal idea was to combine into a single procedure the process of estimation with structural and dimensional determination. This article reviews the conceptual and theoretical foundations for AIC, discusses its properties and its predictive interpretation, and provides a synopsis of important practical issues pertinent to its application. Comparisons and delineations are drawn between AIC and its primary competitor, the Bayesian information criterion (BIC). In addition, the article covers refinements of AIC for settings where the asymptotic conditions and model specification assumptions that underlie the justification of AIC may be violated.
This article is categorized under:
Software for Computational Statistics > Artificial Intelligence and Expert Systems
Statistical Models > Model Selection
Statistical and Graphical Methods of Data Analysis > Modeling Methods and Algorithms
Statistical and Graphical Methods of Data Analysis > Information Theoretic Methods
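As a minimal sketch of the model selection idea the abstract describes (not the article's own derivation), the standard definitions AIC = -2 log L + 2k and BIC = -2 log L + k log n can be applied to competing polynomial regression fits; the Gaussian-likelihood setup, helper names, and the example data below are illustrative assumptions, not taken from the article.

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike information criterion: -2 log L + 2k."""
    return -2.0 * log_likelihood + 2.0 * k

def bic(log_likelihood, k, n):
    """Bayesian information criterion: -2 log L + k log n."""
    return -2.0 * log_likelihood + k * np.log(n)

# Illustrative data: a linear trend with Gaussian noise.
rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)

# Compare candidate polynomial degrees; each fit is a maximum
# likelihood fit under a Gaussian error model.
for degree in (1, 2, 5):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    sigma2 = np.mean(resid ** 2)  # ML estimate of error variance
    # Gaussian log-likelihood evaluated at the ML estimates
    ll = -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)
    k = degree + 2  # polynomial coefficients plus the noise variance
    print(f"degree={degree}  AIC={aic(ll, k):.2f}  BIC={bic(ll, k, n):.2f}")
```

Lower values indicate a better trade-off between fit and complexity; BIC's `k log n` penalty grows with the sample size, so it penalizes extra parameters more heavily than AIC once n > e^2, which is one root of the AIC/BIC contrast the article develops.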