Marginal likelihood
Exponential family
Graphical model
Algorithm
Latent variable
Mathematics
Probabilistic logic
Bayesian information criterion
Bayesian probability
Mathematical optimization
Upper and lower bounds
Computer science
Applied mathematics
Statistics
Mathematical analysis
Authors
Matthew J. Beal, Zoubin Ghahramani
Source
Journal: Oxford University Press eBooks
Publisher: Oxford University Press
Date: 2003-07-03
Pages: 453-463
Citations: 454
Identifier
DOI: 10.1093/oso/9780198526155.003.0025
Abstract
We present an efficient procedure for estimating the marginal likelihood of probabilistic models with latent variables or incomplete data. This method constructs and optimizes a lower bound on the marginal likelihood using variational calculus, resulting in an iterative algorithm which generalizes the EM algorithm by maintaining posterior distributions over both latent variables and parameters. We define the family of conjugate-exponential models—which includes finite mixtures of exponential family models, factor analysis, hidden Markov models, linear state-space models, and other models of interest—for which this bound on the marginal likelihood can be computed very simply through a modification of the standard EM algorithm. In particular, we focus on applying these bounds to the problem of scoring discrete directed graphical model structures (Bayesian networks). Extensive simulations comparing the variational bounds to the usual approach based on the Bayesian Information Criterion (BIC) and to a sampling-based gold standard method known as Annealed Importance Sampling (AIS) show that variational bounds substantially outperform BIC in finding the correct model structure at relatively little computational cost, while approaching the performance of the much more costly AIS procedure. Using AIS allows us to provide the first serious case study of the tightness of variational bounds. We also analyze the performance of AIS through a variety of criteria, and outline directions in which this work can be extended.
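The abstract's central object is the variational lower bound on the log marginal likelihood, L(q) = E_q[log p(x, z)] − E_q[log q(z)] ≤ log p(x), which is tight exactly when q equals the true posterior p(z|x). The sketch below is not the authors' algorithm; it is a minimal illustrative example on a hypothetical two-component latent-variable model (a discrete latent z and a unit-variance Gaussian observation), small enough that the exact marginal likelihood can be computed and the bound verified numerically:

```python
import numpy as np

# Toy model (illustrative only): latent z in {0, 1} with prior p(z),
# observation x ~ N(mu_z, 1). Then log p(x) = log sum_z p(z) p(x|z),
# and for any distribution q(z) the variational lower bound is
#   L(q) = E_q[log p(x, z)] - E_q[log q(z)]  <=  log p(x),
# with equality when q(z) = p(z|x).
prior = np.array([0.5, 0.5])   # p(z)
mu = np.array([-1.0, 2.0])     # component means (arbitrary choices)
x = 0.3                        # a single observed data point

def log_gauss(x, m):
    """Log density of N(m, 1) at x."""
    return -0.5 * np.log(2.0 * np.pi) - 0.5 * (x - m) ** 2

log_joint = np.log(prior) + log_gauss(x, mu)             # log p(x, z), per z
log_evidence = np.logaddexp(log_joint[0], log_joint[1])  # exact log p(x)

def elbo(q):
    """Lower bound: expected log joint plus the entropy of q."""
    return float(np.sum(q * (log_joint - np.log(q))))

q_arbitrary = np.array([0.9, 0.1])            # some variational posterior
q_exact = np.exp(log_joint - log_evidence)    # the true posterior p(z|x)

assert elbo(q_arbitrary) <= log_evidence + 1e-12   # bound holds for any q
assert abs(elbo(q_exact) - log_evidence) < 1e-10   # tight at the posterior
```

In the paper's setting the latent variables and parameters are high-dimensional and the bound is optimized iteratively (the variational Bayesian EM algorithm); this toy computes both sides of the inequality exactly only because the latent space has two states.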