Keywords
Discriminative model; Latent variable; Mathematics; Inference; Exponential family; Upper and lower bounds; Bayesian probability; Margin (machine learning); Simplicity (philosophy); Maximization; Expectation-maximization algorithm; Computer science; Mathematical optimization; Applied mathematics; Artificial intelligence; Machine learning; Statistics; Maximum likelihood; Mathematical analysis; Philosophy; Epistemology
Authors
Tony Jebara, Alex Pentland
Source
Venue: Neural Information Processing Systems
Date: 2000-01-01
Volume: 13, pages 231-237
Citations: 41
Abstract
Jensen's inequality is a powerful mathematical tool and one of the workhorses in statistical learning. Its applications therein include the EM algorithm, Bayesian estimation and Bayesian inference. Jensen computes simple lower bounds on otherwise intractable quantities such as products of sums and latent log-likelihoods. This simplification then permits operations like integration and maximization. Quite often (i.e. in discriminative learning) upper bounds are needed as well. We derive and prove an efficient analytic inequality that provides such variational upper bounds. This inequality holds for latent variable mixtures of exponential family distributions and thus spans a wide range of contemporary statistical models. We also discuss applications of the upper bounds including maximum conditional likelihood, large margin discriminative models and conditional Bayesian inference. Convergence, efficiency and prediction results are shown.
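For context, the lower bound the abstract refers to is the standard Jensen (variational) bound on a latent-variable log-likelihood; the following is textbook material and is not reproduced from the paper itself. For observed data x, parameters θ, and any distribution q over the latent variable z,

\[
\log p(x \mid \theta)
\;=\;
\log \sum_{z} q(z)\,\frac{p(x, z \mid \theta)}{q(z)}
\;\ge\;
\sum_{z} q(z)\,\log \frac{p(x, z \mid \theta)}{q(z)},
\]

with equality when q(z) = p(z \mid x, \theta); maximizing this bound over θ is the M-step of EM. Discriminative objectives such as maximum conditional likelihood involve terms like \(\log p(y \mid x) = \log p(x, y) - \log p(x)\), where the subtracted latent log-likelihood must be bounded from above rather than below; the analytic upper bounds derived in the paper for mixtures of exponential-family distributions serve that role.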