Mixture model
Mixing (physics)
Prior probability
Mathematics
Posterior probability
Bayesian probability
Dirichlet distribution
Interferon
Convergence (economics)
Sample size determination
Gaussian distribution
Mathematical optimization
Applied mathematics
Computer science
Algorithm
Statistics
Mathematical analysis
Physics
Economics
Quantum mechanics
Economic growth
Boundary value problem
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Citations: 3
Identifiers
DOI: 10.48550/arxiv.2007.09284
Abstract
We study Bayesian estimation of finite mixture models in a general setup where the number of components is unknown and allowed to grow with the sample size. Allowing the number of components to grow is a natural assumption, as the degree of heterogeneity present in the sample can increase and new components can arise as the sample size increases, giving full flexibility in modeling the complexity of the data. This, however, leads to a high-dimensional model that poses great challenges for estimation. We employ the novel idea of a sample-size-dependent prior in a Bayesian model and establish a number of important theoretical results. We first show that, under mild conditions on the prior, the posterior distribution concentrates around the true mixing distribution at a near-optimal rate with respect to the Wasserstein distance. Under a separation condition on the true mixing distribution, we further show that a better, adaptive convergence rate can be achieved and that the number of components can be consistently estimated. Furthermore, we derive optimal convergence rates for higher-order mixture models in which the number of components diverges arbitrarily fast. In addition, we suggest a simple recipe for using a Dirichlet process (DP) mixture prior to estimate finite mixture models and provide theoretical guarantees. In particular, we provide a novel solution for adopting the number of clusters in a DP mixture model as an estimate of the number of components in a finite mixture model. A simulation study and real data applications are carried out, demonstrating the utility of our method.
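To illustrate the general recipe mentioned at the end of the abstract (treating the number of clusters recovered by a DP mixture fit as an estimate of the number of components in a finite mixture), here is a minimal sketch, not the authors' implementation: it fits a truncated Dirichlet process Gaussian mixture with scikit-learn's variational BayesianGaussianMixture and counts the clusters that receive non-negligible posterior weight. The simulated three-component data, the truncation level of 20, the concentration parameter of 1.0, and the 0.01 weight threshold are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's implementation): estimate the number of
# mixture components by fitting a truncated Dirichlet-process Gaussian mixture
# and counting clusters with non-negligible posterior weight.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Simulated data from a 3-component univariate Gaussian mixture (assumed example).
X = np.concatenate([
    rng.normal(-4.0, 1.0, size=300),
    rng.normal(0.0, 1.0, size=300),
    rng.normal(4.0, 1.0, size=300),
]).reshape(-1, 1)

# Truncated DP mixture fit via variational inference; n_components is an
# upper bound (truncation level), not the assumed number of components.
dpgmm = BayesianGaussianMixture(
    n_components=20,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,  # DP concentration parameter (assumption)
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

# Count clusters whose fitted weight exceeds a small threshold; this plays the
# role of the "number of clusters" used as an estimate of the number of components.
threshold = 0.01
k_hat = int(np.sum(dpgmm.weights_ > threshold))
print("estimated number of components:", k_hat)
```

On this kind of well-separated simulated data the count of non-negligible clusters typically recovers the true number of components; the paper's theoretical results concern when and how fast such estimates are consistent as the sample size, and possibly the number of components, grows.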