Authors
Siliang Zhang, Yunxiao Chen, Yang Liu
Abstract
In this paper, we explore the use of the stochastic EM algorithm (Celeux & Diebolt, 1985, Computational Statistics Quarterly, 2, 73) for large-scale full-information item factor analysis. Innovations have been made in its implementation, including an adaptive-rejection-based Gibbs sampler for the stochastic E step, a proximal gradient descent algorithm for the optimization in the M step, and diagnostic procedures for determining the burn-in size and the stopping of the algorithm. These developments are based on the theoretical results of Nielsen (2000, Bernoulli, 6, 457), as well as advanced sampling and optimization techniques. The proposed algorithm is computationally efficient and virtually tuning-free, making it scalable to large-scale data with many latent traits (e.g. more than five latent traits) and easy to use for practitioners. Standard errors of parameter estimation are also obtained based on the missing-information identity (Louis, 1982, Journal of the Royal Statistical Society, Series B, 44, 226). The performance of the algorithm is evaluated through simulation studies and an application to the analysis of the IPIP-NEO personality inventory. Extensions of the proposed algorithm to other latent variable models are discussed.
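The alternating structure described in the abstract — a stochastic E step that samples the latent traits, followed by an M step that updates item parameters — can be illustrated with a minimal sketch for a one-factor 2PL-style item response model. This is not the authors' implementation: it substitutes a simple Metropolis update for their adaptive-rejection-based Gibbs sampler, and plain gradient ascent for their proximal gradient step; the function name `stochastic_em_2pl` and all tuning constants are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stochastic_em_2pl(Y, n_iters=200, burn_in=100, lr=0.05, seed=0):
    """Illustrative stochastic EM for a one-factor 2PL IRT model.

    Y: (n_persons, n_items) binary response matrix.
    Returns (a, b): discrimination and intercept estimates, averaged
    over post-burn-in iterations to smooth the Monte Carlo noise.
    """
    rng = np.random.default_rng(seed)
    n, J = Y.shape
    a = np.ones(J)          # discriminations
    b = np.zeros(J)         # intercepts
    theta = rng.standard_normal(n)  # latent traits

    def complete_loglik(th):
        # Per-person complete-data log-likelihood plus N(0, 1) prior on theta.
        logits = th[:, None] * a[None, :] + b[None, :]
        return (Y * logits - np.logaddexp(0.0, logits)).sum(axis=1) - 0.5 * th**2

    a_sum, b_sum, count = np.zeros(J), np.zeros(J), 0
    for it in range(n_iters):
        # Stochastic E step: one Metropolis update of each latent trait
        # (stand-in for the paper's adaptive-rejection-based Gibbs sampler).
        prop = theta + 0.5 * rng.standard_normal(n)
        accept = np.log(rng.random(n)) < complete_loglik(prop) - complete_loglik(theta)
        theta = np.where(accept, prop, theta)

        # M step: a few gradient-ascent steps on the complete-data
        # log-likelihood (the paper uses proximal gradient descent,
        # which also handles penalty terms).
        for _ in range(5):
            logits = theta[:, None] * a[None, :] + b[None, :]
            resid = Y - sigmoid(logits)                 # (n, J)
            a += lr * (resid * theta[:, None]).mean(axis=0)
            b += lr * resid.mean(axis=0)

        # Average parameter draws after burn-in, per the diagnostics theme.
        if it >= burn_in:
            a_sum += a; b_sum += b; count += 1
    return a_sum / count, b_sum / count
```

Averaging the parameter trajectory after a burn-in period is one standard way to stabilize stochastic EM output; the paper's diagnostic procedures are aimed at choosing that burn-in size and the stopping point automatically.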