Prior probability
Covariate
Hyperparameter
Lasso (statistics)
Bayesian probability
Latent variable
Statistics
Mathematics
Regularization (statistics)
Sample size determination
Elastic net regularization
Econometrics
Computer science
Artificial intelligence
Algorithm
Regression
Authors
Lijin Zhang, Xinya Liang
Abstract
Integrating regularization methods into structural equation modeling is gaining increasing popularity. The purpose of regularization is to improve variable selection, model estimation, and prediction accuracy. In this study, we aim to: (a) compare Bayesian regularization methods for exploring covariate effects in multiple-indicators multiple-causes models, (b) examine the sensitivity of results to hyperparameter settings of penalty priors, and (c) investigate prediction accuracy through cross-validation. The Bayesian regularization methods examined included: ridge, lasso, adaptive lasso, spike-and-slab prior (SSP) and its variants, and horseshoe and its variants. Sparse solutions were developed for the structural coefficient matrix that contained only a small portion of nonzero path coefficients characterizing the effects of selected covariates on the latent variable. Results from the simulation study showed that compared to diffuse priors, penalty priors were advantageous in handling small sample sizes and collinearity among covariates. Priors with only the global penalty (ridge and lasso) yielded higher model convergence rates and power, whereas priors with both the global and local penalties (horseshoe and SSP) provided more accurate parameter estimates for medium and large covariate effects. The horseshoe and SSP improved accuracy in predicting factor scores, while achieving more parsimonious models. (PsycInfo Database Record (c) 2023 APA, all rights reserved).
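The abstract contrasts priors with only a global penalty (ridge, lasso) against global-local priors (horseshoe, spike-and-slab). As a minimal illustrative sketch of that distinction, the log-densities below show how each prior penalizes a path coefficient: the ridge (normal) and lasso (Laplace) priors apply one shared scale, while the horseshoe-style conditional normal multiplies a shared global scale by a per-coefficient local scale, and the spike-and-slab mixes a narrow "spike" with a wide "slab". The function names and parameterizations are assumptions for illustration, not the authors' implementation (which would be fit with MCMC in a full SEM).

```python
import math

def normal_pdf(x, sd):
    """Density of a Normal(0, sd^2) distribution at x."""
    return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def ridge_logprior(beta, tau):
    # Normal(0, tau^2): one global scale tau -> quadratic shrinkage
    return math.log(normal_pdf(beta, tau))

def lasso_logprior(beta, b):
    # Laplace(0, b): one global scale b -> absolute-value shrinkage
    return -abs(beta) / b - math.log(2.0 * b)

def horseshoe_logprior(beta, tau, lam):
    # Conditional Normal(0, (tau * lam)^2): global scale tau shared by all
    # coefficients, local scale lam adapting per coefficient, so a large
    # lam lets a big effect escape the global shrinkage
    return math.log(normal_pdf(beta, tau * lam))

def spike_slab_logprior(beta, slab_prob, spike_sd, slab_sd):
    # Two-component mixture: a narrow spike near zero (small coefficients)
    # and a wide slab (large coefficients), mixed with weight slab_prob
    return math.log((1.0 - slab_prob) * normal_pdf(beta, spike_sd)
                    + slab_prob * normal_pdf(beta, slab_sd))

if __name__ == "__main__":
    # A sizeable coefficient is penalized far less under the horseshoe-style
    # prior when its local scale is large than when it is small.
    print(horseshoe_logprior(2.0, 1.0, 5.0), horseshoe_logprior(2.0, 1.0, 0.1))
```

The global-local structure is what the abstract credits with more accurate estimates for medium and large effects: the local scale can grow for genuinely nonzero paths while the global scale shrinks the rest toward zero.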