Sufficient dimension reduction
Dimension (graph theory)
Sliced inverse regression
Mathematics
Dimensionality reduction
Kernel (algebra)
Reduction (mathematics)
Mathematical optimization
Least-squares function approximation
Variance reduction
Effective dimension
Variance (accounting)
Algorithm
Applied mathematics
Regression
Computer science
Statistics
Artificial intelligence
Combinatorics
Geometry
Accounting
Business
Estimator
Hausdorff dimension
Monte Carlo method
Pure mathematics
Authors
Min Cai, Ruige Zhuang, Zhou Yu, Ping Wu
Source
Journal: Stat
[Wiley]
Date: 2022-01-15
Volume/Issue: 11 (1)
Abstract
Sufficient dimension reduction aims to project high‐dimensional predictors onto a low‐dimensional space without loss of information about the responses. Classical methods, such as sliced inverse regression, sliced average variance estimation and directional regression, are the backbone of many modern sufficient dimension reduction methods and have attracted considerable research interest. However, the efficiency of such methods deteriorates when dealing with sparse models. Existing sparse sufficient dimension reduction methods in the literature rely on given models or strict sparsity assumptions. To relax these model and sparsity assumptions, in this paper we define a general least squares objective function, applicable to the kernel matrices of all classical sufficient dimension reduction methods, and propose a sufficient dimension reduction method based on Mallows model averaging. Furthermore, an iterative least squares algorithm is used to obtain the sample estimates. Our method demonstrates excellent performance in simulations.
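To make the classical backbone concrete, the following is a minimal sketch of sliced inverse regression (SIR), one of the classical methods the abstract names. It is not the authors' Mallows-averaging estimator; the function name `sir_directions` and all parameter choices (number of slices, target dimension) are illustrative assumptions, and the kernel matrix it eigen-decomposes is the standard Cov(E[Z|Y]) of SIR.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Minimal sliced inverse regression (SIR) sketch.

    Estimates d directions spanning the central subspace, under the
    usual SIR linearity condition on the predictor distribution.
    Names and defaults here are illustrative, not from the paper.
    """
    n, p = X.shape
    # Standardize the predictors: Z = Sigma^{-1/2} (X - mean)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice the data by sorted response and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # SIR kernel matrix: weighted covariance of the slice means,
    # an estimate of Cov(E[Z | Y])
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    beta = Sigma_inv_sqrt @ v[:, ::-1][:, :d]
    return beta
```

For example, with data generated from a single-index model y = Xb + noise, the leading SIR direction should be nearly collinear with b. Other classical methods mentioned above (sliced average variance estimation, directional regression) follow the same template but replace the kernel matrix M, which is what makes a single least squares objective over all such kernels natural.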