Dimensionality reduction
Linear subspace
Classifier (UML)
Computer science
High-dimensional data clustering
Outer product
Nonparametric statistics
High-dimensional
Sliced inverse regression
Random projection
Dimension (graph theory)
Artificial intelligence
Pattern recognition (psychology)
Projection (relational algebra)
Mathematics
Data mining
Algorithm
Statistics
Tensor product
Combinatorics
Cluster analysis
Pure mathematics
Geometry
Authors
Zhenzhai Cai, Yingcun Xia, Weiqiang Hang
Identifiers
DOI: 10.1080/01621459.2021.2003202
Abstract
Sufficient dimension reduction (SDR) has progressed steadily. However, its ability to improve general function estimation or classification has not been well received, especially for high-dimensional data. In this article, we first devise a local linear smoother for high dimensional nonparametric regression and then utilise it in the outer-product-of-gradient (OPG) approach of SDR. We call the method high-dimensional OPG (HOPG). To apply SDR to classification in high-dimensional data, we propose an ensemble classifier by aggregating results of classifiers that are built on subspaces reduced by the random projection and HOPG consecutively from the data. Asymptotic results for both HOPG and the classifier are established. Superior performance over the existing methods is demonstrated in simulations and real data analyses. Supplementary materials for this article are available online.
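The following is a minimal sketch of the classical outer-product-of-gradients (OPG) idea that HOPG builds on, not the authors' implementation: estimate the gradient of E[y|x] at each sample point with a kernel-weighted local linear fit, average the outer products of those gradients, and take the leading eigenvectors as the reduced directions. The bandwidth, the ridge term, and the toy single-index example are assumptions for illustration; the high-dimensional refinements and the random-projection ensemble classifier of the paper are omitted.

```python
import numpy as np

def opg_directions(X, y, n_dirs=1, bandwidth=1.0):
    """Classical OPG estimate of the dimension-reduction subspace (sketch)."""
    n, p = X.shape
    M = np.zeros((p, p))
    for i in range(n):
        d = X - X[i]                                           # predictors centered at X[i]
        w = np.exp(-np.sum(d**2, axis=1) / (2 * bandwidth**2)) # Gaussian kernel weights
        Z = np.hstack([np.ones((n, 1)), d])                    # local linear design: intercept + slopes
        # Weighted least squares; small ridge term keeps the solve stable.
        A = Z.T @ (w[:, None] * Z) + 1e-8 * np.eye(p + 1)
        beta = np.linalg.solve(A, Z.T @ (w * y))
        grad = beta[1:]                                        # estimated gradient at X[i]
        M += np.outer(grad, grad)
    M /= n
    # Leading eigenvectors of the averaged outer product span the estimated subspace.
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, ::-1][:, :n_dirs]

# Toy single-index model y = (x @ b0)^3 + noise; the recovered direction
# should be roughly proportional to b0 (up to sign).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
b0 = np.array([1.0, -1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ b0) ** 3 + 0.1 * rng.standard_normal(200)
B = opg_directions(X, y, n_dirs=1, bandwidth=1.0)
print(B[:, 0])
```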