Principal component analysis, Dimensionality reduction, Artificial neural network, Redundancy (engineering), Dimension (graph theory), Nonlinear system, Implementation, Pattern recognition (psychology), Computer science, Sparse PCA, Reduction (mathematics), Artificial intelligence, Representation (politics), Data reduction, Algorithm, Mathematics, Data mining, Politics, Operating system, Physics, Quantum mechanics, Political science, Programming language, Law, Pure mathematics, Geometry
Authors
Nandakishore Kambhatla, Todd K. Leen
Identifiers
DOI:10.1162/neco.1997.9.7.1493
Abstract
Reducing or eliminating statistical redundancy between the components of high-dimensional vector data enables a lower-dimensional representation without significant loss of information. Recognizing the limitations of principal component analysis (PCA), researchers in the statistics and neural network communities have developed nonlinear extensions of PCA. This article develops a local linear approach to dimension reduction that provides accurate representations and is fast to compute. We exercise the algorithms on speech and image data, and compare performance with PCA and with neural network implementations of nonlinear PCA. We find that both nonlinear techniques can provide more accurate representations than PCA and show that the local linear techniques outperform neural network implementations.
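The local linear approach the abstract describes can be sketched as: partition the data into regions, then fit an ordinary PCA inside each region and reconstruct every point from its own region's low-dimensional subspace. The sketch below is illustrative only, not the authors' implementation; plain k-means is assumed as the partitioner, and all function names and parameters (`local_pca_reconstruct`, `n_clusters`, `n_components`) are this sketch's own.

```python
import numpy as np

def local_pca_reconstruct(X, n_clusters=4, n_components=1, n_iter=20, seed=0):
    """Local linear PCA sketch: k-means partition, then per-cluster PCA.

    Assumptions (not from the paper): Euclidean k-means as the partitioner,
    SVD for the per-cluster principal directions.
    """
    rng = np.random.default_rng(seed)
    # Initialize cluster centers from random data points, then run k-means.
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(0)
    # Reconstruct each point from its cluster's n_components-dim subspace.
    X_hat = np.empty_like(X)
    for k in range(n_clusters):
        mask = labels == k
        if not mask.any():
            continue
        mu = X[mask].mean(0)
        _, _, Vt = np.linalg.svd(X[mask] - mu, full_matrices=False)
        P = Vt[:n_components]          # leading principal directions
        X_hat[mask] = mu + (X[mask] - mu) @ P.T @ P
    return X_hat

def global_pca_reconstruct(X, n_components=1):
    """Baseline: single global PCA reconstruction for comparison."""
    mu = X.mean(0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_components]
    return mu + (X - mu) @ P.T @ P
```

On data lying near a curved manifold (e.g. points on a circle), the per-cluster reconstruction error is typically much lower than global PCA's at the same per-point dimension, which mirrors the comparison the abstract reports on speech and image data.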