Dimensionality reduction
Nonparametric statistics
Artificial intelligence
Representation (politics)
Dimension (graph theory)
Benchmark (surveying)
Nonparametric regression
Generalization
Machine learning
Intrinsic dimension
External data representation
Context (archaeology)
Computer science
Deep learning
Feature learning
Curse of dimensionality
Mathematics
Statistics
Geodesy
Mathematical analysis
Paleontology
Politics
Biology
Law
Pure mathematics
Geography
Political science
Authors
Jian Huang, Yuling Jiao, Xu Liao, Jin Liu, Zhou Yu
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Citations: 3
Identifier
DOI: 10.48550/arxiv.2006.05865
Abstract
The goal of supervised representation learning is to construct effective data representations for prediction. Among all the characteristics of an ideal nonparametric representation of high-dimensional complex data, sufficiency, low dimensionality and disentanglement are some of the most essential ones. We propose a deep dimension reduction approach to learning representations with these characteristics. The proposed approach is a nonparametric generalization of the sufficient dimension reduction method. We formulate the ideal representation learning task as that of finding a nonparametric representation that minimizes an objective function characterizing conditional independence and promoting disentanglement at the population level. We then estimate the target representation at the sample level nonparametrically using deep neural networks. We show that the estimated deep nonparametric representation is consistent in the sense that its excess risk converges to zero. Our extensive numerical experiments using simulated and real benchmark data demonstrate that the proposed methods perform better than several existing dimension reduction methods and standard deep learning models in classification and regression tasks.
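To make the abstract's recipe concrete, below is a minimal PyTorch sketch of the general idea: learn a low-dimensional representation R(x) with a deep network, trained so that R(x) carries the information needed to predict y (sufficiency) while its coordinates stay uncorrelated (a crude proxy for disentanglement). Both loss terms here are stand-ins chosen for illustration, not the paper's actual population-level objective: the conditional-independence criterion is approximated by a squared-error prediction loss, and the disentanglement term by an off-diagonal covariance penalty with an arbitrary weight of 0.1. All names (DeepRepresentation, disentangle_penalty, the toy data) are hypothetical.

```python
import torch
import torch.nn as nn

class DeepRepresentation(nn.Module):
    """Nonparametric representation R: R^p -> R^d, parameterized by an MLP."""
    def __init__(self, in_dim: int, rep_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, rep_dim),
        )

    def forward(self, x):
        return self.net(x)

def disentangle_penalty(r: torch.Tensor) -> torch.Tensor:
    """Sum of squared off-diagonal entries of the sample covariance of R(x);
    pushes the learned coordinates toward being mutually uncorrelated."""
    rc = r - r.mean(dim=0, keepdim=True)
    cov = rc.t() @ rc / (r.shape[0] - 1)
    off_diag = cov - torch.diag(torch.diag(cov))
    return (off_diag ** 2).sum()

# Toy usage: x in R^10, scalar response y, 3-dimensional representation.
torch.manual_seed(0)
x = torch.randn(256, 10)
y = (x[:, 0] * x[:, 1] + x[:, 2]).unsqueeze(1)

rep = DeepRepresentation(in_dim=10, rep_dim=3)
head = nn.Linear(3, 1)  # prediction head stacked on top of R(x)
opt = torch.optim.Adam(list(rep.parameters()) + list(head.parameters()), lr=1e-3)

for step in range(500):
    opt.zero_grad()
    r = rep(x)
    pred_loss = nn.functional.mse_loss(head(r), y)   # sufficiency surrogate
    loss = pred_loss + 0.1 * disentangle_penalty(r)  # weight 0.1 is arbitrary
    loss.backward()
    opt.step()
```

In this sketch the trade-off between sufficiency and disentanglement is controlled by a single penalty weight; the paper instead characterizes both properties through a population-level objective and proves that the excess risk of the sample-level estimator converges to zero.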