Topics: Softmax function; Artificial intelligence; Cluster analysis; Deep learning; Computer science; Feature learning; Subspace topology; Deep neural network; Pattern recognition (psychology); Entropy (arrow of time); Machine learning; Cross-entropy; Feature (linguistics); Artificial neural network; High-dimensional data clustering; Physics; Philosophy; Quantum mechanics; Linguistics
Authors
Sheng Wu, Wei-Shi Zheng
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2021-01-25
Volume/Issue: 33 (2): 774-788
Citations: 8
Identifiers
DOI: 10.1109/tnnls.2020.3029033
Abstract
While feature learning by deep neural networks is currently widely used, it is still very challenging to perform this task, given the very limited quantity of labeled data. To solve this problem, we propose to unite subspace clustering with deep semisupervised feature learning to form a unified learning framework to pursue feature learning by subspace clustering. More specifically, we develop a deep entropy-sparsity subspace clustering (deep ESSC) model, which forces a deep neural network to learn features using subspace clustering constrained by our designed entropy-sparsity scheme. The model can inherently harmonize deep semisupervised feature learning and subspace clustering simultaneously by the proposed self-similarity preserving strategy. To optimize the deep ESSC model, we introduce two unconstrained variables to eliminate the two constraints via softmax functions. We provide a general algebraic-treatment scheme for solving the proposed deep ESSC model. Extensive experiments with comprehensive analysis substantiate that our deep ESSC model is more effective than the related methods.
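The abstract notes that the two constraints in the deep ESSC model are eliminated by reparameterizing with unconstrained variables passed through softmax functions. The paper's exact objective is not reproduced here, but the reparameterization idea can be sketched as follows, assuming (hypothetically) a simplex constraint on a self-expressive coefficient vector and an entropy term that rewards sparse (low-entropy) coefficients:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def entropy(c, eps=1e-12):
    # Shannon entropy of the coefficient vector; a low value means
    # the mass is concentrated on few entries (an entropy-sparsity effect).
    return -np.sum(c * np.log(c + eps))

# z is unconstrained and can be optimized freely (e.g., by gradient descent);
# c = softmax(z) automatically satisfies c_i >= 0 and sum(c) == 1,
# so the explicit constraints disappear from the optimization problem.
z = np.array([2.0, -1.0, 0.5])
c = softmax(z)
```

Here `z`, `c`, and the specific regularizer are illustrative placeholders, not the paper's notation; the point is only that composing with softmax turns a constrained problem into an unconstrained one, which is the treatment the abstract describes.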