Autoencoder
Invertible matrix
Reduction (mathematics)
Dimension (graph theory)
Dimensionality reduction
Image (mathematics)
Mathematics
Artificial intelligence
Computer science
Algorithm
Pattern recognition (psychology)
Pure mathematics
Geometry
Artificial neural network
Authors
Yimin Yang,Q. M. Jonathan Wu,Yaonan Wang
Source
Journal: IEEE Transactions on Systems, Man, and Cybernetics
[Institute of Electrical and Electronics Engineers]
Date: 2016-12-24
Volume & issue: 48 (7): 1065-1079
Citations: 89
Identifier
DOI:10.1109/tsmc.2016.2637279
Abstract
The extreme learning machine (ELM), originally proposed for "generalized" single-hidden-layer feedforward neural networks, provides efficient unified learning solutions for regression and classification applications. Although it offers promising performance and robustness and has been used in various applications, the single-layer architecture may lack effectiveness when applied to natural signals. To overcome this shortcoming, this work introduces a new architecture based on a multilayer network framework. The significant contributions of this paper are as follows: 1) unlike existing multilayer ELMs, in which hidden nodes are obtained randomly, in this paper all hidden layers with invertible functions are calculated by pulling the network output back and putting it into the hidden layers. Thus, feature learning is enriched by additional information, which results in better performance; 2) in contrast to existing multilayer network methods, which are usually effective for classification applications, the proposed architecture is implemented for dimension reduction and image reconstruction; and 3) unlike other iterative learning-based deep networks (DL), the hidden layers of the proposed method are obtained via four steps. Therefore, it has much better learning efficiency than DL. Experimental results on 33 datasets indicate that, in comparison with other existing dimension reduction techniques, the proposed method performs competitively or better, with fast training speeds.
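To make the contrast in contribution 1) concrete, the sketch below shows the conventional ELM-autoencoder baseline the abstract refers to: hidden weights are drawn randomly and only the output weights are solved in closed form by a ridge-regularized least-squares fit that reconstructs the input. This is a minimal illustration of the baseline, not the paper's four-step algorithm (which replaces the random hidden layer with one computed by pulling the network output back); the function name, activation choice, and regularization constant are assumptions.

```python
import numpy as np

def elm_autoencoder_layer(X, n_hidden, reg=1e-3, seed=None):
    """Baseline ELM-autoencoder layer (illustrative sketch).

    Hidden mapping is random (the scheme the paper improves on);
    output weights beta are the closed-form ridge solution that
    best reconstructs X from the hidden activations H.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    # Ridge-regularized least squares: beta = (H^T H + reg*I)^-1 H^T X
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return H, beta

# Usage: compress 8-D data through 3 hidden units, then reconstruct.
X = np.random.default_rng(0).standard_normal((100, 8))
H, beta = elm_autoencoder_layer(X, n_hidden=3, reg=1e-3, seed=0)
X_hat = H @ beta  # reconstruction of X from the 3-D hidden code
```

Because only `beta` is fitted (a single linear solve), training is non-iterative, which is the efficiency property the abstract contrasts with iterative deep networks.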