Computer science
Feature (linguistics)
Artificial intelligence
Artificial neural network
Benchmark (surveying)
Machine learning
Deep learning
Reuse
Nonlinear system
Computer engineering
Ecology
Philosophy
Linguistics
Physics
Geodesy
Quantum mechanics
Biology
Geography
Authors
Xiaochuan Sun, Guan Gui, Yingqi Li, Ren Ping Liu, Yongli An
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2018-07-06
Volume/Issue: 6 (1): 679-691
Citations: 85
Identifier
DOI: 10.1109/jiot.2018.2853663
Abstract
Deep neural networks (DNNs) have been widely used in various Internet-of-Things (IoT) applications, and pursuing superior performance remains a central concern in DNN modeling. Recently, feature reuse has provided an effective means of achieving favorable nonlinear approximation performance in deep learning. Existing implementations utilize a multilayer perceptron (MLP) as the functional unit for feature reuse. However, determining the connection weights and biases of an MLP is a rather intractable problem, since the conventional back-propagation learning approach suffers from slow convergence and local optima. To address this issue, this paper develops a novel DNN based on a well-behaved alternative, reservoir computing, called reservoir in network (ResInNet). In this structure, the built-in reservoir has two notable functions. First, it acts as a bridge between any two restricted Boltzmann machines in the feature learning part of ResInNet, performing an additional feature abstraction. This reservoir-based feature translation provides excellent starting points for the subsequent nonlinear regression. Second, it serves as a nonlinear approximator, trained by a simple linear regression on the most representative (learned) features. Experimental results on various benchmark datasets show that ResInNet achieves superior nonlinear approximation performance compared to the baseline models, and exhibits excellent dynamic characteristics and memory capacity. The merits of our approach are further demonstrated on network traffic prediction in a real-world IoT application.
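The reservoir idea the abstract relies on, a fixed random recurrent layer whose readout alone is trained by linear regression, can be illustrated with a minimal echo-state-network-style sketch. This is a hypothetical illustration of the general technique, not the paper's ResInNet implementation; the sizes, spectral-radius scaling, ridge parameter, and toy task are all assumptions.

```python
# Minimal echo-state-network-style sketch of reservoir computing with a
# linear-regression readout. Illustrative only; not the ResInNet architecture.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100                      # input and reservoir sizes (assumed)
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the fixed recurrent weights so the spectral radius is below 1 (stability).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Drive the fixed random reservoir with an input sequence; collect states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task (assumed): one-step-ahead prediction of a noisy sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t) + 0.05 * rng.standard_normal(t.size)
X = run_reservoir(u[:-1])                 # reservoir states used as features
y = u[1:]                                 # next-step targets

# Only the linear readout is trained, here by closed-form ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("training MSE:", np.mean((pred - y) ** 2))
```

The design point this sketch highlights is the one the abstract contrasts with MLP-based feature reuse: the recurrent weights stay fixed after random initialization, so training reduces to a convex linear fit with no back-propagation, avoiding slow convergence and local optima.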