Recurrent Neural Network
Computer science
Term (time)
Sequence (biology)
Artificial intelligence
Artificial neural network
Selection (genetic algorithm)
Dual (grammatical number)
Divide and conquer algorithms
Process (computing)
Machine learning
Algorithm
Biology
Operating system
Physics
Literature
Art
Quantum mechanics
Genetics
Authors
Chenpeng Zhang, Shuai Li, Mao Ye, Ce Zhu, Xue Li
Identifier
DOI: 10.1016/j.neucom.2021.09.043
Abstract
Recurrent neural networks (RNNs) are widely used as a memory model for sequence-related problems. Many RNN variants have been proposed to mitigate the gradient problems of training RNNs and to process long sequences. Despite these classical models, capturing long-term dependence while responding to short-term changes remains a challenge. To address this problem, we propose a new model named Dual Recurrent Neural Networks (DuRNN). The DuRNN consists of two parts: one learns short-term dependence and the other progressively learns long-term dependence. The first part is a recurrent neural network with constrained full recurrent connections that handles short-term dependence in a sequence and generates short-term memory. The other part is a recurrent neural network with independent recurrent connections that learns long-term dependence and generates long-term memory. A selection mechanism between the two parts transfers the needed long-term information to the independent neurons. Multiple modules can be stacked into a multi-layer model for better performance. Our contributions are: 1) a new recurrent model based on the divide-and-conquer strategy that learns long- and short-term dependence separately, and 2) a selection mechanism that enhances the separation and learning of dependences at different temporal scales. Both theoretical analysis and extensive experiments validate the performance of our model. Experimental results indicate that the proposed DuRNN handles not only very long sequences (over 5,000 time steps) but also short sequences very well.
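To make the architecture described in the abstract concrete, the sketch below shows one DuRNN-style module in PyTorch, assuming a standard fully recurrent RNN for the short-term part, a learned sigmoid gate as the selection mechanism, and an IndRNN-style elementwise recurrence for the long-term part. The class and parameter names (DuRNNBlock, select, u) and the exact gate form are illustrative assumptions under the abstract's description, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class DuRNNBlock(nn.Module):
    """One DuRNN-style module: a fully recurrent part for short-term
    dependence, a selection gate, and an independently recurrent part
    (IndRNN-style) for long-term dependence.

    Layer sizes, the gate form, and activation choices are assumptions
    for illustration, not the paper's reference code.
    """

    def __init__(self, input_size, hidden_size):
        super().__init__()
        # Short-term part: standard fully connected recurrence.
        self.short_rnn = nn.RNN(input_size, hidden_size,
                                nonlinearity="tanh", batch_first=True)
        # Selection mechanism: a learned sigmoid gate deciding which
        # short-term information is passed on (assumed form).
        self.select = nn.Linear(hidden_size, hidden_size)
        # Long-term part: independent recurrent connections, i.e. one
        # scalar recurrent weight per neuron (as in IndRNN).
        self.in_proj = nn.Linear(hidden_size, hidden_size)
        self.u = nn.Parameter(torch.empty(hidden_size).uniform_(0, 1))

    def forward(self, x):
        # x: (batch, time, input_size)
        short_mem, _ = self.short_rnn(x)                # short-term memory
        gated = torch.sigmoid(self.select(short_mem)) * short_mem
        # Elementwise recurrence over time builds long-term memory.
        h = x.new_zeros(x.size(0), self.u.numel())
        long_mem = []
        for t in range(x.size(1)):
            h = torch.relu(self.in_proj(gated[:, t]) + self.u * h)
            long_mem.append(h)
        return torch.stack(long_mem, dim=1)


# Blocks can be stacked to form a multi-layer model:
model = nn.Sequential(DuRNNBlock(8, 32), DuRNNBlock(32, 32))
out = model(torch.randn(4, 100, 8))   # (batch=4, T=100, features=8)
print(out.shape)                      # torch.Size([4, 100, 32])
```

Under these assumptions, the divide-and-conquer idea shows up in the data flow: the gate limits what the fully recurrent part feeds into the independent neurons, so each neuron's scalar recurrent weight can settle on a long time scale without short-term interference.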