Keywords
Computer science
Regret
Recurrent neural network
Convergence (economics)
Artificial intelligence
Regression
Simplicity (philosophy)
Machine learning
Set (abstract data type)
Artificial neural network
Term (time)
Algorithm
Statistics
Mathematics
Philosophy
Physics
Economics
Epistemology
Programming language
Quantum mechanics
Economic growth
Authors
N. Mert Vural, Fatih İlhan, Selim F. Yilmaz, Salih Ergüt, Süleyman S. Kozat
Identifier
DOI: 10.1109/tnnls.2021.3086029
Abstract
Recurrent neural networks (RNNs) are widely used for online regression due to their ability to generalize nonlinear temporal dependencies. As an RNN model, long short-term memory networks (LSTMs) are commonly preferred in practice, since they can learn long-term dependencies while avoiding the vanishing-gradient problem. However, owing to their large number of parameters, LSTMs require considerably longer training times than simple RNNs (SRNNs). In this article, we efficiently achieve the online regression performance of LSTMs with SRNNs. To this end, we introduce a first-order training algorithm whose time complexity is linear in the number of parameters. We show that SRNNs trained with our algorithm provide regression performance very similar to that of LSTMs in two to three times shorter training time. We support these experimental results with a theoretical analysis that establishes regret bounds on the convergence rate of our algorithm. Through an extensive set of experiments, we verify our theoretical work and demonstrate significant performance improvements of our algorithm over LSTMs and other state-of-the-art learning models.
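To make the setting concrete, below is a minimal NumPy sketch of online regression with an SRNN trained by a generic first-order (SGD-style) update, whose per-step cost is linear in the number of parameters, as the complexity claim in the abstract describes. This is not the authors' algorithm: the one-step gradient truncation, the synthetic data, the dimensions, and the learning rate are all illustrative assumptions.

```python
import numpy as np

# Illustrative online SRNN regression with a first-order update.
# All hyperparameters and the data stream are hypothetical.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8                          # input / hidden sizes (assumption)
W_h = rng.normal(0, 0.1, (n_hid, n_hid))    # recurrent weights
W_x = rng.normal(0, 0.1, (n_hid, n_in))     # input weights
w_o = rng.normal(0, 0.1, n_hid)             # linear readout weights
lr = 0.01                                   # learning rate (assumption)

h = np.zeros(n_hid)
for t in range(1000):                       # data arrives one sample at a time
    x_t = rng.normal(size=n_in)             # incoming feature vector
    y_t = x_t.sum()                         # synthetic target for illustration

    h_new = np.tanh(W_h @ h + W_x @ x_t)    # SRNN state update
    y_hat = w_o @ h_new                     # prediction before seeing y_t
    err = y_hat - y_t                       # squared-loss residual

    # One-step (truncated) gradients: each update touches every parameter
    # exactly once, so the cost per time step is linear in the
    # number of parameters.
    g_pre = err * w_o * (1.0 - h_new**2)    # backprop through tanh
    w_o -= lr * err * h_new
    W_h -= lr * np.outer(g_pre, h)
    W_x -= lr * np.outer(g_pre, x_t)

    h = h_new                               # carry state to the next step
```

In the online-learning framework the abstract refers to, such an algorithm is typically evaluated by its regret, i.e., the cumulative loss of the online predictions minus the cumulative loss of the best fixed predictor chosen in hindsight; the paper's regret bounds quantify how fast this gap grows with the stream length.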