Nonlinear autoregressive exogenous model
Computer science
Recurrent neural network
Autoregressive model
Time series
Artificial intelligence
Artificial neural network
Machine learning
Econometrics
Mathematics
Authors
Yao Qin,Dongjin Song,Haifeng Chen,Wei Cheng,Guofei Jiang,Garrison W. Cottrell
Source
Journal: Cornell University - arXiv
Date: 2017-01-01
Citations: 369
Identifier
DOI:10.48550/arxiv.1704.02971
Abstract
The Nonlinear autoregressive exogenous (NARX) model, which predicts the current value of a time series based upon its previous values as well as the current and past values of multiple driving (exogenous) series, has been studied for decades. Despite the fact that various NARX models have been developed, few of them can capture the long-term temporal dependencies appropriately and select the relevant driving series to make predictions. In this paper, we propose a dual-stage attention-based recurrent neural network (DA-RNN) to address these two issues. In the first stage, we introduce an input attention mechanism to adaptively extract relevant driving series (a.k.a., input features) at each time step by referring to the previous encoder hidden state. In the second stage, we use a temporal attention mechanism to select relevant encoder hidden states across all time steps. With this dual-stage attention scheme, our model can not only make predictions effectively, but can also be easily interpreted. Thorough empirical studies based upon the SML 2010 dataset and the NASDAQ 100 Stock dataset demonstrate that the DA-RNN can outperform state-of-the-art methods for time series prediction.
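The two attention stages described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the DA-RNN learns its attention scores with trainable networks conditioned on LSTM states, whereas here the scoring matrices `W` and `U` are hypothetical placeholders that only demonstrate the attention arithmetic (score, softmax, reweight).

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def input_attention(x_t, h_prev, W):
    """Stage 1: weight each driving (exogenous) series at time t.

    x_t    : (n,)   current values of the n driving series
    h_prev : (m,)   previous encoder hidden state
    W      : (n, m) hypothetical scoring weights (learned in the paper)
    """
    scores = W @ h_prev            # one relevance score per driving series
    alpha = softmax(scores)        # attention weights sum to 1
    return alpha * x_t, alpha      # adaptively reweighted input

def temporal_attention(H, d_prev, U):
    """Stage 2: select relevant encoder hidden states across all steps.

    H      : (T, m) encoder hidden states over T time steps
    d_prev : (p,)   previous decoder hidden state
    U      : (m, p) hypothetical scoring weights (learned in the paper)
    """
    scores = H @ (U @ d_prev)      # one score per time step
    beta = softmax(scores)
    context = beta @ H             # (m,) weighted context vector
    return context, beta
```

Because both stages return their softmax weights (`alpha`, `beta`), a user can inspect which driving series and which time steps the model attended to, which is the interpretability property the abstract claims for the dual-stage scheme.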