Benchmark (surveying)
Recurrent neural network
Computer science
Reset (finance)
Artificial intelligence
Sequence (biology)
Short-term memory
State (computer science)
Machine learning
Artificial neural network
Algorithm
Geodesy
Biology
Financial economics
Economics
Genetics
Geography
Authors
Felix A. Gers,Jürgen Schmidhuber,Fred Cummins
Source
Series: Perspectives in Neural Computing
Date: 1999-01-01
Pages: 133-138
Cited by: 15
Identifier
DOI:10.1007/978-1-4471-0877-1_10
Abstract
Long Short-Term Memory (LSTM [1]) can solve many tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams without explicitly marked sequence ends. Without resets, the internal state values may grow indefinitely and eventually cause the network to break down. Our remedy is an adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review an illustrative benchmark problem on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve a continual version of that problem. LSTM with forget gates, however, easily solves it in an elegant way.
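The mechanism described in the abstract can be sketched as a single LSTM step in which the forget gate multiplicatively scales the previous cell state, so a gate value near zero resets the cell. This is a minimal NumPy sketch in the now-standard formulation, not the paper's original implementation; the gate ordering and weight layout here are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One step of an LSTM cell with a forget gate.

    W has shape (4*H, X+H), b has shape (4*H,); the gate order
    (input, forget, candidate, output) is an assumed convention.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0*H:1*H])   # input gate
    f = sigmoid(z[1*H:2*H])   # forget gate: f -> 0 resets the cell
    g = np.tanh(z[2*H:3*H])   # candidate cell input
    o = sigmoid(z[3*H:4*H])   # output gate
    c = f * c_prev + i * g    # old state survives only as f allows
    h = o * np.tanh(c)
    return h, c

# Demonstration: with the forget gate driven shut (large negative
# bias), even a large stored state is discarded in one step.
rng = np.random.default_rng(0)
X, H = 3, 4
W = 0.1 * rng.standard_normal((4 * H, X + H))
b = np.zeros(4 * H)
b[H:2*H] = -20.0  # saturate the forget gate near 0
h, c = lstm_step(np.ones(X), np.zeros(H), 10.0 * np.ones(H), W, b)
```

Without this gate (effectively f = 1 at every step), the term `f * c_prev` never decays, which is exactly the unbounded state growth on continual input streams that the abstract identifies.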