Computer science
Artificial neural network
Algorithm
Recurrent neural network
Artificial intelligence
Basis (linear algebra)
Machine learning
Wake-sleep algorithm
Mathematics
Generalization error
Geometry
Authors
Ronald J. Williams, David Zipser
Identifier
DOI: 10.1162/neco.1989.1.2.270
Abstract
The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks. These algorithms have (1) the advantage that they do not require a precisely defined training interval, operating while the network runs; and (2) the disadvantage that they require nonlocal communication in the network being trained and are computationally expensive. These algorithms allow networks having recurrent connections to learn complex tasks that require the retention of information over time periods having either fixed or indefinite length.
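The gradient-following scheme the abstract describes (known as real-time recurrent learning, RTRL) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the class name, variable names, and the toy sigmoid network are our own choices. The core idea is a sensitivity tensor `p[k, i, j] = ∂y_k/∂W_ij` carried forward in time, so learning proceeds while the network runs, with no fixed training interval.

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

class RTRL:
    """Minimal RTRL sketch for a fully recurrent sigmoid network.

    n recurrent units, m external inputs. At each step the concatenated
    vector z(t) = [y(t); x(t)] (length n+m) feeds every unit through the
    weight matrix W of shape (n, n+m).
    """

    def __init__(self, n, m, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.n, self.m, self.lr = n, m, lr
        self.W = rng.normal(scale=0.1, size=(n, n + m))
        self.y = np.zeros(n)                       # unit outputs
        self.p = np.zeros((n, n, n + m))           # sensitivities dy_k/dW_ij

    def step(self, x, target, mask):
        """One network step plus an online weight update.

        mask selects which units receive a teacher signal; errors on
        unmasked units are zero.
        """
        z = np.concatenate([self.y, x])
        s = self.W @ z
        y_new = sigmoid(s)
        fp = y_new * (1.0 - y_new)                 # sigmoid derivative f'(s_k)

        # Sensitivity recursion:
        # p'[k,i,j] = f'(s_k) * (sum_l W[k,l] p[l,i,j] + delta_{k,i} z[j])
        Wrec = self.W[:, :self.n]                  # recurrent part of W
        prop = np.einsum('kl,lij->kij', Wrec, self.p)
        for k in range(self.n):
            prop[k, k, :] += z
        p_new = fp[:, None, None] * prop

        # Gradient-following update: dW_ij = lr * sum_k e_k * p[k,i,j]
        e = mask * (target - y_new)
        self.W += self.lr * np.einsum('k,kij->ij', e, p_new)

        self.y, self.p = y_new, p_new
        return y_new, float(np.sum(e ** 2))
```

The sketch also makes the abstract's two stated drawbacks concrete: updating every weight requires the full sensitivity tensor (nonlocal communication), and storing and propagating `p` costs O(n³) memory and roughly O(n⁴) operations per time step for a network of n units.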