Computation
Computer science
Binary number
Artificial neural network
Algorithm
Classification
Neural computation model
Differentiable function
Theoretical computer science
Synthetic data
Sequence (biology)
Noise (video)
Artificial intelligence
Mathematics
Arithmetic
Mathematical analysis
Biology
Image (mathematics)
Genetics
Source
Journal: Cornell University - arXiv
Date: 2016-01-01
Citations: 351
Identifiers
DOI: 10.48550/arxiv.1603.08983
Abstract
This paper introduces Adaptive Computation Time (ACT), an algorithm that allows recurrent neural networks to learn how many computational steps to take between receiving an input and emitting an output. ACT requires minimal changes to the network architecture, is deterministic and differentiable, and does not add any noise to the parameter gradients. Experimental results are provided for four synthetic problems: determining the parity of binary vectors, applying binary logic operations, adding integers, and sorting real numbers. Overall, performance is dramatically improved by the use of ACT, which successfully adapts the number of computational steps to the requirements of the problem. We also present character-level language modelling results on the Hutter prize Wikipedia dataset. In this case ACT does not yield large gains in performance; however it does provide intriguing insight into the structure of the data, with more computation allocated to harder-to-predict transitions, such as spaces between words and ends of sentences. This suggests that ACT or other adaptive computation methods could provide a generic method for inferring segment boundaries in sequence data.
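The abstract describes ACT's core mechanism: at each input the recurrent network takes a variable number of "ponder" steps, each emitting a halting probability, and stops once the cumulative probability crosses a threshold; the output is the halting-weighted mean of the intermediate states. A minimal sketch of that halting loop, with illustrative function names (`step_fn`, `halt_fn` stand in for the network's update and halting units; they are not from the paper's code):

```python
def act_halting(state, step_fn, halt_fn, eps=0.01, max_steps=100):
    """Sketch of the Adaptive Computation Time halting loop.

    step_fn: one ponder step, mapping a state to the next state.
    halt_fn: maps a state to a halting probability in (0, 1).
    Steps are taken until the cumulative halting probability reaches
    1 - eps; the last step receives the remainder so the weights sum
    to 1, which keeps the mechanism deterministic and differentiable.
    """
    states, weights = [], []
    cumulative = 0.0
    for n in range(1, max_steps + 1):
        state = step_fn(state)
        p = halt_fn(state)
        if cumulative + p >= 1.0 - eps or n == max_steps:
            # Final step: use the remainder 1 - cumulative as its weight.
            weights.append(1.0 - cumulative)
            states.append(state)
            break
        weights.append(p)
        states.append(state)
        cumulative += p
    # Output is the halting-probability-weighted mean of the states.
    output = sum(w * s for w, s in zip(weights, states))
    return output, len(states)
```

For example, with a step function that increments a scalar state and a constant halting probability of 0.4, the loop halts after three steps (0.4 + 0.4 crosses 1 - eps on the third step) and returns 0.4·1 + 0.4·2 + 0.2·3 = 1.8. The number of steps taken is what ACT penalizes during training, which is how the network learns to ponder longer on harder transitions.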