Keywords
Dynamical systems theory, Bounded function, Nonlinear system, Constant (computer programming), Trajectory, Series (stratigraphy), Artificial neural network, Recurrent neural network, Computer science, Dynamical system (definition), Ordinary differential equation, Metric (data warehouse), Differential equation, Mathematics, Applied mathematics, State space, Control theory (sociology), Artificial intelligence, Mathematical analysis, Physics, Database, Biology, Statistics, Quantum mechanics, Paleontology, Programming language, Control (management), Astronomy
Authors
Ramin Hasani, Mathias Lechner, Alexander Amini, Daniela Rus, Radu Grosu
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2021-05-18
Volume/Issue: 35 (9): 7657-7666
Citations: 101
Identifier
DOI: 10.1609/aaai.v35i9.16936
Abstract
We introduce a new class of time-continuous recurrent neural network models. Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates. The resulting models represent dynamical systems with varying (i.e., liquid) time-constants coupled to their hidden state, with outputs being computed by numerical differential equation solvers. These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations, and give rise to improved performance on time-series prediction tasks. To demonstrate these properties, we first take a theoretical approach to find bounds over their dynamics, and compute their expressive power by the trajectory length measure in a latent trajectory space. We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs.
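For concreteness, the dynamics the abstract describes are, in the paper's notation, the ODE

\[
\frac{d\mathbf{x}(t)}{dt} = -\Big[\frac{1}{\tau} + f\big(\mathbf{x}(t), \mathbf{I}(t), t, \theta\big)\Big] \odot \mathbf{x}(t) + f\big(\mathbf{x}(t), \mathbf{I}(t), t, \theta\big) \odot A,
\]

so the effective time constant \(\tau_{\text{sys}} = \tau / (1 + \tau f)\) varies ("liquid") with the hidden state \(\mathbf{x}(t)\) and the input \(\mathbf{I}(t)\), and the paper advances the state with a fused explicit-implicit Euler step rather than a generic ODE solver. The NumPy sketch below illustrates that fused update on a toy sequence. It is a minimal sketch, not the authors' released implementation: the gate \(f = \tanh(W_{in} I + W_{rec} x + b)\), the random initialization, the step size `dt`, and the number of solver sub-steps per input frame are assumptions chosen for illustration.

```python
import numpy as np

def ltc_cell_step(x, I, params, dt=0.1):
    """One fused explicit-implicit Euler step of a Liquid Time-Constant cell.

    Implements x(t+dt) = (x + dt * f * A) / (1 + dt * (1/tau + f)),
    where the nonlinear gate f modulates each neuron's time constant.
    (Sketch only; the gate's form and shapes are illustrative assumptions.)
    """
    W_in, W_rec, b, tau, A = params
    f = np.tanh(I @ W_in + x @ W_rec + b)          # nonlinear interlinked gate
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

def run_ltc(inputs, hidden_size, rng, dt=0.1, unfolds=6):
    """Roll an LTC cell over a sequence; returns the hidden state per step."""
    in_size = inputs.shape[-1]
    params = (
        rng.normal(0.0, 0.1, (in_size, hidden_size)),      # W_in
        rng.normal(0.0, 0.1, (hidden_size, hidden_size)),  # W_rec
        np.zeros(hidden_size),                             # b
        np.ones(hidden_size),                              # tau > 0
        rng.normal(0.0, 1.0, hidden_size),                 # A (state bias)
    )
    x = np.zeros(hidden_size)
    states = []
    for I in inputs:
        for _ in range(unfolds):   # several solver sub-steps per input frame
            x = ltc_cell_step(x, I, params, dt)
        states.append(x.copy())
    return np.stack(states)

rng = np.random.default_rng(0)
seq = rng.normal(size=(20, 3))     # toy sequence: 20 steps, 3 input features
print(run_ltc(seq, hidden_size=8, rng=rng).shape)  # -> (20, 8)
```

Note that the denominator of the fused step stays positive for any bounded gate output, which is one way to see the stable, bounded behavior the abstract claims.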