Stochastic differential equations
Maxima and minima
Applied mathematics
Brownian motion
Computer science
Limit (mathematics)
Solver
Differential equation
Mathematics
Statistical physics
Mathematical optimization
Mathematical analysis
Physics
Statistics
Authors
Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons
Source
Journal: Cornell University - arXiv
Date: 2021-01-01
Citations: 25
Identifier
DOI: 10.48550/arxiv.2102.03657
Abstract
Stochastic differential equations (SDEs) are a staple of mathematical modelling of temporal dynamics. However, a fundamental limitation has been that such models have typically been relatively inflexible, which recent work introducing Neural SDEs has sought to solve. Here, we show that the current classical approach to fitting SDEs may be approached as a special case of (Wasserstein) GANs, and in doing so the neural and classical regimes may be brought together. The input noise is Brownian motion, the output samples are time-evolving paths produced by a numerical solver, and by parameterising a discriminator as a Neural Controlled Differential Equation (CDE), we obtain Neural SDEs as (in modern machine learning parlance) continuous-time generative time series models. Unlike previous work on this problem, this is a direct extension of the classical approach without reference to either prespecified statistics or density functions. Arbitrary drift and diffusions are admissible, and so, as the Wasserstein loss has a unique global minimum, in the infinite data limit any SDE may be learnt. Example code has been made available as part of the torchsde repository.
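To make the generator side of this setup concrete, below is a minimal sketch (not the authors' reference implementation) of sampling time-evolving paths from a Neural SDE: the drift f and diffusion g are small neural networks, and paths are produced by driving a numerical solver with Brownian motion via the public torchsde.sdeint interface. The network sizes, state/noise dimensions, and initial condition are illustrative assumptions; the paper's Neural CDE discriminator and Wasserstein training loop are omitted here.

# Generator half of a Neural SDE GAN: a learnable SDE sampled with torchsde.
# Drift and diffusion are arbitrary neural networks, as the abstract notes.
import torch
import torchsde

STATE_SIZE, BROWNIAN_SIZE, HIDDEN = 4, 3, 32  # illustrative sizes (assumption)


class GeneratorSDE(torch.nn.Module):
    noise_type = "general"  # diffusion returns a (batch, state, brownian) matrix
    sde_type = "ito"

    def __init__(self):
        super().__init__()
        self.drift_net = torch.nn.Sequential(
            torch.nn.Linear(STATE_SIZE, HIDDEN), torch.nn.Tanh(),
            torch.nn.Linear(HIDDEN, STATE_SIZE),
        )
        self.diffusion_net = torch.nn.Sequential(
            torch.nn.Linear(STATE_SIZE, HIDDEN), torch.nn.Tanh(),
            torch.nn.Linear(HIDDEN, STATE_SIZE * BROWNIAN_SIZE),
        )

    def f(self, t, y):  # drift: learnable vector field
        return self.drift_net(y)

    def g(self, t, y):  # diffusion: learnable matrix field
        return self.diffusion_net(y).view(y.size(0), STATE_SIZE, BROWNIAN_SIZE)


def sample_paths(sde, batch_size, ts):
    # Input noise is Brownian motion (generated inside the solver); the output
    # samples are the solution paths evaluated at the times ts.
    y0 = torch.randn(batch_size, STATE_SIZE)  # illustrative initial condition
    return torchsde.sdeint(sde, y0, ts)       # shape (len(ts), batch, state)


if __name__ == "__main__":
    generator = GeneratorSDE()
    ts = torch.linspace(0.0, 1.0, 20)
    paths = sample_paths(generator, batch_size=16, ts=ts)
    print(paths.shape)  # torch.Size([20, 16, 4])

In the paper, paths sampled this way are scored by a Neural CDE discriminator and the generator is trained against the Wasserstein loss, with gradients taken through the solver; this sketch only shows the sampling step.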