Partial differential equation
Operator (biology)
Artificial neural network
Mathematics
Euclidean space
Kernel (algebra)
Function space
Burgers' equation
Fourier transform
Parametric statistics
Fourier integral operator
Applied mathematics
Mathematical analysis
Computer science
Operator theory
Artificial intelligence
Pure mathematics
Statistics
Repressor
Chemistry
Gene
Transcription factor
Biochemistry
Authors
Zongyi Li,Nikola B. Kovachki,Kamyar Azizzadenesheli,Burigede Liu,Kaushik Bhattacharya,Andrew M. Stuart,Anima Anandkumar
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Citations: 736
Identifier
DOI:10.48550/arxiv.2010.08895
Abstract
The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural operators that learn mappings between function spaces. For partial differential equations (PDEs), neural operators directly learn the mapping from any functional parametric dependence to the solution. Thus, they learn an entire family of PDEs, in contrast to classical methods, which solve one instance of the equation. In this work, we formulate a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture. We perform experiments on Burgers' equation, Darcy flow, and the Navier-Stokes equation. The Fourier neural operator is the first ML-based method to successfully model turbulent flows with zero-shot super-resolution. It is up to three orders of magnitude faster than traditional PDE solvers. Additionally, it achieves superior accuracy compared to previous learning-based solvers at a fixed resolution.
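Parameterizing the integral kernel in Fourier space amounts to three steps per operator layer: transform the input function (sampled on a grid) to the frequency domain with an FFT, multiply a truncated set of low-frequency modes by learned complex weights, and transform back. The following is a minimal sketch of such a spectral-convolution layer, assuming PyTorch; the class name SpectralConv1d, the weight initialization, and the tensor shapes are illustrative rather than the authors' exact implementation.

import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Illustrative 1D spectral convolution: FFT -> truncate modes -> learned complex multiply -> inverse FFT."""

    def __init__(self, in_channels, out_channels, modes):
        super().__init__()
        self.modes = modes  # number of low Fourier modes kept (must not exceed n_grid // 2 + 1)
        scale = 1.0 / (in_channels * out_channels)
        # Learned complex weights applied mode-by-mode to the retained frequencies
        self.weights = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x):
        # x: (batch, in_channels, n_grid_points) -- real-valued samples of the input function
        x_ft = torch.fft.rfft(x)  # Fourier coefficients along the spatial dimension
        out_ft = torch.zeros(
            x.size(0), self.weights.size(1), x_ft.size(-1),
            dtype=torch.cfloat, device=x.device,
        )
        # Linear transform on the lowest `modes` frequencies; higher modes are zeroed out
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weights
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))  # back to physical space on the same grid

In the full architecture described in the paper, several such Fourier layers are stacked, each combined with a pointwise linear transform and a nonlinearity; because the learned weights act on Fourier modes rather than on grid points, the trained model can be evaluated on a finer grid than it was trained on, which is the basis of the zero-shot super-resolution result.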