Neural Operator: Graph Kernel Network for Partial Differential Equations

Keywords
Euclidean space, Mathematics, Kernel (algebra), Discretization, Partial differential equation, Artificial neural network, Nonlinear system, Operator (mathematics), Graph, Applied mathematics, Computer science, Algebra over a field, Discrete mathematics, Mathematical analysis, Pure mathematics, Artificial intelligence
Authors
Zongyi Li, Nikola B. Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew M. Stuart, Anima Anandkumar
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Cited by: 130
Identifier
DOI: 10.48550/arxiv.2003.03485
Abstract
The classical development of neural networks has been primarily for mappings between a finite-dimensional Euclidean space and a set of classes, or between two finite-dimensional Euclidean spaces. The purpose of this work is to generalize neural networks so that they can learn mappings between infinite-dimensional spaces (operators). The key innovation in our work is that a single set of network parameters, within a carefully designed network architecture, may be used to describe mappings between infinite-dimensional spaces and between different finite-dimensional approximations of those spaces. We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators. The kernel integration is computed by message passing on graph networks. This approach has substantial practical consequences, which we illustrate in the context of mappings from the input data of partial differential equations (PDEs) to their solutions. In this context, such learned networks can generalize among different approximation methods for the PDE (such as finite difference or finite element methods) and among approximations corresponding to different underlying levels of resolution and discretization. Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to state-of-the-art solvers.
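As a rough sketch of the core idea in the abstract (a kernel integral operator approximated by message passing on a graph), the NumPy code below implements one layer of an update of the form v_{t+1}(x_i) = sigma( W v_t(x_i) + mean_{j in N(i)} kappa(x_i, x_j, a(x_i), a(x_j)) v_t(x_j) ), where the neighborhood average stands in for the integral. The layer width, the two-layer form of the kernel network kappa, the ReLU activation, and the k-nearest-neighbor graph are all illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 16                 # width of the node representation v(x) (assumed)
d = 2                  # spatial dimension of the domain
edge_dim = 2 * d + 2   # edge feature (x_i, x_j, a(x_i), a(x_j)) with scalar a

# Parameters: a pointwise linear map W and a small two-layer MLP for kappa.
W = rng.normal(scale=0.1, size=(n, n))
K1 = rng.normal(scale=0.1, size=(64, edge_dim))
K2 = rng.normal(scale=0.1, size=(n * n, 64))

def kappa(e):
    """Map an edge feature vector to an n x n kernel matrix."""
    h = np.tanh(K1 @ e)
    return (K2 @ h).reshape(n, n)

def layer(v, coords, a, neighbors):
    """One message-passing step approximating the kernel integral operator."""
    out = np.empty_like(v)
    for i, nbrs in enumerate(neighbors):
        # Quadrature-style approximation of
        #   \int kappa(x_i, y, a(x_i), a(y)) v(y) dy
        # by averaging over the graph neighborhood of node i.
        msgs = [kappa(np.concatenate([coords[i], coords[j], [a[i]], [a[j]]])) @ v[j]
                for j in nbrs]
        out[i] = np.maximum(W @ v[i] + np.mean(msgs, axis=0), 0.0)  # ReLU
    return out

# Toy usage: 50 nodes sampled from the unit square, each connected to its
# 8 nearest neighbors (a stand-in for a radius-based neighborhood graph).
N = 50
coords = rng.random((N, d))
a = rng.random(N)                 # discretized input function a(x)
v = rng.normal(size=(N, n))       # initial node representation
dists = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
neighbors = [np.argsort(dists[i])[1:9] for i in range(N)]

v = layer(v, coords, a, neighbors)
print(v.shape)  # (50, 16)
```

Because W and kappa act on coordinates and neighborhoods rather than on a fixed grid, the same parameters can be reused at a different resolution by rebuilding the graph over the new sample points, which is the discretization-invariance property the abstract describes.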