Sequence (biology)
Artificial neural network
Partial differential equation
Convergence (economics)
Applied mathematics
Elliptic partial differential equation
Function (biology)
Mathematics
Order (exchange)
Parabolic partial differential equation
Computer science
Mathematical analysis
Artificial intelligence
Economic growth
Genetics
Evolutionary biology
Biology
Economics
Finance
Source
Journal: Communications in Computational Physics
[Global Science Press]
Date: 2020-04-03
Volume/Issue: 28 (5): 2042-2074
Citations: 79
Identifiers
DOI: 10.4208/cicp.oa-2020-0193
Abstract
Physics-informed neural networks (PINNs) are deep-learning-based techniques for solving partial differential equations (PDEs) encountered in computational science and engineering. Guided by data and physical laws, PINNs seek a neural network that approximates the solution to a system of PDEs. Such a network is obtained by minimizing a loss function in which any prior knowledge of the PDEs and the data is encoded. Despite remarkable empirical success on one-, two-, and three-dimensional problems, there is little theoretical justification for PINNs. As the amount of data grows, PINNs generate a sequence of minimizers, which correspond to a sequence of neural networks. We address the question: does this sequence of minimizers converge to the solution of the PDE? We consider two classes of PDEs: linear second-order elliptic and parabolic equations. By adapting the Schauder approach and the maximum principle, we show that the sequence of minimizers converges strongly to the PDE solution in $C^0$. Furthermore, we show that if each minimizer satisfies the initial/boundary conditions, the mode of convergence improves to $H^1$. Computational examples are provided to illustrate our theoretical findings. To the best of our knowledge, this is the first theoretical work that establishes the consistency of PINNs.
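To make the loss-minimization idea in the abstract concrete, here is a minimal sketch of a PINN in PyTorch for a 1D Poisson problem. The problem, network architecture, and weighting of the residual and boundary terms are illustrative assumptions, not taken from the paper; the paper's analysis concerns the convergence of such minimizers as the number of training points grows.

```python
import torch

# Illustrative PINN for -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0.
# The exact solution u(x) = sin(pi x) is manufactured via the source term f.
torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def f(x):
    # Source term chosen so that u(x) = sin(pi x) solves the problem.
    return (torch.pi ** 2) * torch.sin(torch.pi * x)

def pinn_loss(n_interior=64):
    # Interior collocation points where the PDE residual is penalized.
    x = torch.rand(n_interior, 1, requires_grad=True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    residual = -d2u - f(x)
    # Boundary points enforcing u(0) = u(1) = 0 via a soft penalty.
    xb = torch.tensor([[0.0], [1.0]])
    return (residual ** 2).mean() + (net(xb) ** 2).mean()

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss = pinn_loss()
    loss.backward()
    opt.step()
```

The loss combines the PDE residual on randomly sampled collocation points with a penalty on the boundary conditions; when the boundary conditions are enforced exactly rather than penalized, the abstract's stronger $H^1$ convergence mode applies.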