NSFnets (Navier-Stokes flow nets): Physics-informed neural networks for the incompressible Navier-Stokes equations

Keywords: Navier-Stokes equations; artificial neural networks; incompressible flow; compressibility; Stokes flow; applied mathematics; pressure-correction method; Hagen-Poiseuille flow (from the Navier-Stokes equations); flow (mathematics); mathematics; classical mechanics; computer science; mathematical analysis; physics; mechanics; artificial intelligence
Authors
Xiaowei Jin,Shengze Cai,Hui Li,George Em Karniadakis
Source
Journal: Journal of Computational Physics [Elsevier]
Volume: 426, Article 109951. Cited by: 649
Identifier
DOI:10.1016/j.jcp.2020.109951
Abstract

In the last 50 years there has been tremendous progress in solving the Navier-Stokes equations numerically using finite-difference, finite-element, spectral, and even meshless methods. Yet, in many real cases, we still cannot seamlessly incorporate (multi-fidelity) data into existing algorithms, and for industrial-complexity applications mesh generation is time-consuming and still an art. Moreover, solving ill-posed problems (e.g., lacking boundary conditions) or inverse problems is often prohibitively expensive and requires different formulations and new computer codes. Here, we employ physics-informed neural networks (PINNs), encoding the governing equations directly into the deep neural network via automatic differentiation, to overcome some of the aforementioned limitations for simulating incompressible laminar and turbulent flows. We develop the Navier-Stokes flow nets (NSFnets) by considering two different mathematical formulations of the Navier-Stokes equations: the velocity-pressure (VP) formulation and the vorticity-velocity (VV) formulation. Since this is a new approach, we first select some standard benchmark problems to assess the accuracy, convergence rate, computational cost, and flexibility of NSFnets; analytical solutions and direct numerical simulation (DNS) databases provide proper initial and boundary conditions for the NSFnet simulations. The spatial and temporal coordinates are the inputs of the NSFnets, while the instantaneous velocity and pressure fields are the outputs of the VP-NSFnet, and the instantaneous velocity and vorticity fields are the outputs of the VV-NSFnet. This is unsupervised learning and, hence, no labeled data are required beyond the boundary and initial conditions and the fluid properties. The residuals of the VP or VV governing equations, together with the initial and boundary conditions, are embedded into the loss function of the NSFnets. No pressure data are provided to the VP-NSFnet; the pressure is a hidden state and is obtained via the incompressibility constraint without extra computational cost. Unlike traditional numerical methods, NSFnets inherit the properties of neural networks (NNs); hence, the total error is composed of the approximation, optimization, and generalization errors. Here, we empirically attempt to quantify these errors by varying the sampling ("residual") points, the iterative solvers, and the size of the NN architecture. For the laminar flow solutions, we show that the VP and VV formulations are comparable in accuracy, but their best performance corresponds to different NN architectures. The initial convergence rate is fast, but the error eventually saturates to a plateau due to the dominance of the optimization error. For the turbulent channel flow, we show that NSFnets can sustain turbulence at Reτ ∼ 1,000, but because training is expensive, we consider only part of the channel domain and enforce velocity boundary conditions on the subdomain boundaries provided by the DNS database. We also perform a systematic study of the weights used in the loss function for balancing the data and physics components, and investigate a new way of computing the weights dynamically to accelerate training and enhance accuracy. In the last part, we demonstrate how NSFnets should be used in practice, namely for ill-posed problems with incomplete or noisy boundary conditions as well as for inverse problems. We obtain reasonably accurate solutions for such cases as well, without the need to change the NSFnets and at the same computational cost as for the forward well-posed problems. We also present a simple example of transfer learning that helps accelerate the training of NSFnets for different parameter settings.
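The physics-informed loss described in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical example (not the authors' implementation) of assembling a VP-formulation loss for a 2D incompressible flow with PyTorch automatic differentiation; the network `net`, viscosity `nu`, weight `alpha`, and the sampling tensors are assumed names introduced only for illustration.

```python
# Minimal sketch (assumed setup, not the authors' code) of a VP-NSFnet-style loss.
# A fully connected network `net` maps (x, y, t) -> (u, v, p) for a 2D incompressible flow.
import torch

def gradients(f, x):
    """First derivatives of scalar field f with respect to the coordinate tensor x via autograd."""
    return torch.autograd.grad(f, x, grad_outputs=torch.ones_like(f), create_graph=True)[0]

def vp_residuals(net, xyt, nu):
    """Residuals of the 2D velocity-pressure Navier-Stokes equations at collocation points."""
    xyt = xyt.clone().requires_grad_(True)
    u, v, p = net(xyt).unbind(dim=1)

    du = gradients(u, xyt)              # columns: du/dx, du/dy, du/dt
    dv = gradients(v, xyt)
    dp = gradients(p, xyt)

    u_x, u_y, u_t = du[:, 0], du[:, 1], du[:, 2]
    v_x, v_y, v_t = dv[:, 0], dv[:, 1], dv[:, 2]

    u_xx = gradients(u_x, xyt)[:, 0]
    u_yy = gradients(u_y, xyt)[:, 1]
    v_xx = gradients(v_x, xyt)[:, 0]
    v_yy = gradients(v_y, xyt)[:, 1]

    # Momentum and continuity residuals; the pressure enters only through its gradient,
    # which is why no pressure data are needed (it is a hidden state).
    e1 = u_t + u * u_x + v * u_y + dp[:, 0] - nu * (u_xx + u_yy)
    e2 = v_t + u * v_x + v * v_y + dp[:, 1] - nu * (v_xx + v_yy)
    e3 = u_x + v_y
    return e1, e2, e3

def nsfnet_loss(net, xyt_res, xyt_bc, uv_bc, nu, alpha=1.0):
    """Total loss: alpha * (boundary/initial data mismatch) + mean-squared equation residuals."""
    e1, e2, e3 = vp_residuals(net, xyt_res, nu)
    loss_eq = (e1**2).mean() + (e2**2).mean() + (e3**2).mean()
    uv_pred = net(xyt_bc)[:, :2]
    loss_bc = ((uv_pred - uv_bc)**2).mean()
    return alpha * loss_bc + loss_eq
```

The scalar `alpha` plays the role of the weight balancing the data and physics components of the loss; the dynamic weighting mentioned in the abstract would update such a weight during training rather than fixing it in advance.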