NSFnets (Navier-Stokes flow nets): Physics-informed neural networks for the incompressible Navier-Stokes equations

Navier-Stokes equations, artificial neural networks, incompressible flow, compressibility, applied mathematics, pressure-correction method, Hagen-Poiseuille flow from the Navier-Stokes equations, flow (mathematics), mathematics, classical mechanics, computer science, physics, mechanics, artificial intelligence
Authors
Xiaowei Jin, Shengze Cai, Hui Li, George Em Karniadakis
Source
Journal: Journal of Computational Physics [Elsevier]
Volume 426, Article 109951; Cited by: 963
Identifier
DOI: 10.1016/j.jcp.2020.109951
Abstract

In the last 50 years there has been tremendous progress in solving the Navier-Stokes equations numerically using finite-difference, finite-element, spectral, and even meshless methods. Yet, in many real cases, we still cannot incorporate seamlessly (multi-fidelity) data into existing algorithms, and for industrial-complexity applications mesh generation is time-consuming and still an art. Moreover, solving ill-posed problems (e.g., lacking boundary conditions) or inverse problems is often prohibitively expensive and requires different formulations and new computer codes. Here, we employ physics-informed neural networks (PINNs), encoding the governing equations directly into the deep neural network via automatic differentiation, to overcome some of the aforementioned limitations for simulating incompressible laminar and turbulent flows. We develop the Navier-Stokes flow nets (NSFnets) by considering two different mathematical formulations of the Navier-Stokes equations: the velocity-pressure (VP) formulation and the vorticity-velocity (VV) formulation. Since this is a new approach, we first select some standard benchmark problems to assess the accuracy, convergence rate, computational cost, and flexibility of NSFnets; analytical solutions and direct numerical simulation (DNS) databases provide proper initial and boundary conditions for the NSFnet simulations. The spatial and temporal coordinates are the inputs of the NSFnets, while the instantaneous velocity and pressure fields are the outputs for the VP-NSFnet, and the instantaneous velocity and vorticity fields are the outputs for the VV-NSFnet. This is unsupervised learning and, hence, no labeled data are required beyond the boundary and initial conditions and the fluid properties. The residuals of the VP or VV governing equations, together with the initial and boundary conditions, are embedded into the loss function of the NSFnets. No data are provided for the pressure in the VP-NSFnet; the pressure is a hidden state obtained via the incompressibility constraint at no extra computational cost. Unlike traditional numerical methods, NSFnets inherit the properties of neural networks (NNs), hence the total error is composed of the approximation, optimization, and generalization errors. Here, we empirically attempt to quantify these errors by varying the sampling ("residual") points, the iterative solvers, and the size of the NN architecture. For the laminar flow solutions, we show that the VP and VV formulations are comparable in accuracy, but their best performance corresponds to different NN architectures. The initial convergence rate is fast, but the error eventually saturates to a plateau due to the dominance of the optimization error. For the turbulent channel flow, we show that NSFnets can sustain turbulence at Reτ ∼ 1,000, but due to expensive training we only consider part of the channel domain and enforce velocity boundary conditions on the subdomain boundaries provided by the DNS database. We also perform a systematic study of the weights used in the loss function for balancing the data and physics components, and investigate a new way of computing these weights dynamically to accelerate training and enhance accuracy. In the last part, we demonstrate how NSFnets should be used in practice, namely for ill-posed problems with incomplete or noisy boundary conditions as well as for inverse problems. We obtain reasonably accurate solutions for such cases as well, without the need to change the NSFnets and at the same computational cost as in the forward well-posed problems. We also present a simple example of transfer learning that will aid in accelerating the training of NSFnets for different parameter settings.
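As context for the loss described in the abstract, the VP formulation constrains the momentum equations, ∂u/∂t + (u·∇)u = −∇p + (1/Re)∇²u, together with the continuity equation ∇·u = 0. The sketch below is a minimal, hedged illustration of how a VP-NSFnet-style loss can be assembled with automatic differentiation; it is written in PyTorch as an assumed framework choice, uses a small 2-D network mapping (t, x, y) to (u, v, p) with placeholder width, depth, viscosity, and fixed loss weights alpha and beta, and does not reproduce the paper's actual architectures or its dynamic weighting scheme.

# Minimal sketch of a VP-NSFnet-style PINN loss (assumed PyTorch implementation;
# network sizes, viscosity, and the fixed weights alpha/beta are illustrative only).
import torch
import torch.nn as nn

class VPNSFnet(nn.Module):
    """Fully connected network mapping (t, x, y) -> (u, v, p) for a 2-D flow."""
    def __init__(self, width=50, depth=6):
        super().__init__()
        layers, in_dim = [], 3
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers.append(nn.Linear(width, 3))
        self.net = nn.Sequential(*layers)

    def forward(self, txy):
        return self.net(txy)

def pde_residuals(model, txy, nu):
    """Momentum and continuity residuals of the VP Navier-Stokes equations,
    computed with automatic differentiation at the given residual points."""
    txy = txy.clone().requires_grad_(True)
    u, v, p = model(txy).split(1, dim=1)

    def grad(f):
        return torch.autograd.grad(f, txy, torch.ones_like(f), create_graph=True)[0]

    gu, gv, gp = grad(u), grad(v), grad(p)
    u_t, u_x, u_y = gu[:, 0:1], gu[:, 1:2], gu[:, 2:3]
    v_t, v_x, v_y = gv[:, 0:1], gv[:, 1:2], gv[:, 2:3]
    p_x, p_y = gp[:, 1:2], gp[:, 2:3]
    u_xx, u_yy = grad(u_x)[:, 1:2], grad(u_y)[:, 2:3]
    v_xx, v_yy = grad(v_x)[:, 1:2], grad(v_y)[:, 2:3]

    f_u = u_t + u * u_x + v * u_y + p_x - nu * (u_xx + u_yy)   # x-momentum
    f_v = v_t + u * v_x + v * v_y + p_y - nu * (v_xx + v_yy)   # y-momentum
    f_c = u_x + v_y                                            # continuity
    return f_u, f_v, f_c

def nsfnet_loss(model, txy_res, txy_data, uv_data, nu=0.01, alpha=1.0, beta=1.0):
    """Weighted sum of the boundary/initial-condition data loss and the
    equation-residual loss; alpha and beta are the balancing weights the paper
    studies (kept fixed here for simplicity, not computed dynamically)."""
    f_u, f_v, f_c = pde_residuals(model, txy_res, nu)
    loss_e = (f_u ** 2 + f_v ** 2 + f_c ** 2).mean()
    uv_pred = model(txy_data)[:, 0:2]          # no pressure data is used
    loss_d = ((uv_pred - uv_data) ** 2).mean()
    return alpha * loss_d + beta * loss_e

For the transfer-learning example mentioned at the end of the abstract, one would typically train such a model at one parameter setting, save its state with model.state_dict(), and load those weights as the initialization when training at a nearby setting, rather than starting again from random initialization.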