
NSFnets (Navier-Stokes flow nets): Physics-informed neural networks for the incompressible Navier-Stokes equations

Keywords: Navier-Stokes equations; artificial neural networks; incompressible flow; compressibility; applied mathematics; pressure-correction method; Hagen-Poiseuille flow; flow (mathematics); mathematics; classical mechanics; computer science; physics; mechanics; artificial intelligence
Authors
Xiaowei Jin, Shengze Cai, Hui Li, George Em Karniadakis
Source
Journal: Journal of Computational Physics [Elsevier]
Volume: 426, Article No. 109951; Cited by: 992
Identifier
DOI: 10.1016/j.jcp.2020.109951
Abstract

In the last 50 years there has been tremendous progress in numerically solving the Navier-Stokes equations using finite difference, finite element, spectral, and even meshless methods. Yet, in many real cases, we still cannot seamlessly incorporate (multi-fidelity) data into existing algorithms, and for industrial-complexity applications mesh generation is time-consuming and still an art. Moreover, solving ill-posed problems (e.g., lacking boundary conditions) or inverse problems is often prohibitively expensive and requires different formulations and new computer codes. Here, we employ physics-informed neural networks (PINNs), encoding the governing equations directly into the deep neural network via automatic differentiation, to overcome some of the aforementioned limitations for simulating incompressible laminar and turbulent flows. We develop the Navier-Stokes flow nets (NSFnets) by considering two different mathematical formulations of the Navier-Stokes equations: the velocity-pressure (VP) formulation and the vorticity-velocity (VV) formulation. Since this is a new approach, we first select some standard benchmark problems to assess the accuracy, convergence rate, computational cost, and flexibility of NSFnets; analytical solutions and direct numerical simulation (DNS) databases provide proper initial and boundary conditions for the NSFnet simulations. The spatial and temporal coordinates are the inputs of the NSFnets, while the instantaneous velocity and pressure fields are the outputs for the VP-NSFnet, and the instantaneous velocity and vorticity fields are the outputs for the VV-NSFnet. This is unsupervised learning and, hence, no labeled data are required beyond the boundary and initial conditions and the fluid properties. The residuals of the VP or VV governing equations, together with the initial and boundary conditions, are embedded into the loss function of the NSFnets. No pressure data are provided to the VP-NSFnet; the pressure is a hidden state and is obtained via the incompressibility constraint without extra computational cost. Unlike traditional numerical methods, NSFnets inherit the properties of neural networks (NNs), hence the total error is composed of the approximation, optimization, and generalization errors. Here, we empirically attempt to quantify these errors by varying the sampling ("residual") points, the iterative solvers, and the size of the NN architecture. For the laminar flow solutions, we show that the VP and VV formulations are comparable in accuracy, but their best performance corresponds to different NN architectures. The initial convergence rate is fast, but the error eventually saturates to a plateau due to the dominance of the optimization error. For the turbulent channel flow, we show that NSFnets can sustain turbulence at Re_τ ∼ 1,000, but due to expensive training we consider only part of the channel domain and enforce velocity boundary conditions on the subdomain boundaries provided by the DNS database. We also perform a systematic study of the weights used in the loss function for balancing the data and physics components, and investigate a new way of computing the weights dynamically to accelerate training and enhance accuracy. In the last part, we demonstrate how NSFnets should be used in practice, namely for ill-posed problems with incomplete or noisy boundary conditions as well as for inverse problems.
We obtain reasonably accurate solutions for such cases as well without the need to change the NSFnets and at the same computational cost as in the forward well-posed problems. We also present a simple example of transfer learning that will aid in accelerating the training of NSFnets for different parameter settings.
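
To make the VP formulation described in the abstract concrete, the equations the VP-NSFnet enforces, and the composite loss it minimizes, can be written in non-dimensional form as follows (the weight λ on the data term is the quantity the authors tune, either as a fixed value or dynamically during training):

```latex
% VP (velocity-pressure) formulation of the incompressible Navier-Stokes equations
\partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\nabla p + \tfrac{1}{Re}\,\nabla^{2}\mathbf{u},
\qquad
\nabla \cdot \mathbf{u} = 0.

% Composite PINN loss: weighted data term (initial/boundary velocities)
% plus mean-squared PDE residuals at the sampling ("residual") points
\mathcal{L} = \lambda\,\mathcal{L}_{\mathrm{data}} + \mathcal{L}_{\mathrm{residual}}
```

The sketch below shows one way such a loss can be assembled with automatic differentiation for a 2D flow. It is only a minimal illustration of the idea, not the authors' released code: PyTorch, the network size, and all names (VPNSFnet, ns_residuals, nsfnet_loss, nu, lam) are assumptions made here for clarity. Consistent with the abstract, no pressure labels enter the loss; only the PDE residuals and velocity data on the initial/boundary points do.

```python
# Minimal sketch of a VP-NSFnet-style PINN loss for 2D incompressible flow.
# PyTorch is an assumption here; class and function names are illustrative.
import torch
import torch.nn as nn


class VPNSFnet(nn.Module):
    """Fully connected network mapping (x, y, t) -> (u, v, p)."""

    def __init__(self, width: int = 50, depth: int = 4):
        super().__init__()
        layers, in_dim = [], 3                      # inputs: x, y, t
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers += [nn.Linear(width, 3)]             # outputs: u, v, p
        self.net = nn.Sequential(*layers)

    def forward(self, x, y, t):
        return self.net(torch.cat([x, y, t], dim=1))


def ns_residuals(model, x, y, t, nu):
    """Momentum and continuity residuals of the VP formulation via autograd."""
    x, y, t = (c.clone().requires_grad_(True) for c in (x, y, t))
    out = model(x, y, t)
    u, v, p = out[:, 0:1], out[:, 1:2], out[:, 2:3]

    def grad(f, wrt):
        return torch.autograd.grad(f, wrt, torch.ones_like(f), create_graph=True)[0]

    u_x, u_y, u_t = grad(u, x), grad(u, y), grad(u, t)
    v_x, v_y, v_t = grad(v, x), grad(v, y), grad(v, t)
    p_x, p_y = grad(p, x), grad(p, y)
    u_xx, u_yy = grad(u_x, x), grad(u_y, y)
    v_xx, v_yy = grad(v_x, x), grad(v_y, y)

    e1 = u_t + u * u_x + v * u_y + p_x - nu * (u_xx + u_yy)   # x-momentum
    e2 = v_t + u * v_x + v * v_y + p_y - nu * (v_xx + v_yy)   # y-momentum
    e3 = u_x + v_y                                            # continuity
    return e1, e2, e3


def nsfnet_loss(model, bc, residual_pts, nu=0.01, lam=1.0):
    """Weighted velocity-data term (boundary/initial points) plus PDE residuals.

    No pressure labels are used: the pressure is a hidden state recovered
    through the incompressibility constraint, as in the VP-NSFnet.
    """
    xb, yb, tb, ub, vb = bc                         # labeled velocities
    pred = model(xb, yb, tb)
    loss_data = ((pred[:, 0:1] - ub) ** 2 + (pred[:, 1:2] - vb) ** 2).mean()
    e1, e2, e3 = ns_residuals(model, *residual_pts, nu)   # residual_pts = (x_f, y_f, t_f)
    loss_pde = (e1 ** 2).mean() + (e2 ** 2).mean() + (e3 ** 2).mean()
    return lam * loss_data + loss_pde
```

The weight lam corresponds to the λ studied in the paper for balancing the data and physics components; the sketch exposes it as a plain argument so that either a fixed value or a dynamically computed weight could be supplied during training.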