Keywords: Partial differential equations, Gradient descent, Flow (mathematics), Applied mathematics, Artificial neural networks, Mathematical optimization, Computer science, Algorithms, Physics, Mathematics, Mathematical analysis, Artificial intelligence, Mechanics
Authors
Xiaojian Li, Yuhao Liu, Zhengxian Liu
Source
Journal: Physics of Fluids
[American Institute of Physics]
Date: 2023-06-01
Volume/Issue: 35 (6)
Citations: 9
Abstract
Physics-informed neural networks (PINNs) are an emerging technique for solving partial differential equations (PDEs) in flow problems. Because of their low computational cost, gradient descent algorithms coupled with the weighted-objectives method are usually used to optimize the loss functions during PINN training. However, the interaction mechanisms between the gradients of the loss functions are not fully clarified, leading to poor performance in loss function optimization. To address this, an adaptive gradient descent algorithm (AGDA) is proposed based on an analysis of these interaction mechanisms and then validated on analytical PDEs and flow problems. First, the interaction mechanisms of the loss function gradients in PINN training with the traditional Adam optimizer are analyzed, and the main factors responsible for the Adam optimizer's poor performance are identified. Then, a new AGDA optimizer is developed for PINN training through two modifications: (1) balancing the magnitude differences between the loss function gradients and (2) eliminating conflicts between the gradient directions. Finally, three types of PDEs (elliptic, hyperbolic, and parabolic) and four viscous incompressible flow problems are selected to validate the proposed algorithm. It is found that, to reach a specified accuracy, the AGDA optimizer requires about 16%–90% of the training time of the Adam optimizer and 41%–64% of that of the PCGrad optimizer, and about 10%–68% of the iterations of the Adam optimizer and 38%–77% of those of the PCGrad optimizer. Therefore, the PINN method coupled with the AGDA optimizer is a more efficient and robust technique for solving partial differential equations in flow problems.
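The abstract attributes AGDA's gains to two modifications: balancing the magnitude differences between the loss function gradients and eliminating conflicts between their directions. The NumPy sketch below illustrates that general idea on toy gradient vectors only; the mean-norm rescaling rule, the PCGrad-style projection, and the function names (balance_magnitudes, remove_conflicts, combined_step) are illustrative assumptions, not the paper's actual update rule.

```python
import numpy as np

# Illustrative sketch (not the authors' AGDA algorithm): combine the gradients
# of several PINN loss terms (e.g., PDE residual and boundary-condition losses)
# by (1) rescaling them to comparable magnitudes and (2) projecting out
# pairwise conflicting components before a single descent step.

def balance_magnitudes(grads, eps=1e-12):
    """Rescale each per-loss gradient to the mean gradient norm (assumed rule)."""
    norms = [np.linalg.norm(g) for g in grads]
    target = np.mean(norms)
    return [g * (target / (n + eps)) for g, n in zip(grads, norms)]

def remove_conflicts(grads):
    """Remove the component of each gradient that opposes another gradient
    (a PCGrad-style projection, used here only as an illustration)."""
    out = []
    for i, g in enumerate(grads):
        g = g.copy()
        for j, h in enumerate(grads):
            if i == j:
                continue
            dot = np.dot(g, h)
            if dot < 0.0:  # directions conflict: project out the component along h
                g = g - dot / (np.dot(h, h) + 1e-12) * h
        out.append(g)
    return out

def combined_step(params, per_loss_grads, lr=1e-3):
    """One gradient-descent update using the balanced, de-conflicted gradients."""
    grads = remove_conflicts(balance_magnitudes(per_loss_grads))
    return params - lr * sum(grads)

# Toy example: two loss gradients of very different scale and partly opposed direction.
params = np.zeros(3)
g_pde = np.array([1.0, -2.0, 0.5]) * 100.0   # large-magnitude PDE residual gradient
g_bc = np.array([-1.0, 0.5, 1.0])            # small boundary-condition gradient
params = combined_step(params, [g_pde, g_bc])
print(params)
```

In an actual PINN, each per-loss gradient would be obtained by backpropagating one loss term (PDE residual, boundary condition, or initial condition) through the same network parameters before the combined update is applied.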