Computer Science
Artificial Neural Network
Transformer
Flexibility (Engineering)
Deep Learning
Artificial Intelligence
Partial Differential Equation
Generalization
Deep Neural Network
Perceptron
Algorithm
Mathematics
Engineering
Mathematical Analysis
Statistics
Voltage
Electrical Engineering
Authors
Leo Zhiyuan Zhao, Xueying Ding, B. Aditya Prakash
Source
Journal: Cornell University - arXiv
Date: 2023-01-01
Citations: 6
Identifier
DOI: 10.48550/arxiv.2307.11833
Abstract
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs). However, conventional PINNs, relying on multilayer perceptrons (MLPs), neglect the crucial temporal dependencies inherent in practical physics systems and thus fail to propagate the initial condition constraints globally and accurately capture the true solutions under various scenarios. In this paper, we introduce a novel Transformer-based framework, termed PINNsFormer, designed to address this limitation. PINNsFormer can accurately approximate PDE solutions by utilizing multi-head attention mechanisms to capture temporal dependencies. PINNsFormer transforms point-wise inputs into pseudo sequences and replaces the point-wise PINNs loss with a sequential loss. Additionally, it incorporates a novel activation function, Wavelet, which anticipates the Fourier decomposition through deep neural networks. Empirical results demonstrate that PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs. Moreover, PINNsFormer offers flexibility in integrating existing learning schemes for PINNs, further enhancing its performance.
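The two mechanisms named in the abstract lend themselves to a short illustration. Below is a minimal PyTorch sketch, assuming the Wavelet activation is a learnable sine-cosine mix (consistent with the abstract's description of anticipating a Fourier decomposition) and assuming pseudo sequences are built by replicating each collocation point along small time offsets. The names `to_pseudo_sequence`, `k`, and `dt` are illustrative choices, not taken from the paper.

```python
import torch
import torch.nn as nn

class Wavelet(nn.Module):
    """Hypothetical sketch of the Wavelet activation: a learnable
    sine-cosine mix, i.e. a first-order Fourier-style expansion."""
    def __init__(self):
        super().__init__()
        self.w1 = nn.Parameter(torch.ones(1))  # weight of the sine term
        self.w2 = nn.Parameter(torch.ones(1))  # weight of the cosine term

    def forward(self, x):
        return self.w1 * torch.sin(x) + self.w2 * torch.cos(x)

def to_pseudo_sequence(xt, k=5, dt=1e-3):
    """Expand point-wise inputs (x, t) into pseudo sequences
    [(x, t), (x, t+dt), ..., (x, t+(k-1)dt)] so that attention layers
    can capture temporal dependencies. k and dt are illustrative
    hyperparameters, not values from the paper.

    xt: (batch, 2) tensor with columns (x, t)
    returns: (batch, k, 2) tensor
    """
    x, t = xt[:, :1], xt[:, 1:]                     # (batch, 1) each
    offsets = dt * torch.arange(k, dtype=xt.dtype)  # (k,) time shifts
    t_seq = t + offsets                             # (batch, k) by broadcasting
    x_seq = x.expand(-1, k)                         # (batch, k), x repeated
    return torch.stack([x_seq, t_seq], dim=-1)      # (batch, k, 2)

# Usage: turn random collocation points into sequences, apply Wavelet.
pts = torch.rand(4, 2, requires_grad=True)  # 4 random (x, t) points
seq = to_pseudo_sequence(pts)               # shape (4, 5, 2)
print(Wavelet()(seq).shape)                 # torch.Size([4, 5, 2])
```

Under these assumptions, the sequential loss the abstract mentions would be evaluated over every step of each pseudo sequence rather than at isolated collocation points, which is how the initial condition information can propagate forward in time through the attention mechanism.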