Keywords
Hash function, artificial neural network, computer science, encoding (memory), hyperparameter, robustness (evolution), theoretical computer science, hash table, automatic differentiation, artificial intelligence, algorithm, biochemistry, chemistry, computer security, computing, gene
Authors
Xinquan Huang,Tariq Alkhalifah
Identifier
DOI: 10.1016/j.jcp.2024.112760
Abstract
Physics-informed neural networks (PINNs) have attracted much attention in scientific computing, as their functional representation of partial differential equation (PDE) solutions offers flexibility and accuracy. However, their training cost has limited their practical use as a real alternative to classic numerical methods. We therefore propose to incorporate multi-resolution hash encoding into PINNs to improve training efficiency, as such encoding offers a locally aware (multi-resolution) coordinate input to the neural network. Borrowing this encoding from the neural radiance field (NeRF) community, we investigate the robustness of calculating the derivatives of such hash-encoded neural networks with respect to the input coordinates, which the PINN loss terms often require. To address the discontinuous nature of these derivatives, we propose replacing automatic differentiation with finite-difference calculations of the derivatives. We also share appropriate ranges for the hash encoding hyperparameters that yield robust derivatives. We test the proposed method on several benchmark problems, where it achieves about a 10-fold improvement in efficiency over the vanilla PINN implementation.
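The two ideas in the abstract can be illustrated with a minimal sketch: a toy 1-D multi-resolution hash encoder (each level maps a coordinate onto a finer grid, hashes the surrounding vertices into a feature table, and linearly interpolates), and central finite differences standing in for automatic differentiation of the PDE-loss derivatives. All names, sizes, and the random feature tables below are illustrative assumptions, not the paper's exact configuration.

```python
import math
import random

def hash_encode(x, n_levels=4, n_features=2, table_size=1024,
                base_res=4, growth=2.0, seed=0):
    """Toy multi-resolution hash encoding of a scalar coordinate x in [0, 1].

    Per level: place x on a grid of increasing resolution, hash the two
    neighboring grid vertices into a feature table (modulo hashing here),
    and linearly interpolate their feature vectors. In a real PINN these
    tables are trainable parameters; here they are fixed random values.
    """
    rnd = random.Random(seed)
    tables = [[[rnd.gauss(0.0, 0.01) for _ in range(n_features)]
               for _ in range(table_size)]
              for _ in range(n_levels)]
    feats = []
    for lvl in range(n_levels):
        res = int(base_res * growth ** lvl)   # grid resolution at this level
        pos = x * res
        i0 = int(math.floor(pos)) % table_size  # left vertex, hashed
        i1 = (i0 + 1) % table_size              # right vertex, hashed
        w = pos - math.floor(pos)               # linear interpolation weight
        t = tables[lvl]
        feats.extend((1 - w) * t[i0][k] + w * t[i1][k]
                     for k in range(n_features))
    return feats  # length n_levels * n_features

def fd_derivatives(u, x, h=1e-3):
    """Central finite differences for u'(x) and u''(x).

    The linear interpolation above makes the encoded network only piecewise
    smooth, so autodiff derivatives can be discontinuous; finite differences
    over a stencil of width h sidestep that, as the abstract proposes.
    """
    u_x = (u(x + h) - u(x - h)) / (2 * h)
    u_xx = (u(x + h) - 2 * u(x) + u(x - h)) / (h * h)
    return u_x, u_xx
```

For example, `fd_derivatives(math.sin, 0.5)` approximates `cos(0.5)` and `-sin(0.5)`; in a PINN, `u` would be the hash-encoded network and the returned derivatives would feed the PDE residual term of the loss.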