Operator (biology)
Artificial neural network
Computer science
Leverage (statistics)
Ordinary differential equation
Context (archaeology)
Partial differential equation
Artificial intelligence
Differential equation
Mathematics
Mathematical analysis
Transcription factor
Biology
Gene
Paleontology
Biochemistry
Repressor
Chemistry
Authors
Liu Yang,Siting Liu,Tingwei Meng,Stanley Osher
Source
Journal: Cornell University - arXiv
Date: 2023-04-17
Citations: 1
Identifiers
DOI: 10.1073/pnas.2310142120
Abstract
This paper introduces a new neural-network-based approach, namely In-Context Operator Networks (ICON), to simultaneously learn operators from prompted data and apply them to new questions during the inference stage, without any weight update. Existing methods are limited to using a neural network to approximate a specific equation solution or a specific operator, requiring retraining when switching to a new problem with different equations. By training a single neural network as an operator learner, we can not only get rid of retraining (even fine-tuning) the neural network for new problems, but also leverage the commonalities shared across operators so that only a few demos in the prompt are needed when learning a new operator. Our numerical results show the neural network's capability as a few-shot operator learner for a diverse set of differential equation problems, including forward and inverse problems of ordinary differential equations (ODEs), partial differential equations (PDEs), and mean-field control (MFC) problems, and also show that it can generalize its learning capability to operators beyond the training distribution.
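To make the in-context idea in the abstract concrete, the following is a minimal sketch, not the authors' ICON implementation: a single transformer receives a few demo pairs (condition, quantity of interest) drawn from one operator plus query conditions, and predicts the query values in one forward pass, with no weight update at inference time. The token layout, network sizes, and the toy operator family u(x) = a*x are illustrative assumptions chosen only to show the training/inference pattern.

```python
# Minimal in-context operator-learning sketch (illustrative; not the authors' ICON code).
import torch
import torch.nn as nn


class InContextOperatorNet(nn.Module):
    def __init__(self, d_in: int = 2, d_model: int = 64, n_layers: int = 4):
        super().__init__()
        self.embed = nn.Linear(d_in, d_model)  # each (x, value) pair becomes one token
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)      # predicted QoI value per query token

    def forward(self, demo_tokens: torch.Tensor, query_tokens: torch.Tensor) -> torch.Tensor:
        # demo_tokens : (batch, n_demo, 2)  -- (x, u(x)) pairs that define the operator in context
        # query_tokens: (batch, n_query, 2) -- (x, 0) pairs; the value slot is unknown
        tokens = torch.cat([demo_tokens, query_tokens], dim=1)
        h = self.encoder(self.embed(tokens))
        # read predictions off only at the query positions
        return self.head(h[:, demo_tokens.shape[1]:, :]).squeeze(-1)


def sample_task(batch: int, n_demo: int = 8, n_query: int = 8):
    # Toy operator family u(x) = a * x with one scalar 'a' per batch element;
    # the network must infer 'a' from the demo pairs alone (hypothetical stand-in
    # for the ODE/PDE/MFC operators studied in the paper).
    a = torch.rand(batch, 1) * 2.0
    xd, xq = torch.rand(batch, n_demo), torch.rand(batch, n_query)
    demos = torch.stack([xd, a * xd], dim=-1)
    queries = torch.stack([xq, torch.zeros_like(xq)], dim=-1)
    return demos, queries, a * xq


model = InContextOperatorNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):                     # training is the only place weights change
    demos, queries, target = sample_task(batch=32)
    loss = nn.functional.mse_loss(model(demos, queries), target)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Inference on a new operator instance: only the prompt changes, never the weights.
with torch.no_grad():
    demos, queries, target = sample_task(batch=1)
    pred = model(demos, queries)
    print("prediction:", pred.squeeze().tolist())
    print("target    :", target.squeeze().tolist())
```

The point of the sketch is the division of labor: the training loop exposes the model to many operators so it learns how to read an operator off a prompt, while the inference block learns a new operator instance purely from the demos it is given, mirroring the "few-shot operator learner" behavior described in the abstract.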