Authors
Matthew Lowery, Janet J. Turnage, Zachary Morrow, John Jakeman, Akil Narayan, Shandian Zhe, Varun Shankar
Source
Journal: Cornell University - arXiv
Date: 2024-06-30
Citations: 3
Identifier
DOI: 10.48550/arxiv.2407.00809
Abstract
This paper introduces the Kernel Neural Operator (KNO), a novel operator learning technique that uses deep kernel-based integral operators in conjunction with quadrature for function-space approximation of operators (maps from functions to functions). KNOs use parameterized, closed-form, finitely-smooth, and compactly-supported kernels with trainable sparsity parameters within the integral operators to significantly reduce the number of parameters that must be learned relative to existing neural operators. Moreover, the use of quadrature for numerical integration endows the KNO with geometric flexibility that enables operator learning on irregular geometries. Numerical results demonstrate that on existing benchmarks the training and test accuracy of KNOs is higher than popular operator learning techniques while using at least an order of magnitude fewer trainable parameters. KNOs thus represent a new paradigm of low-memory, geometrically-flexible, deep operator learning, while retaining the implementation simplicity and transparency of traditional kernel methods from both scientific computing and machine learning.
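The core building block the abstract describes — a deep kernel-based integral operator discretized by quadrature — can be illustrated with a minimal sketch. This is not the authors' implementation; the Wendland-type kernel, the node layout, and the `eps` support-radius parameter below are assumptions chosen to show the general pattern: the layer output v(x_i) is a quadrature sum Σ_j w_j K(x_i, y_j) u(y_j) with a compactly supported, finitely smooth kernel whose support radius plays the role of the trainable sparsity parameter.

```python
import numpy as np

def wendland_c2(r, eps):
    """Compactly supported, finitely smooth kernel: exactly zero for r >= eps."""
    s = np.clip(r / eps, 0.0, 1.0)
    return (1.0 - s) ** 4 * (4.0 * s + 1.0)

def kernel_integral_layer(u, y, x, w, eps):
    """Apply a quadrature-discretized integral operator to samples u(y).

    v(x_i) ~= sum_j w_j * K(x_i, y_j) * u(y_j)
    Because the nodes y and x are arbitrary point sets, the same layer works
    on irregular geometries; only the quadrature weights w change.
    """
    r = np.abs(x[:, None] - y[None, :])   # pairwise distances (1-D domain here)
    K = wendland_c2(r, eps)               # kernel matrix, mostly zero for small eps
    return K @ (w * u)                    # quadrature sum over the input nodes

# Toy usage on [0, 1] with trapezoidal quadrature weights (illustrative only).
n = 64
y = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1))
w[0] *= 0.5
w[-1] *= 0.5
u = np.sin(2 * np.pi * y)                 # samples of the input function
x = np.linspace(0.0, 1.0, 32)             # output nodes may differ from input nodes
v = kernel_integral_layer(u, y, x, w, eps=0.2)
```

In the paper's setting such layers are stacked and the kernel parameters (including the support radius) are trained; the compact support is what keeps the parameter count and memory footprint low relative to other neural operators.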