Keywords
Function approximation
Approximation error
Artificial neural network
Polynomial
Modulus of continuity
Unit ball
Smoothness
Nonlinear system
Mathematics
Logarithm
Linear approximation
Function (biology)
Minimax approximation algorithm
Space (punctuation)
Applied mathematics
Mathematical analysis
Computer science
Artificial intelligence
Physics
Type (biology)
Biology
Evolutionary biology
Operating system
Quantum mechanics
Ecology
Authors
Linhao Song, Ying Liu, Jun Fan, Ding-Xuan Zhou
Identifier
DOI:10.1016/j.neunet.2023.07.012
Abstract
In recent years, deep neural networks have been employed to approximate nonlinear continuous functionals F defined on L^p([-1,1]^s) for 1 ≤ p ≤ ∞. However, the existing theoretical analysis in the literature is either unsatisfactory, owing to poor approximation results, or does not apply to the rectified linear unit (ReLU) activation function. This paper investigates the approximation power of functional deep ReLU networks in two settings: F is continuous with restrictions on its modulus of continuity, and F has higher order Fréchet derivatives. A novel functional network structure is proposed to extract the features of higher order smoothness harbored by the target functional F. Quantitative rates of approximation, in terms of the depth, width, and total number of weights of the neural networks, are derived for both settings. We give logarithmic rates when measuring the approximation error on the unit ball of a Hölder space. In addition, we establish nearly polynomial rates, i.e., rates of the form exp(-a(log M)^b) with a > 0 and 0 < b < 1.
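The kind of functional network the abstract describes can be sketched in a few lines: represent the input function u by its values on a finite grid, then apply a deep ReLU network to that sample vector to produce a scalar approximating F(u). The sketch below is a minimal illustration of this discretize-then-network pattern, not the paper's actual construction; the grid size m, the layer widths, the example functional F(u) = ||u||²_{L²}, and the random (untrained) weights are all assumptions chosen for demonstration.

```python
import numpy as np

# A functional network takes a whole function u on [-1, 1] as input and
# outputs a real number approximating F(u). Here the input function is
# represented by its values at m grid points (an illustrative choice),
# and a plain ReLU MLP maps that sample vector to a scalar.

m = 64                              # number of sample points (illustrative)
grid = np.linspace(-1.0, 1.0, m)    # uniform grid on [-1, 1]

def relu(x):
    return np.maximum(x, 0.0)

def init_weights(dims, rng):
    """He-style random initialization for layer sizes dims = [m, w1, ..., 1]."""
    return [
        (rng.standard_normal((dims[i + 1], dims[i])) * np.sqrt(2.0 / dims[i]),
         np.zeros(dims[i + 1]))
        for i in range(len(dims) - 1)
    ]

def functional_relu_net(u_samples, weights):
    """ReLU MLP mapping R^m -> R; depth and width are free hyperparameters."""
    h = u_samples
    for W, b in weights[:-1]:
        h = relu(W @ h + b)
    W, b = weights[-1]
    return float((W @ h + b)[0])

# Example target functional (hypothetical, for demonstration only):
# F(u) = ||u||_{L^2([-1,1])}^2, approximated by a Riemann sum on the grid.
def F(u_samples):
    return float(np.mean(u_samples ** 2) * 2.0)  # interval length is 2

rng = np.random.default_rng(0)
weights = init_weights([m, 32, 32, 1], rng)  # depth/width chosen arbitrarily
u = np.cos(np.pi * grid)                     # one input function, u(x) = cos(pi x)
print("F(u) =", F(u), "| untrained net output =", functional_relu_net(u, weights))
```

Training the weights (e.g., by regression on pairs (u_i, F(u_i))) is what would realize an approximation of F; the abstract's rates quantify how good that approximation can be as a function of the network's depth, width, and total number of weights.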