Conservation law
Artificial neural network
Curse of dimensionality
Parametric statistics
Mathematics
Approximation error
Generalization
Applied mathematics
Scalar (mathematics)
Algorithm
Computer science
Artificial intelligence
Mathematical analysis
Statistics
Geometry
Authors
Tim De Ryck, Siddhartha Mishra
Abstract
We derive rigorous bounds on the error resulting from the approximation of the solution of parametric hyperbolic scalar conservation laws with ReLU neural networks. We show that the approximation error can be made as small as desired with ReLU neural networks that overcome the curse of dimensionality. In addition, we provide an explicit upper bound on the generalization error in terms of the training error, the number of training samples, and the size of the neural network. The theoretical results are illustrated by numerical experiments.
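A key ingredient behind approximation results of this kind is that ReLU networks represent continuous piecewise-linear functions exactly, so hand-chosen weights can reproduce simple piecewise-linear profiles (such as those arising in scalar conservation-law solutions) with zero error. The following minimal sketch is illustrative only and is not the construction from the paper: it builds a one-hidden-layer ReLU network, with three neurons whose biases place kinks at 0, 0.5, and 1, that exactly realizes the "hat" function max(0, min(x, 1 - x)) on the real line. The names `relu` and `hat_net` are our own.

```python
def relu(z):
    # ReLU activation: max(z, 0)
    return max(z, 0.0)

def hat_net(x):
    # One-hidden-layer ReLU network with 3 hidden neurons.
    # Hidden layer: weights all 1, biases 0, -0.5, -1 (kinks at 0, 0.5, 1).
    # Output layer: weights 1, -2, 1.
    # The result equals max(0, min(x, 1 - x)) for every real x.
    return relu(x) - 2.0 * relu(x - 0.5) + relu(x - 1.0)
```

For example, `hat_net(0.5)` returns the peak value 0.5, while inputs outside [0, 1] return 0. Sums of shifted copies of such hat functions reproduce any continuous piecewise-linear interpolant exactly, which is the standard starting point for ReLU approximation-rate arguments.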