Reservoir computing
Curse of dimensionality
Function approximation
Artificial neural network
Generalization
Computer science
Context (archaeology)
Convergence (economics)
Representation (politics)
Class (philosophy)
Echo state network
State (computer science)
State space
Feature (linguistics)
Mathematics
Artificial intelligence
Algorithm
Recurrent neural network
Mathematical analysis
Paleontology
Linguistics
Statistics
Philosophy
Politics
Political science
Law
Economics
Biology
Economic growth
Authors
Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega
Source
Journal: Neural Networks
[Elsevier]
Date: 2024-06-22
Volume/Issue: 179: 106486
Citations: 1
Identifier
DOI:10.1016/j.neunet.2024.106486
Abstract
Reservoir computing approximation and generalization bounds are proved for a new concept class of input/output systems that extends the so-called generalized Barron functionals to a dynamic context. This new class is characterized by the readouts with a certain integral representation built on infinite-dimensional state-space systems. It is shown that this class is very rich and possesses useful features and universal approximation properties. The reservoir architectures used for the approximation and estimation of elements in the new class are randomly generated echo state networks with either linear or ReLU activation functions. Their readouts are built using randomly generated neural networks in which only the output layer is trained (extreme learning machines or random feature neural networks). The results in the paper yield a recurrent neural network-based learning algorithm with provable convergence guarantees that do not suffer from the curse of dimensionality when learning input/output systems in the class of generalized Barron functionals and measuring the error in a mean-squared sense.
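As a rough illustration of the architecture the abstract describes — a randomly generated echo state network with ReLU activation, whose readout is a random-feature network (extreme learning machine) in which only the output layer is trained — the following is a minimal NumPy sketch. All dimensions, scalings, the toy moving-average target, and the ridge regularizer are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions: reservoir size N, input dim d,
# random-feature width m, sequence length T.
N, d, m, T = 100, 1, 200, 500

# Randomly generated reservoir; scaling the operator norm below 1 keeps
# the ReLU state map contractive (ReLU is 1-Lipschitz).
A = rng.normal(size=(N, N))
A *= 0.5 / np.linalg.norm(A, 2)
C = 0.1 * rng.normal(size=(N, d))

# Random-feature readout: inner weights W, b stay random;
# only the output layer W_out is trained.
W = rng.normal(size=(m, N))
b = rng.normal(size=m)

# Toy input/output system: target is a moving average of the input
# (purely illustrative, not an example from the paper).
z = rng.normal(size=(T, d))
y = np.convolve(z[:, 0], np.ones(5) / 5, mode="same")

# Drive the ReLU echo state network and collect random features.
x = np.zeros(N)
Phi = np.empty((T, m))
for t in range(T):
    x = np.maximum(A @ x + C @ z[t], 0.0)      # ReLU state update
    Phi[t] = np.maximum(W @ x + b, 0.0)        # ReLU random features

# Train only the output layer via ridge regression.
lam = 1e-3
W_out = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)
y_hat = Phi @ W_out
```

Because the inner weights are fixed, training reduces to a linear least-squares problem, which is what makes the convergence analysis of such readouts tractable.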