Regression
Artificial neural network
Computer science
Deep neural network
Artificial intelligence
Econometrics
Machine learning
Mathematics
Statistics
Authors
Yuling Jiao, Yang Wang, Bokai Yan
Source
Journal: Cornell University - arXiv
Date: 2024-09-09
Identifier
DOI: 10.48550/arxiv.2409.05577
Abstract
We study the approximation capacity of deep ReLU recurrent neural networks (RNNs) and explore the convergence properties of nonparametric least squares regression using RNNs. We derive upper bounds on the approximation error of RNNs for Hölder smooth functions, in the sense that the output at each time step of an RNN can approximate a Hölder function that depends only on past and current information, termed a past-dependent function. This allows a carefully constructed RNN to simultaneously approximate a sequence of past-dependent Hölder functions. We apply these approximation results to derive non-asymptotic upper bounds on the prediction error of the empirical risk minimizer in regression problems. Our error bounds achieve the minimax optimal rate under both exponentially β-mixing and i.i.d. data assumptions, improving upon existing ones. Our results provide statistical guarantees on the performance of RNNs.
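The regression setup the abstract describes can be made concrete with a small sketch: a deep ReLU RNN whose output at each time step t depends only on inputs up to time t (a past-dependent function), fitted by minimizing the empirical squared loss (least squares ERM). The code below is an illustrative rendering in PyTorch, not the paper's construction; the ReLURNNRegressor class, the synthetic cumulative-mean target, and all dimensions and hyperparameters are hypothetical placeholders.

```python
import torch
import torch.nn as nn

class ReLURNNRegressor(nn.Module):
    """Illustrative deep ReLU RNN for sequence regression (not the paper's construction)."""

    def __init__(self, input_dim=4, hidden_dim=32, num_layers=2):
        super().__init__()
        # nonlinearity='relu' makes this a deep ReLU recurrent network
        self.rnn = nn.RNN(input_dim, hidden_dim, num_layers=num_layers,
                          nonlinearity='relu', batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, x):
        # The hidden state at step t depends only on x_1, ..., x_t, so each
        # output is a past-dependent function of the input sequence.
        h, _ = self.rnn(x)               # (batch, T, hidden_dim)
        return self.head(h).squeeze(-1)  # (batch, T) predictions

# Synthetic data (placeholders): the target at step t depends only on
# inputs up to step t, mimicking a past-dependent regression function.
n, T, d = 128, 20, 4
x = torch.randn(n, T, d)
y = torch.cumsum(x.mean(dim=-1), dim=1) + 0.1 * torch.randn(n, T)

model = ReLURNNRegressor(input_dim=d)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()  # squared loss: minimizing it is least squares ERM

for step in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)  # empirical L2 risk averaged over all time steps
    loss.backward()
    opt.step()
```

The trained model plays the role of the empirical risk minimizer whose prediction error the paper bounds; the paper's results concern the statistical rate of such an estimator, not any particular optimizer.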