Generalization error
Empirical risk minimization
Moment (physics)
Generalization
Statistical learning theory
Computer science
Regression
Shrinkage
Convergence (economics)
Perspective (graphical)
Noise (video)
Applied mathematics
Mathematics
Rate of convergence
Econometrics
Mathematical optimization
Artificial intelligence
Machine learning
Statistics
Stability (learning theory)
Support vector machine
Economic growth
Mathematical analysis
Channel (broadcasting)
Economics
Physics
Computer network
Image (mathematics)
Classical mechanics
Source
Journal: Neurocomputing
[Elsevier BV]
Date: 2022-10-01
Volume: 507, Pages: 191-198
Cited by: 4
Identifiers
DOI:10.1016/j.neucom.2022.08.012
Abstract
In this paper, we study the performance of robust learning with Huber loss. As an alternative to traditional empirical risk minimization schemes, Huber regression has been extensively used in machine learning. A new comparison theorem is established in the paper, which characterizes the gap between the excess generalization error and the prediction error. In addition, we refine the error bounds from the perspective of statistical learning theory and improve the convergence rates in the presence of heavy-tailed noise. It is worth mentioning that a new moment condition $\mathbb{E}[|Y|^{1+\epsilon} \mid X=x] \in L^2_{\rho_X}$ is employed in the analysis of error bounds and learning rates from a theoretical viewpoint.
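For context, the Huber loss itself is not stated in the abstract; a standard parameterization (the scale parameter $\sigma$ is an assumption here, and the paper's exact scaling may differ) is

\[
\ell_\sigma(t) =
\begin{cases}
\dfrac{t^2}{2}, & |t| \le \sigma,\\[4pt]
\sigma |t| - \dfrac{\sigma^2}{2}, & |t| > \sigma,
\end{cases}
\]

which is quadratic for small residuals and only linear for large ones, so heavy-tailed outliers contribute linearly rather than quadratically to the empirical risk; this is the usual motivation for Huber regression under the kind of moment condition quoted above.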