Keywords
Convexity, Sublinear function, Mathematics, Convergence (economics), Mathematical optimization, Kernel (algebra), Mean squared error, Hyperparameter, Computer science, Property (philosophy), Function (biology), Algorithm, Applied mathematics, Statistics, Combinatorics, Financial economics, Economics, Economic growth, Philosophy, Epistemology, Evolutionary biology, Biology
Authors
Xijun Liang, Zhipeng Zhang, Yong Song, Ling Jian
Identifier
DOI: 10.1016/j.ejor.2021.05.002
Abstract
Typical online learning methods have produced fruitful results within the framework of online convex optimization. Meanwhile, nonconvex loss functions have also received considerable attention for their noise resilience and sparsity. Existing nonconvex loss functions are typically designed to be smooth so that optimization algorithms are easier to derive; however, such losses no longer yield sparse support vectors. In this work, we focus on regression with a special type of nonconvex loss function (the canal loss) and propose a kernel-based online regression algorithm, noise-resilient online regression (NROR), to deal with noisy labels. The canal loss is a horizontally truncated loss and therefore retains the merit of sparsity. Although the canal loss is nonconvex and nonsmooth, the regularized canal loss has a convexity-like property called strong pseudo-convexity. Furthermore, a sublinear regret bound for NROR is proved under certain assumptions. Experimental studies show that NROR achieves low prediction errors, in terms of mean absolute error and root mean squared error, on datasets with heavily noisy labels. In particular, we check whether the convergence assumption strictly holds in practice and find that the required assumptions are rarely violated and the convergence rate is unaffected.
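The paper itself gives the formal definitions; purely as an illustration of the ideas in the abstract, the sketch below assumes a canal-style loss of the horizontally truncated epsilon-insensitive form ℓ(r) = min(σ, max(0, |r| − ε)) and a kernelized online (sub)gradient step with L2 regularization. The names `canal_loss` and `OnlineKernelRegressor`, and the parameters `eps`, `sigma`, `eta`, and `lam`, are hypothetical choices for this sketch, not taken from the paper.

```python
import numpy as np

def canal_loss(residual, eps=0.1, sigma=1.0):
    """Assumed canal-style loss: an epsilon-insensitive loss truncated
    horizontally at level sigma (an illustrative form, not necessarily
    the paper's exact definition)."""
    return np.minimum(sigma, np.maximum(0.0, np.abs(residual) - eps))

class OnlineKernelRegressor:
    """Sketch of a kernel-based online regressor with a truncated loss.

    Maintains f(x) = sum_i alpha_i * k(x_i, x). On each round it
    predicts, then takes a regularized (sub)gradient step on the
    truncated loss. Examples whose residual falls in a flat region of
    the loss (|r| <= eps, or past the truncation level) add no new
    support vector, which illustrates the sparsity effect the abstract
    attributes to horizontally truncated losses.
    """

    def __init__(self, gamma=1.0, eps=0.1, sigma=1.0, eta=0.5, lam=1e-3):
        self.gamma, self.eps, self.sigma = gamma, eps, sigma
        self.eta, self.lam = eta, lam      # step size, regularization weight
        self.X, self.alpha = [], []        # support vectors, their weights

    def _kernel(self, x1, x2):
        # Gaussian (RBF) kernel
        return np.exp(-self.gamma * np.sum((np.asarray(x1) - np.asarray(x2)) ** 2))

    def predict(self, x):
        return sum(a * self._kernel(xi, x) for xi, a in zip(self.X, self.alpha))

    def partial_fit(self, x, y):
        r = y - self.predict(x)
        # Shrink existing weights: gradient step on the L2 regularizer.
        self.alpha = [(1.0 - self.eta * self.lam) * a for a in self.alpha]
        # The subgradient of the truncated loss is nonzero only on the
        # sloped band eps < |r| < eps + sigma; outliers beyond the
        # truncation are ignored (noise resilience), and small residuals
        # store nothing (sparsity).
        if self.eps < abs(r) < self.eps + self.sigma:
            self.X.append(np.asarray(x))
            self.alpha.append(self.eta * np.sign(r))
        return r
```

For example, streaming `(x, y)` pairs through `partial_fit` and occasionally calling `predict` mimics the online regression protocol; heavily corrupted labels land beyond the truncation level and leave the model unchanged.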