Keywords
Mathematics, Convex function, Convergence, Proximal gradient method, Function, Regular polygon, Empirical risk minimization, Stochastic gradient descent, Simplicity, Convex optimization, Scheme (mathematics), Gradient method, Applied mathematics, Positive-definite matrix, Gradient descent, Algorithm, Mathematical optimization, Computer science, Mathematical analysis, Artificial neural network, Artificial intelligence, Geometry, Economics, Eigenvector, Philosophy, Physics, Epistemology, Economic growth, Biology, Evolutionary biology, Quantum mechanics
Authors
Tengteng Yu, Xinwei Liu, Yu-Hong Dai, Jie Sun
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems (Institute of Electrical and Electronics Engineers)
Date: 2021-10-01
Volume/Issue: 32 (10): 4627-4638
Citations: 11
Identifiers
DOI: 10.1109/tnnls.2020.3025383
Abstract
We consider the problem of minimizing the sum of an average of a large number of smooth convex component functions and a possibly nonsmooth convex function that admits a simple proximal mapping. This class of problems arises frequently in machine learning, known as regularized empirical risk minimization (ERM). In this article, we propose mSRGTR-BB, a minibatch proximal stochastic recursive gradient algorithm, which employs a trust-region-like scheme to select stepsizes that are automatically computed by the Barzilai–Borwein method. We prove that mSRGTR-BB converges linearly in expectation for strongly and nonstrongly convex objective functions. With proper parameters, mSRGTR-BB enjoys a faster convergence rate than the state-of-the-art minibatch proximal variant of the semistochastic gradient method (mS2GD). Numerical experiments on standard data sets show that the performance of mSRGTR-BB is comparable to and sometimes even better than mS2GD with best-tuned stepsizes and is superior to some modern proximal stochastic gradient methods.
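The abstract's setting is the composite objective min_x (1/n) * sum_i f_i(x) + R(x), where each f_i is smooth and convex and R is convex with an inexpensive proximal mapping. The sketch below illustrates that template with a minibatch proximal gradient step whose stepsize comes from the Barzilai–Borwein (BB1) formula. It is only a schematic under assumed choices: the least-squares loss, the l1 regularizer, and all parameter values are hypothetical, and the trust-region-like stepsize safeguard and recursive gradient estimator that define the paper's mSRGTR-BB are not reproduced here.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal mapping of t * ||.||_1 (soft thresholding), the 'simple
    proximal mapping' the abstract assumes for the nonsmooth term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bb_stepsize(x, x_prev, g, g_prev, fallback):
    """BB1 stepsize <s, s> / <s, y> with s = x - x_prev, y = g - g_prev.
    Falls back to the previous stepsize when <s, y> is not safely positive."""
    s, y = x - x_prev, g - g_prev
    sy = s @ y
    return (s @ s) / sy if sy > 1e-12 else fallback

def minibatch_prox_sgd_bb(A, b, lam=0.1, batch=32, epochs=20, seed=0):
    """Minibatch proximal SGD with a BB stepsize for
        min_x (1/(2n)) * ||A x - b||^2 + lam * ||x||_1.
    A simplified illustration: the paper computes BB quantities from
    safeguarded (trust-region-like) recursive gradient estimates, whereas
    here raw minibatch gradients are used and the stepsize is merely clipped."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    x_prev, g_prev = x.copy(), np.zeros(d)
    eta = 1e-2  # initial stepsize before any BB information exists
    for _ in range(epochs):
        for _ in range(n // batch):
            idx = rng.choice(n, size=batch, replace=False)
            Ai, bi = A[idx], b[idx]
            g = Ai.T @ (Ai @ x - bi) / batch  # minibatch gradient of the smooth part
            eta = bb_stepsize(x, x_prev, g, g_prev, fallback=eta)
            eta = min(max(eta, 1e-4), 1.0)    # crude clip in place of the paper's safeguard
            x_prev, g_prev = x.copy(), g.copy()
            x = prox_l1(x - eta * g, eta * lam)  # proximal (forward-backward) step
    return x

if __name__ == "__main__":
    # Toy usage: recover a sparse vector from noisy linear measurements.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((512, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(512)
    x_hat = minibatch_prox_sgd_bb(A, b)
    print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-3))
```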