Keywords: Convexity, Rate of convergence, Convergence (economics), Computer science, Regular polygon, Simplicity (philosophy), Reduction, Convex function, Mathematical optimization, Applied mathematics, Mathematics, Algorithm, Key (lock), Philosophy, Epistemology, Financial economics, Economic growth, Computer security, Economy, Geometry
Authors
Lam M. Nguyen, Jie Liu, Katya Scheinberg, Martin Takáč
Source
Venue: arXiv (Cornell University)
Date: 2017-03-01 (v1)
Citations: 274
Identifier
DOI: 10.48550/arXiv.1703.00102
Abstract
In this paper, we propose the StochAstic Recursive grAdient algoritHm (SARAH), together with its practical variant SARAH+, as a novel approach to finite-sum minimization problems. Unlike vanilla SGD and other modern stochastic methods such as SVRG, S2GD, SAG, and SAGA, SARAH admits a simple recursive framework for updating stochastic gradient estimates; compared to SAG/SAGA, SARAH does not require storage of past gradients. The linear convergence rate of SARAH is proven under a strong convexity assumption. We also prove a linear convergence rate (in the strongly convex case) for an inner loop of SARAH, a property that SVRG does not possess. Numerical experiments demonstrate the efficiency of our algorithm.
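The recursive estimator the abstract describes is compact enough to sketch. Below is a minimal Python sketch of SARAH's update, v_t = ∇f_{i_t}(w_t) − ∇f_{i_t}(w_{t−1}) + v_{t−1} followed by w_{t+1} = w_t − η·v_t, with an outer loop that restarts from a full gradient. The function name `sarah`, the step size, the loop lengths, and the least-squares toy problem are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def sarah(grad_i, w0, n, eta=0.02, outer_iters=20, inner_iters=None):
    """Sketch of SARAH: each outer loop takes one full gradient (v_0),
    then the inner loop updates the estimate recursively:
        v_t = grad_{i_t}(w_t) - grad_{i_t}(w_{t-1}) + v_{t-1},
        w_{t+1} = w_t - eta * v_t.
    grad_i(w, i) must return the gradient of the i-th component at w.
    """
    if inner_iters is None:
        inner_iters = n
    rng = np.random.default_rng(0)
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(outer_iters):
        w_prev = w.copy()
        v = sum(grad_i(w, i) for i in range(n)) / n  # full gradient, v_0
        w = w - eta * v                              # first inner step
        for _ in range(inner_iters - 1):
            i = int(rng.integers(n))
            # Recursive estimate: reuses the previous v rather than a
            # stored full-gradient anchor (SVRG) or per-sample gradient
            # tables (SAG/SAGA).
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev = w.copy()
            w = w - eta * v
    return w

# Toy usage (an assumed example, not from the paper): least squares,
# f_i(w) = 0.5 * (a_i @ w - b_i)^2, a strongly convex finite sum
# when A has full column rank.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
b = A @ np.ones(5) + 0.01 * rng.standard_normal(200)
w_hat = sarah(lambda w, i: A[i] * (A[i] @ w - b[i]), np.zeros(5), n=200)
print(w_hat)  # should be close to the all-ones vector
```

SARAH+ (not sketched here) makes the inner-loop length adaptive: instead of running a fixed number of inner steps, it returns to the outer loop once the norm of the current estimate v falls below a chosen fraction of the initial full-gradient norm.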