Mathematics
Diagonal
Proximal gradient method
Metric (mathematics)
Stationary point
Applied mathematics
Convergence (mathematics)
Convex function
Function (mathematics)
Rate of convergence
Variable (mathematics)
Variance (statistics)
Mathematical optimization
Regular polygon
Mathematical analysis
Computer science
Key (lock)
Geometry
Computer security
Business
Accounting
Economics
Operations management
Biology
Evolutionary biology
Economic growth
Authors
Tengteng Yu, Xinwei Liu, Yu-Hong Dai, Jie Sun
Abstract
We study the problem of minimizing the sum of two functions, where the first is the average of a large number of nonconvex component functions and the second is a convex (possibly nonsmooth) function that admits a simple proximal mapping. Using a diagonal Barzilai-Borwein stepsize to update the metric, we propose a variable metric proximal stochastic variance reduced gradient method in the mini-batch setting, named VM-SVRG. We prove that VM-SVRG converges sublinearly to a stationary point in expectation, and we further suggest a variant of VM-SVRG that achieves a linear convergence rate in expectation for nonconvex problems satisfying the proximal Polyak-Łojasiewicz inequality. The complexity of VM-SVRG is lower than that of the proximal gradient method and the proximal stochastic gradient method, and matches that of the proximal stochastic variance reduced gradient method. Numerical experiments on standard data sets and comparisons with other state-of-the-art proximal stochastic gradient methods demonstrate the efficiency of the proposed method.
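The abstract states the model only in words. For reference, a standard formulation consistent with that description (the notation here is assumed, not taken from the paper) is

    \min_{x \in \mathbb{R}^d} F(x) := f(x) + h(x), \qquad f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x),

where each f_i is smooth but possibly nonconvex and h is convex with an inexpensive proximal mapping \mathrm{prox}_{\eta h}(z) = \arg\min_y \{ h(y) + \tfrac{1}{2\eta} \|y - z\|^2 \}. The proximal Polyak-Łojasiewicz inequality invoked for the linear-rate variant is, in the form of Karimi, Nutini, and Schmidt (2016): there exists \mu > 0 such that

    \tfrac{1}{2} \mathcal{D}_h(x, L) \ge \mu \bigl( F(x) - F^\ast \bigr), \qquad \mathcal{D}_h(x, \alpha) := -2\alpha \min_y \Bigl\{ \langle \nabla f(x), y - x \rangle + \tfrac{\alpha}{2} \|y - x\|^2 + h(y) - h(x) \Bigr\},

where L is the Lipschitz constant of \nabla f and F^\ast the optimal value.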
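Since the abstract gives only a verbal description of VM-SVRG, the following is a minimal sketch of a variable-metric proximal SVRG loop in the spirit described: mini-batch variance-reduced gradients, a diagonal metric updated by a Barzilai-Borwein-type rule, and a proximal step taken in that metric, illustrated for an l1 regularizer. All names, the particular diagonal BB rule, the safeguards, and the snapshot choice are assumptions for illustration, not the authors' exact algorithm.

    import numpy as np

    def soft_threshold(z, t):
        # Elementwise prox of the l1 term; t may be a vector when the
        # proximal step is taken in a diagonal metric.
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def vm_svrg_sketch(grad_batch, n, x0, lam, epochs=20, batch=10,
                       d_min=1e-3, d_max=1e3, seed=0):
        # grad_batch(x, idx): average gradient of the components f_i, i in idx.
        rng = np.random.default_rng(seed)
        x_tilde = x0.copy()
        d = np.ones_like(x0)                 # diagonal metric D = diag(d)
        x_prev = g_prev = None
        inner = max(n // batch, 1)
        for _ in range(epochs):
            mu = grad_batch(x_tilde, np.arange(n))   # full gradient at snapshot
            if g_prev is not None:
                # Diagonal Barzilai-Borwein-type update: per-coordinate fit of
                # a diagonal matrix to the secant pair (s, y), then safeguarded.
                s, y = x_tilde - x_prev, mu - g_prev
                with np.errstate(divide="ignore", invalid="ignore"):
                    bb = np.where(s != 0.0, y / s, d)
                d = np.clip(bb, d_min, d_max)
            x_prev, g_prev = x_tilde.copy(), mu.copy()
            x = x_tilde.copy()
            for _ in range(inner):
                idx = rng.choice(n, size=batch, replace=False)
                # Mini-batch variance-reduced gradient estimator.
                v = grad_batch(x, idx) - grad_batch(x_tilde, idx) + mu
                # Proximal step in the metric D; separable for l1, so the
                # threshold is lam / d_i per coordinate.
                x = soft_threshold(x - v / d, lam / d)
            x_tilde = x                      # snapshot = last inner iterate
        return x_tilde

Clipping the diagonal to [d_min, d_max] keeps the metric positive definite and bounded, the usual safeguard for BB-type stepsizes; for a separable regularizer such as l1, the proximal step in a diagonal metric remains coordinatewise, with per-coordinate threshold lam / d_i.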