Keywords
Adaptive stepsize, convergence, duality, mathematical optimization, convex function, quadratic function, mathematics, applied mathematics, computer science, mathematical analysis, numerical analysis
Authors
Yüan Shen, Chang Liu, Yannian Zuo, Xingying Zhang
Abstract
The dual ascent method (DAM) is an effective method for solving linearly constrained convex optimization problems. The classical DAM converges extremely slowly because of its small stepsize; He et al. improved it by relaxing the stepsize condition and introducing a self-adaptive stepsize rule, which increases its convergence speed. In this paper, we further relax the stepsize condition while the convergence result can still be guaranteed, provided the objective function is quadratic. We show the encouraging performance of the new DAM with the new stepsize condition through experiments on both synthetic and real problems.
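For context, the sketch below shows the classical dual ascent iteration on a linearly constrained quadratic program, min (1/2) x^T Q x + c^T x subject to Ax = b: the primal variable is minimized in closed form and the dual variable is updated by a gradient ascent step. The function name dual_ascent_quadratic and the simple grow/shrink stepsize heuristic are assumptions made for this illustration only; they are not the self-adaptive stepsize rule of He et al. nor the relaxed condition proposed in this paper.

```python
# Minimal sketch (assumed setup): dual ascent for
#   minimize (1/2) x^T Q x + c^T x   subject to  A x = b,
# with a purely illustrative adaptive stepsize heuristic.
import numpy as np

def dual_ascent_quadratic(Q, c, A, b, alpha=0.1, iters=500, tol=1e-8):
    m, _ = A.shape
    y = np.zeros(m)                  # dual variable (Lagrange multipliers)
    prev_res = np.inf
    for _ in range(iters):
        # Primal step: minimize the Lagrangian over x (closed form for quadratics):
        #   Q x + c + A^T y = 0  =>  x = -Q^{-1}(c + A^T y)
        x = np.linalg.solve(Q, -(c + A.T @ y))
        # Dual step: gradient ascent on the dual function, gradient = A x - b
        r = A @ x - b
        y = y + alpha * r
        # Illustrative heuristic (assumption, not the paper's rule): grow the
        # stepsize while the constraint residual shrinks, otherwise cut it back.
        res = np.linalg.norm(r)
        alpha = alpha * 1.5 if res < prev_res else alpha * 0.5
        prev_res = res
        if res < tol:
            break
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 10, 3
    M = rng.standard_normal((n, n))
    Q = M @ M.T + n * np.eye(n)      # positive definite quadratic term
    c = rng.standard_normal(n)
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    x, y = dual_ascent_quadratic(Q, c, A, b)
    print("constraint residual:", np.linalg.norm(A @ x - b))
```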