Authors
Xinyu Liu,Jie Shen,Xiangxiong Zhang
Abstract
In this paper we propose a new minimization algorithm based on a slightly modified version of the scalar auxiliary variable (SAV) approach, coupled with a relaxation step and an adaptive strategy. It enjoys several distinct advantages over popular gradient-based methods: (i) it is unconditionally energy diminishing with respect to a modified energy that is intrinsically related to the original energy, so no parameter tuning is needed for stability; (ii) it allows the use of large step sizes, which can effectively accelerate the convergence rate. We also present a convergence analysis for some SAV-based algorithms, which includes our new algorithm without the relaxation step as a special case. We apply the new algorithm to several illustrative and benchmark problems and compare its performance with several popular gradient-based methods. The numerical results indicate that the new algorithm is very robust, and that its adaptive version usually converges significantly faster than those popular gradient-descent-based methods.
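To make the idea concrete, here is a minimal sketch of a basic SAV minimization step in the spirit the abstract describes, without the paper's relaxation step or adaptive strategy. It follows the standard SAV discretization of the gradient flow x' = -∇E(x): a scalar auxiliary variable r ≈ √(E(x)+C) is evolved alongside x, its implicit update has a closed form, and the modified energy r² decreases at every step regardless of the step size. The function name `sav_minimize` and the specific test problem are illustrative choices, not the paper's implementation.

```python
import numpy as np

def sav_minimize(E, gradE, x0, dt=1.0, C=1.0, max_iter=500, tol=1e-8):
    """Sketch of a basic SAV scheme for minimizing E (no relaxation step)."""
    x = np.asarray(x0, dtype=float)
    r = np.sqrt(E(x) + C)                    # scalar auxiliary variable
    for _ in range(max_iter):
        g = gradE(x) / np.sqrt(E(x) + C)     # gradient scaled by sqrt(E + C)
        # Implicit update r_{n+1} = r_n - (dt/2) r_{n+1} |g|^2 in closed form;
        # r decreases monotonically, so the modified energy r^2 diminishes
        # unconditionally in dt.
        r = r / (1.0 + 0.5 * dt * np.dot(g, g))
        x = x - dt * r * g                   # linear (explicit) update for x
        if np.linalg.norm(gradE(x)) < tol:
            break
    return x

# Simple quadratic test with minimizer at the origin.
E = lambda x: 0.5 * np.dot(x, x)
grad = lambda x: x
x_star = sav_minimize(E, grad, [3.0, -2.0])
```

Note the stability mechanism: even with a large `dt`, the factor `1/(1 + 0.5*dt*|g|^2)` shrinks `r`, which in turn damps the step `dt * r * g`, so no step-size tuning is needed for the modified energy to decrease. The paper's relaxation step (which ties `r` back to the true energy) and adaptive step-size choice are omitted here.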