Keywords
Acceleration, Speedup, Robustness (evolution), Computer science, Gradient descent, Convergence (economics), Algorithm, Maximization, Minimization, Mathematical optimization, Mathematics, Artificial intelligence, Parallel computing, Biochemistry, Artificial neural network, Classical mechanics, Gene, Economic growth, Physics, Economics, Chemistry
Authors
Bo-Hao Tang, Nicholas C. Henderson, Ravi Varadhan
Abstract
Fixed-point algorithms are popular in statistics and data science due to their simplicity, guaranteed convergence, and applicability to high-dimensional problems. Well-known examples include the expectation-maximization (EM) algorithm, majorization-minimization (MM), and gradient-based algorithms like gradient descent (GD) and proximal gradient descent. A characteristic weakness of these algorithms is their slow convergence. We discuss several state-of-the-art techniques for accelerating their convergence. We demonstrate and evaluate these techniques in terms of their efficiency and robustness across six distinct applications. Among the acceleration schemes, SQUAREM shows robust acceleration with a mean 18-fold speedup. The DAAREM and restarted-Nesterov schemes also demonstrate consistently impressive accelerations. Thus, it is possible to accelerate the original fixed-point algorithm by using one of the SQUAREM, DAAREM, or restarted-Nesterov acceleration schemes. We describe implementation details and software packages to facilitate the application of the acceleration schemes. We also discuss strategies for selecting a particular acceleration scheme for a given problem.
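To make the idea concrete, below is a minimal sketch of the squared-extrapolation step at the heart of SQUAREM, written in Python with NumPy. The function name `squarem`, the choice of the SqS3 step length, and the toy affine map in the demo are illustrative assumptions, not the paper's reference implementation; production implementations (e.g., the R package SQUAREM) add safeguards such as monotonicity control and step-length bounds that are omitted here.

```python
import numpy as np

def squarem(F, x0, tol=1e-8, max_iter=500):
    """Accelerate the fixed-point iteration x <- F(x) with SQUAREM.

    Each cycle forms the first and second differences r and v from two
    F-evaluations, extrapolates with the SqS3 step length
    alpha = -||r|| / ||v||, and stabilizes with a third F-evaluation.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = F(x)                        # F(x)
        r = fx - x                       # first difference, r = F(x) - x
        if np.linalg.norm(r) < tol:      # numerically a fixed point: stop
            break
        ffx = F(fx)                      # F(F(x))
        v = ffx - 2.0 * fx + x           # second difference
        vn = np.linalg.norm(v)
        if vn == 0.0:                    # degenerate step: fall back to F(F(x))
            x = ffx
            continue
        alpha = -np.linalg.norm(r) / vn
        x_sq = x - 2.0 * alpha * r + alpha**2 * v   # squared extrapolation
        x = F(x_sq)                      # stabilizing F-evaluation
    return x

# Demo on a slowly contracting affine map x <- 0.99 x + b, whose fixed
# point is b / 0.01.  Plain iteration needs roughly 1800 steps for
# 8 digits; SQUAREM solves an affine map in a single cycle.
b = np.array([1.0, -0.5, 2.0])
F = lambda x: 0.99 * x + b
print(squarem(F, np.zeros(3)))           # approx. [100., -50., 200.]
```

Note that the outer loop is agnostic to the map being accelerated: DAAREM and restarted-Nesterov schemes wrap a fixed-point map `F` in the same way and differ only in the extrapolation rule, which is why the schemes can be swapped behind a common interface when comparing them on a given problem.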