Keywords: Broyden–Fletcher–Goldfarb–Shanno algorithm; Hessian matrix; quasi-Newton method; diagonal approximation; nonlinear least squares; benchmark; acceleration; nonlinear systems; convergence; mathematical optimization; least-squares function approximation; Newton's method
Authors
Duc Quoc Huynh, Feng Nan Hwang
Identifier
DOI:10.1016/j.cam.2023.115718
Abstract
Our study focuses on exploring new variants of the structured quasi-Newton (SQN) method with a secant-like diagonal approximation (SLDA) of the second-order term of the Hessian for solving nonlinear least squares (NLS) problems. In addition, an accelerated version of SQN-SLDA, referred to as ASQN-SLDA, is also considered. In ASQN-SLDA, we rescale the search direction after the first backtracking linesearch procedure to produce a more aggressive step and increase the reduction in the objective function value. The concept of the proposed methods is simple and easy to implement. We prove that the proposed methods are globally convergent under some appropriate assumptions and report several numerical experiments based on a suite of benchmark problems for NLS. The numerical results show that ASQN-SLDA is more robust than some baseline methods, including SQN-SLDA, the generalized Gauss–Newton (GN) method, the Levenberg–Marquardt update, and the hybrid GN with Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, also called H-GN-BFGS. In terms of computing time, ASQN-SLDA outperforms H-GN-BFGS for most test problems, and the speedup becomes more significant as the problem size increases. Furthermore, due to the trade-off between the number of iterations needed for convergence and the overhead incurred in ASQN-SLDA, the benefit of the acceleration step is most evident for the largest-sized problems.
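To make the structured quasi-Newton idea concrete, the sketch below minimizes f(x) = ½‖r(x)‖² by combining the Gauss–Newton term JᵀJ with a diagonal matrix D that stands in for the second-order term S(x) = Σᵢ rᵢ∇²rᵢ, updated by a componentwise secant-like rule. This is only an illustrative approximation under assumed details: the function names (`sqn_diag_nls`, `backtracking`) and the exact diagonal update are hypothetical, not the paper's SLDA formula, and the acceleration (direction-rescaling) step of ASQN-SLDA is omitted.

```python
import numpy as np

def backtracking(phi, f0, slope, beta=0.5, c=1e-4, max_steps=40):
    """Armijo backtracking: shrink t until sufficient decrease holds."""
    t = 1.0
    for _ in range(max_steps):
        if phi(t) <= f0 + c * t * slope:
            break
        t *= beta
    return t

def sqn_diag_nls(r, J, x0, tol=1e-8, max_iter=500):
    """Structured quasi-Newton sketch for min 0.5*||r(x)||^2.

    The Hessian of f is J^T J + S, where S collects second derivatives of
    the residuals. Here S is replaced by a diagonal matrix D maintained
    with a secant-like rule (an illustrative stand-in, not the paper's
    exact SLDA update).
    """
    x = np.asarray(x0, dtype=float).copy()
    D = np.zeros(x.size)              # diagonal second-order correction
    rx, Jx = r(x), J(x)
    g = Jx.T @ rx                     # gradient of f(x) = 0.5*||r(x)||^2
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # structured Hessian approximation: Gauss-Newton part + clamped diagonal
        B = Jx.T @ Jx + np.diag(np.maximum(D, 0.0))
        p = np.linalg.solve(B, -g)    # descent direction when B is SPD
        f0 = 0.5 * rx @ rx
        phi = lambda t: 0.5 * np.sum(r(x + t * p) ** 2)
        t = backtracking(phi, f0, g @ p)
        s = t * p
        x_new = x + s
        r_new, J_new = r(x_new), J(x_new)
        g_new = J_new.T @ r_new
        # secant-like diagonal update: D*s should match the part of the
        # gradient change not captured by the Gauss-Newton term
        y = g_new - g - (J_new.T @ J_new) @ s
        with np.errstate(divide="ignore", invalid="ignore"):
            D = np.where(np.abs(s) > 1e-12, y / s, D)
        x, rx, Jx, g = x_new, r_new, J_new, g_new
    return x
```

On a zero-residual test problem such as the Rosenbrock residuals r(x) = (10(x₂ − x₁²), 1 − x₁), the diagonal correction vanishes as the residuals do, so the iteration behaves like Gauss–Newton near the solution while the Armijo line search globalizes it.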