Line search
Conjugate gradient method
Algorithm
Gradient descent
Descent direction
Nonlinear conjugate gradient method
Inverse problem
Computer science
Rate of convergence
Backward
Mathematical optimization
Inversion (geology)
Stochastic gradient descent
Mathematics
Artificial neural network
Artificial intelligence
Key (lock)
Mathematical analysis
Geometry
Paleontology
Computer security
Structural basin
Radius
Biology
Authors
Lian Liu, Bo Yang, Yi Zhang, Yixian Xu, Zhong Peng, Feng Wang
Identifiers
DOI: 10.1109/tgrs.2023.3239106
Abstract
The nonlinear conjugate gradient (NLCG) algorithm is one of the popular linearized methods for solving the frequency-domain electromagnetic (EM) geophysical inverse problem. During NLCG iterations, the model gradient guides the search direction while a line-search algorithm determines the step length of each iteration. The line search typically requires solving the corresponding forward problem several times and is therefore computationally inefficient. To accelerate the solution of the frequency-domain EM inverse problem within the linearized framework, we introduce the adaptive gradient descent (AGD) algorithm, a variant of the classical gradient descent method that is well developed and widely used in deep learning. Instead of a time-consuming line search, its core idea is to algebraically combine the accumulated gradients and model updates from previous iterations to estimate the model parameter update at the current iteration. For the inversion of magnetotelluric (MT) data, we designed and implemented a framework that combines the AGD algorithm with a cool-down scheme for tuning the regularization parameter. To improve the convergence of the AGD algorithms [specifically Adam and root-mean-square propagation (RMSProp)], we proposed a tolerance strategy and tested it numerically. To optimize the global learning rate, we carried out comparative trials within the proposed inversion framework. Inversions of synthetic and real-world data show that both AGD algorithms (Adam and RMSProp) recover results comparable to those of the NLCG algorithm while saving more than a third of the CPU time.
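As a concrete illustration of the "accumulated gradients and updates" idea the abstract describes, below is a minimal sketch of the Adam update applied to a toy quadratic misfit. The function name, hyperparameter values, and the quadratic test objective are illustrative assumptions, not code or settings from the paper; RMSProp corresponds to dropping the first-moment accumulator and its bias correction.

```python
import numpy as np

def adam_update(model, grad, state, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step: no line search, only running averages of past gradients.

    Illustrative sketch; hyperparameters are common defaults, not the paper's.
    """
    # Exponential moving averages of the gradient (first moment) and of its
    # element-wise square (second moment).
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1.0 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1.0 - beta2) * grad * grad
    # Bias-correct both moments (RMSProp omits the first moment and this
    # correction), then scale the step element-wise by the RMS gradient.
    m_hat = state["m"] / (1.0 - beta1 ** state["t"])
    v_hat = state["v"] / (1.0 - beta2 ** state["t"])
    return model - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy misfit: recover a hypothetical "true model" by following the gradient
# of the quadratic objective ||model - true_model||^2 (no forward solves).
true_model = np.array([1.0, -2.0, 0.5])
model = np.zeros(3)
state = {"t": 0, "m": np.zeros(3), "v": np.zeros(3)}
for _ in range(1000):
    grad = 2.0 * (model - true_model)
    model = adam_update(model, grad, state)
```

Because each iteration costs only a few vector operations on quantities already computed, replacing the line search (and its extra forward solves) with such an update is where the reported CPU-time savings come from.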