Topics
Mathematics, Descent (aeronautics), Convergence (economics), Context (archaeology), Applied mathematics, Coordinate descent, Descent direction, Least-squares function approximation, Inverse problem, Mathematical optimization, Simplicity (philosophy), Inverse, Linear least squares, Stochastic gradient descent, Gradient descent, Linear model, Computer science, Mathematical analysis, Statistics, Artificial intelligence, Philosophy, Estimator, Aerospace engineering, Economic growth, Engineering, Biology, Paleontology, Geometry, Epistemology, Artificial neural network, Economics
Authors
Dirk A. Lorenz,Felix Schneppe,Lionel Tondji
Source
Journal: Inverse Problems
[IOP Publishing]
Date: 2023-11-02
Volume 39, Issue 12, Article 125019
Identifier
DOI:10.1088/1361-6420/ad08ed
Abstract
We consider the problem of solving linear least squares problems in a framework where only evaluations of the linear map are possible. We derive randomized methods that need no matrix operations other than forward evaluations; in particular, no evaluation of the adjoint map is needed. Our method is motivated by the simple observation that one can get an unbiased estimate of the application of the adjoint. We show convergence of the method and then derive a more efficient method that uses an exact linesearch. This method, called random descent, resembles known methods in other contexts and has the randomized coordinate descent method as a special case. We provide a convergence analysis of the random descent method, emphasizing the dependence on the underlying distribution of the random vectors. Furthermore, we investigate the applicability of the method in the context of ill-posed inverse problems and show that the method can have beneficial properties when the unknown solution is rough. We illustrate the theoretical findings in numerical examples. One particular result is that the random descent method actually outperforms established transpose-free methods (TFQMR and CGS) in examples.
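The abstract's key idea can be sketched in a few lines: to minimize ||Ax - b||^2 using only forward evaluations of A, draw a random direction d each iteration and take the exact-linesearch step along d, which needs only A applied to d and the current residual, never the adjoint. The following is a minimal illustrative sketch based on that description, not the authors' reference implementation; the function names and the choice of a Gaussian direction distribution are assumptions (drawing random standard basis vectors instead would recover randomized coordinate descent, the special case mentioned in the abstract).

```python
import numpy as np

def random_descent(A_apply, b, x0, n_iters=500, rng=None):
    """Adjoint-free random descent for min ||A x - b||^2.

    Only forward evaluations A_apply(v) are used; no adjoint is formed.
    Sketch after the abstract: draw a random direction d, then take the
    exact-linesearch step along d.
    """
    rng = np.random.default_rng(rng)
    x = x0.copy()
    r = A_apply(x) - b                      # residual, updated incrementally
    for _ in range(n_iters):
        d = rng.standard_normal(x.shape)    # Gaussian direction (one choice;
                                            # a random basis vector gives
                                            # randomized coordinate descent)
        Ad = A_apply(d)                     # one forward evaluation
        denom = Ad @ Ad
        if denom == 0.0:
            continue
        t = -(Ad @ r) / denom               # exact linesearch along d
        x += t * d
        r += t * Ad                         # keep residual consistent
    return x
```

The exact step size t minimizes ||A(x + t d) - b||^2 in closed form, so each iteration projects the current residual off the direction A d; tracking r incrementally keeps the cost at one forward evaluation per step.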