Concepts
Algorithm
Computer science
Computational complexity theory
Gradient descent
Descent direction
Smoothness
Line search
Iterative reconstruction
Grid
Snapshot (computer storage)
Least-squares function approximation
Mathematical optimization
Estimator
Mathematics
Artificial intelligence
Computer vision
Artificial neural network
Statistics
Operating system
Computer security
Radius
Geometry
Authors
Ruizhe Shi, Zhe Zhang, Xiaolan Qiu, Chibiao Ding
Source
Journal: IEEE Transactions on Geoscience and Remote Sensing
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume/Pages: 61: 1-13
Identifiers
DOI:10.1109/tgrs.2023.3273568
Abstract
This paper presents a novel efficient method for the gridless line spectrum estimation problem with a single snapshot and sparse signals, namely the gradient descent least squares (GDLS) method. Conventional single-snapshot (a.k.a. single measurement vector, or SMV) line spectrum estimation methods either rely on smoothing techniques that sacrifice range and/or azimuth resolution, or adopt a sparsity constraint and apply compressed sensing (CS) over predefined grids, resulting in the off-grid problem. Recently emerged atomic norm minimization (ANM) methods achieve gridless SMV line spectrum estimation, but their computational complexity is extremely high, making them practically infeasible in real applications with large problem scales. The proposed GDLS method reformulates the line spectrum estimation problem as a least squares (LS) estimation problem and minimizes the corresponding objective function efficiently via an iterative gradient descent algorithm. The convergence guarantee, computational complexity, and performance analysis for the evenly distributed antenna array case are discussed in this paper. Numerical simulations show that the proposed GDLS algorithm outperforms state-of-the-art methods, e.g., CS and ANM, in terms of estimation performance. It completely avoids the off-grid problem, and its computational complexity is significantly lower than that of ANM. The method has been tested in tomographic SAR (TomoSAR) imaging applications on simulated and real experimental data. Results show great potential of the proposed method in terms of better point cloud quality and elimination of the gridding effect.
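To illustrate the idea described in the abstract (not the authors' exact GDLS algorithm), the following minimal sketch fits a single-snapshot line spectrum model y[n] = Σ_k c_k·exp(j2πf_k·n) by least squares: the complex amplitudes c are obtained in closed form, while the continuous frequencies f are refined by gradient descent on the squared residual, so no prior grid is ever defined. The function name, step size, iteration count, and initialization are all placeholder assumptions for this toy example.

```python
import numpy as np

def gdls_sketch(y, f_init, lr_f=1e-6, iters=500):
    """Toy gridless LS fit of y[n] = sum_k c_k * exp(2j*pi*f_k*n).

    Amplitudes c: closed-form linear least squares per iteration.
    Frequencies f: gradient descent on L(f) = ||y - E(f) c||^2,
    treated as continuous variables (hence gridless). Hyperparameters
    here are illustrative, not the paper's values.
    """
    N = len(y)
    n = np.arange(N)
    f = np.asarray(f_init, dtype=float).copy()
    for _ in range(iters):
        E = np.exp(2j * np.pi * np.outer(n, f))       # N x K steering matrix
        c, *_ = np.linalg.lstsq(E, y, rcond=None)     # amplitudes by linear LS
        r = y - E @ c                                 # residual
        D = (2j * np.pi * n)[:, None] * E             # dE/df, column-wise
        # dL/df_k = -2 Re{ conj(c_k) * d_k^H r }
        g = -2.0 * np.real(np.conj(c) * (D.conj().T @ r))
        f = f - lr_f * g                              # gradient step on frequencies
    return f, c

# Two complex sinusoids, one snapshot, noiseless, off any grid.
n = np.arange(32)
y = 1.0 * np.exp(2j * np.pi * 0.10 * n) + 0.8j * np.exp(2j * np.pi * 0.30 * n)
f_est, c_est = gdls_sketch(y, f_init=[0.097, 0.303])
```

Because the loss is non-convex in f, a coarse initialization (e.g., FFT peaks) is assumed; the gradient step then refines the frequencies continuously, which is what removes the off-grid quantization error that grid-based CS methods suffer from.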