Compressive sensing
Computer science
Algorithm
Nyquist rate
Cramér–Rao bound
Kernel (algebra)
Mathematics
Bayesian probability
Radar
Estimation theory
Sampling (signal processing)
Artificial intelligence
Telecommunications
Computer vision
Filter (signal processing)
Combinatorics
Authors
Yujie Gu, Nathan A. Goodman
Identifier
DOI:10.1109/tsp.2017.2706187
Abstract
With the adoption of arbitrary and increasingly wideband signals, the design of modern radar systems continues to be limited by analog-to-digital converter technology and data throughput bottlenecks. Meanwhile, compressive sensing (CS) promises to reduce sampling rates below the Nyquist rate for some applications by constraining the set of possible signals. In many practical applications, detailed prior knowledge on the signals of interest can be learned from training data, existing track information, and/or other sources, which can be used to design better compressive measurement kernels. In this paper, we use an information-theoretic approach to optimize CS kernels for time delay estimation. The measurements are modeled via a Gaussian mixture model by discretizing the a priori probability distribution of the time delay. The optimal CS kernel that approximately maximizes the Shannon mutual information between the measurements and the time delay is then found by a gradient-based search. Furthermore, we also derive the Bayesian Cramér-Rao bound (CRB) for time delay estimation as a function of the CS kernel. In numerical simulations, we compare the performance of the proposed optimal sensing kernels to random projections and the Bayesian CRB. Simulation results demonstrate that the proposed technique for sensing kernel optimization can significantly improve performance, which is consistent with the Bayesian CRB versus signal-to-noise ratio (SNR). Finally, we use the Bayesian CRB expressions and simulation results to make conclusions about the usefulness of CS in radar applications. Specifically, we discuss CS SNR loss versus resolution improvement in SNR- and resolution-limited scenarios.
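The sketch below is a minimal numerical illustration of the ideas described in the abstract, not the authors' implementation: compressive measurements of a delayed pulse are modeled as a Gaussian mixture over a discretized time-delay prior, the Shannon mutual information I(y; tau) is estimated by Monte Carlo, and the sensing kernel is improved by a crude gradient ascent (finite differences here, rather than the paper's gradient computation). The signal sizes, pulse shape, noise level, and optimizer settings are all illustrative assumptions.

# Minimal sketch (NumPy only), assuming a Gaussian pulse, a uniform discretized
# delay prior, and measurements y = Phi (s_tau + n) with white Gaussian noise n.
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 32, 4, 16        # signal length, compressive measurements, delay hypotheses
sigma2 = 0.05              # noise variance (before compression); illustrative value
t = np.arange(N)
delays = np.linspace(4, N - 8, K)                              # discretized delay support
S = np.exp(-0.5 * ((t[None, :] - delays[:, None]) / 2.0) ** 2)  # K x N delayed-pulse dictionary
prior = np.full(K, 1.0 / K)                                     # uniform prior over delays

# Fixed Monte Carlo draws (common random numbers) so the MI estimate is a smooth function of Phi.
n_mc = 400
mc_k = rng.choice(K, size=n_mc, p=prior)
mc_noise = rng.normal(scale=np.sqrt(sigma2), size=(n_mc, N))

def mutual_information(Phi):
    """Monte Carlo estimate of I(y; tau) for y = Phi (s_tau + n), n ~ N(0, sigma2 I)."""
    Sigma = sigma2 * Phi @ Phi.T                   # common covariance of every mixture component
    Sigma_inv = np.linalg.inv(Sigma)
    _, logdet = np.linalg.slogdet(Sigma)
    # Conditional entropy H(y | tau): identical Gaussian entropy for every component.
    h_cond = 0.5 * (M * np.log(2 * np.pi * np.e) + logdet)
    mu = S @ Phi.T                                 # K x M component means
    y = (S[mc_k] + mc_noise) @ Phi.T               # n_mc x M simulated measurements
    d = y[:, None, :] - mu[None, :, :]             # n_mc x K x M residuals
    log_gauss = -0.5 * (np.einsum('skm,mn,skn->sk', d, Sigma_inv, d)
                        + M * np.log(2 * np.pi) + logdet)
    log_mix = np.logaddexp.reduce(np.log(prior)[None, :] + log_gauss, axis=1)
    h_marg = -log_mix.mean()                       # Monte Carlo estimate of H(y)
    return h_marg - h_cond

def finite_diff_grad(f, Phi, eps=1e-4):
    """Numerical gradient of f with respect to the sensing-kernel entries (slow but simple)."""
    g = np.zeros_like(Phi)
    f0 = f(Phi)
    for idx in np.ndindex(*Phi.shape):
        P = Phi.copy()
        P[idx] += eps
        g[idx] = (f(P) - f0) / eps
    return g

# Start from a random projection (the baseline the paper compares against) and run a few
# normalized gradient-ascent steps on the MI estimate. Row renormalization only keeps Sigma
# well conditioned: the MI of y = Phi (s + n) is invariant to rescaling the rows of Phi.
Phi = rng.normal(size=(M, N)) / np.sqrt(N)
print(f"random kernel:    I(y; tau) ~ {mutual_information(Phi):.3f} nats")
step = 0.05
for it in range(40):
    g = finite_diff_grad(mutual_information, Phi)
    Phi = Phi + step * g / (np.linalg.norm(g) + 1e-12)
    Phi /= np.linalg.norm(Phi, axis=1, keepdims=True)
print(f"optimized kernel: I(y; tau) ~ {mutual_information(Phi):.3f} nats")

In this toy setup the mutual information is bounded by log(K) nats, so the printed values give a rough sense of how much delay information a given kernel preserves; the paper's Bayesian Cramér-Rao bound analysis is not reproduced here.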