Bistatic radar
Chirp
Computer science
Radar
Pulse-Doppler radar
Matched filter
Continuous-wave radar
Doppler effect
Radar imaging
Filter (signal processing)
Acoustics
Algorithm
Physics
Optics
Computer vision
Telecommunications
Laser
Astronomy
Authors
Tong Ding, Juan Zhang, Shiyang Tang, Linrang Zhang, Yachao Li
Source
Journal: IEEE Transactions on Geoscience and Remote Sensing
[Institute of Electrical and Electronics Engineers]
Date: 2022-01-01
Volume/pages: 60: 1-15
Citations: 5
Identifier
DOI: 10.1109/tgrs.2022.3186012
Abstract
A bistatic radar with a large time-bandwidth product offers several advantages for high-resolution target detection and motion parameter estimation, but when detecting high-speed targets, a scale factor and an inner-pulse Doppler shift are also introduced into the radar echoes. Under this condition, the conventional matched filter suffers a certain energy loss and a non-negligible shift of the range center, which is called the mismatch effect. Considering the special geometry of bistatic radar, we first establish a precise echo signal model to describe the radar echoes of high-speed targets in a space-air based bistatic radar system. By analyzing the mismatch effect within each pulse and the migration between pulses, we obtain the mathematical relationship between the scale factor, inner-pulse Doppler shift, range, equivalent velocity, and acceleration. We then observe that the parameters of the matched filter depend on the target motion parameters, which are unknown a priori; this inspires us to propose an iterative coherent integration method that achieves both intra-pulse and inter-pulse integration. A matched filter based on the precise echo signal model is defined with initial parameters, and a rough motion parameter estimate is obtained via the precise echo signal model-based Keystone transform and an inner-pulse chirp Fourier transform. The estimated parameters are then used to construct the matched filter and perform coherent integration. This process is repeated over multiple iterations to yield an accurate motion parameter result. Finally, a target detection experiment demonstrates the effectiveness of the proposed method using a space-air based bistatic radar.
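The mismatch effect the abstract describes can be illustrated numerically. The sketch below is not the authors' model: it uses a simplified monostatic-style Doppler approximation (scale factor `alpha = 1 - 2v/c`, inner-pulse Doppler `fd = 2*v*fc/c`) rather than the paper's bistatic geometry, and all radar parameters (`fs`, `T`, `B`, `fc`, `v`) are assumed values chosen only to make the effect visible. It shows the two symptoms named in the abstract: the conventional matched filter's peak energy drops, and the correlation peak (range center) shifts away from its true position through LFM range-Doppler coupling.

```python
import numpy as np

# Assumed illustrative parameters (not from the paper).
fs = 200e6           # sample rate, Hz
T = 50e-6            # pulse width, s
B = 100e6            # bandwidth, Hz  -> time-bandwidth product ~ 5000
k = B / T            # LFM chirp rate, Hz/s
fc = 10e9            # carrier frequency, Hz
c = 3e8              # speed of light, m/s
v = 7000.0           # target radial velocity, m/s (space-based scale)

t = np.arange(0, T, 1 / fs)
tx = np.exp(1j * np.pi * k * t**2)            # transmitted LFM (chirp) pulse

# High-speed-target echo, simplified model: the time axis is scaled by
# alpha and an inner-pulse Doppler ramp fd appears (monostatic
# approximation; the paper derives the bistatic counterparts).
alpha = 1 - 2 * v / c
fd = 2 * v * fc / c
rx = np.exp(1j * np.pi * k * (alpha * t)**2) * np.exp(-1j * 2 * np.pi * fd * t)

# Conventional matched filter, built for the *stationary-target* echo tx.
mf = np.conj(tx[::-1])
out_matched = np.abs(np.convolve(tx, mf))     # ideal (matched) response
out_mismatch = np.abs(np.convolve(rx, mf))    # high-speed (mismatched) response

loss_db = 20 * np.log10(out_mismatch.max() / out_matched.max())
shift = int(np.argmax(out_mismatch)) - int(np.argmax(out_matched))
print(f"peak loss: {loss_db:.2f} dB, range-center shift: {shift} samples")
```

The peak shift is approximately `fd / k` seconds (the range-Doppler coupling of an LFM pulse), and the residual quadratic phase error from the scale factor broadens the peak and costs energy; both grow with the time-bandwidth product, which is why the paper's iterative, motion-compensated matched filter is needed for large time-bandwidth-product systems.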