Jianglin Shi, Rongzhi Zhang, Shiping Guo, Yikang Yang, Rongbin Xu, Wei Niu, Jisheng Li
Source
Journal: Optical Engineering [SPIE - International Society for Optical Engineering] · Date: 2019-09-17 · Volume/Issue: 58 (09): 1-1 · Cited by: 6
Identifier
DOI:10.1117/1.oe.58.9.093102
Abstract
To obtain sharp images of space targets, high-accuracy restoration of degraded images corrected by an adaptive optics (AO) system is necessary. Existing algorithms are mainly based on physical constraints on both the image and the point-spread function (PSF), which are usually estimated in an alternating iterative manner and therefore take a long time to restore blurred images. We propose an end-to-end blind restoration method for ground-based space target images based on a conditional generative adversarial network that requires no PSF estimation. The network consists of two parts, a generator and a discriminator, which together learn the atmospheric degradation process and produce restored images. To train the network, a simulated AO image dataset containing 4800 sharp–blur image pairs is constructed from 80 three-dimensional models of space targets combined with atmospheric-turbulence degradation. Experimental results demonstrate that the proposed method not only enhances the restoration accuracy but also improves the restoration efficiency for single-frame object images.
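The abstract describes an end-to-end generator/discriminator pair trained on sharp–blur image pairs without any PSF estimation. Below is a minimal sketch of what one training step of such a conditional GAN could look like, assuming a PyTorch implementation; the layer choices, the adversarial weight `adv_w`, the L1 content loss, and the optimizer settings are illustrative assumptions, not details taken from the paper.

```python
# Minimal conditional-GAN training step for blind image restoration (sketch).
# The generator maps a turbulence-blurred image directly to a restored image;
# the discriminator judges sharp vs. restored images conditioned on the blurred
# input. All architectural details here are placeholders.
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Toy convolutional restorer standing in for the restoration generator."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, blurred):
        return self.net(blurred)


class Discriminator(nn.Module):
    """Conditional discriminator: scores (blurred, candidate) image pairs."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )

    def forward(self, blurred, candidate):
        return self.net(torch.cat([blurred, candidate], dim=1))


def train_step(G, D, opt_G, opt_D, blurred, sharp, adv_w=0.01):
    bce = nn.BCEWithLogitsLoss()
    l1 = nn.L1Loss()

    # Discriminator update: real (sharp) vs. generated (restored) samples.
    restored = G(blurred).detach()
    d_real = D(blurred, sharp)
    d_fake = D(blurred, restored)
    loss_D = (bce(d_real, torch.ones_like(d_real))
              + bce(d_fake, torch.zeros_like(d_fake)))
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()

    # Generator update: fool the discriminator while staying close to the
    # ground-truth sharp image (L1 content term).
    restored = G(blurred)
    d_fake = D(blurred, restored)
    loss_G = adv_w * bce(d_fake, torch.ones_like(d_fake)) + l1(restored, sharp)
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
    return loss_D.item(), loss_G.item()


if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
    blurred = torch.rand(4, 1, 64, 64)  # stand-in for AO-corrected blurred frames
    sharp = torch.rand(4, 1, 64, 64)    # stand-in for ground-truth sharp frames
    print(train_step(G, D, opt_G, opt_D, blurred, sharp))
```

In this sketch the blurred image plays the role of the condition for both networks, which is what makes the restoration "end-to-end": the degradation is learned implicitly by the generator rather than being modeled through an explicit PSF.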