Fang Zhou, Xiangxiang Xu, Xin Li, Huizhen Yang, Chenglong Gong
Source
Proceedings: AOPC 2020: Optical Sensing and Imaging Technology; Date: 2020-11-05; Volume/Pages: 163650: 174-174; Cited by: 3
Identifier
DOI:10.1117/12.2579991
Abstract
SPGD (stochastic parallel gradient descent) is a control algorithm widely used in wavefront sensorless (WFSless) adaptive optics (AO) systems. In the traditional SPGD algorithm, the gain is commonly set to a fixed value. As the number of deformable mirror (DM) actuators increases, the optimization space of the algorithm grows larger, which easily leads to slow convergence and a higher probability of falling into local optima. The Adam (adaptive moment estimation) optimizer is an optimized stochastic gradient descent algorithm commonly used in deep learning, with the advantage of providing an adaptive gain. Taking wavefront aberrations under different turbulence strengths as correction objects, WFSless AO systems are built with 32-, 61-, 97- and 127-element DMs as wavefront correctors. Results show that the optimized algorithm converges faster than the basic SPGD and that its probability of falling into local optima decreases. The system's convergence speed is increased by about 30%. The probability of falling into local optima is decreased by 29.8%, 30.3%, 32.6% and 35.9% respectively under D/r0 = 5, and by 28.8%, 30.5%, 33.3% and 34.5% respectively under D/r0 = 15. The advantages of the optimized algorithm become more obvious as the number of DM actuators increases. The above results provide a theoretical basis for the practical application of the SPGD algorithm based on Adam optimization.
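The abstract does not give implementation details, but the combination it describes can be sketched as follows: a standard two-sided SPGD perturbation step produces a stochastic gradient estimate, which is then fed through Adam's moment estimates so the per-actuator gain adapts over iterations. Everything below is an illustrative assumption — the function name `spgd_adam`, the Bernoulli ±sigma perturbation scheme, and all parameter values are not taken from the paper.

```python
import numpy as np

def spgd_adam(metric, n_act, iters=500, sigma=0.05,
              alpha=0.1, beta1=0.9, beta2=0.999, eps=1e-8, rng=None):
    """Sketch of SPGD with an Adam-style adaptive gain (illustrative, not the authors' code).

    metric : callable mapping DM actuator voltages -> scalar performance
             metric J (e.g. a far-field image-quality metric), to be maximized.
    n_act  : number of DM actuators (32, 61, 97 or 127 in the paper's simulations).
    """
    rng = np.random.default_rng() if rng is None else rng
    u = np.zeros(n_act)   # DM control voltages
    m = np.zeros(n_act)   # Adam first-moment estimate
    v = np.zeros(n_act)   # Adam second-moment estimate
    for t in range(1, iters + 1):
        # Random Bernoulli +/- sigma perturbation applied to all actuators in parallel
        du = sigma * rng.choice([-1.0, 1.0], size=n_act)
        # Two-sided metric difference gives the stochastic gradient estimate
        dJ = metric(u + du) - metric(u - du)
        g = dJ * du
        # Adam moment updates with bias correction replace the fixed SPGD gain
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        # Gradient ascent on J; alpha is scaled per-actuator by sqrt(v_hat)
        u = u + alpha * m_hat / (np.sqrt(v_hat) + eps)
    return u
```

In this sketch the effective step size alpha / (sqrt(v_hat) + eps) shrinks for actuators with large recent gradient variance and grows for quiet ones, which is the adaptive-gain behavior the abstract credits for faster convergence and fewer local optima than fixed-gain SPGD.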