Keywords: Convolutional neural network, Computer science, Wavefront, Wavefront sensor, Centroid, Pixel, Adaptive optics, Artificial intelligence, Frame rate, Computer vision, Physics, Optics
Authors
Mitchell Grose, Jason D. Schmidt, Keigo Hirakawa
Source
Journal: Applied Optics (The Optical Society)
Date: 2024-04-01
Volume/Issue: 63 (16): E35-E35
Cited by: 5
Abstract
Shack-Hartmann wavefront sensing is a technique for measuring wavefront aberrations, and its use in adaptive optics relies on fast position tracking of an array of spots. These sensors conventionally use frame-based cameras operating at a fixed sampling rate to report pixel intensities, even though only a fraction of the pixels carry signal. Prior in-lab experiments have shown the feasibility of event-based cameras for Shack-Hartmann wavefront sensing (SHWFS), asynchronously reporting the spot locations as log-intensity changes on a microsecond time scale. In our work, we propose a convolutional neural network (CNN), called the event-based wavefront network (EBWFNet), that achieves highly accurate estimation of spot centroid positions in real time. We developed custom Shack-Hartmann wavefront sensing hardware with a common aperture for synchronized frame- and event-based cameras, so that spot centroid locations computed from the frame-based camera can be used to train and test the event-CNN-based centroid position estimation method in an unsupervised manner. Field testing with this hardware allows us to conclude that the proposed EBWFNet achieves sub-pixel accuracy in real-world scenarios, with substantial improvement over the state-of-the-art event-based SHWFS. An ablation study reveals the impact of data processing, CNN components, and the training cost function, and an unoptimized MATLAB implementation is shown to run faster than 800 Hz on a single GPU.
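The abstract refers to spot centroid locations computed from the frame-based camera, which serve as reference data for the event-CNN. As a point of reference only, the sketch below illustrates conventional center-of-mass centroiding over a grid of Shack-Hartmann subapertures; the function name `subaperture_centroids`, the 16x16 lenslet grid, and the thresholding scheme are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def subaperture_centroids(frame, grid=(16, 16), threshold=0.1):
    """Center-of-mass spot centroids for a Shack-Hartmann frame (illustrative sketch).

    frame     : 2-D array of pixel intensities from a frame-based camera
    grid      : (rows, cols) of lenslet subapertures (assumed layout)
    threshold : fraction of each subaperture's peak intensity kept as signal

    Returns an array of shape (rows, cols, 2) holding (y, x) centroids in
    full-frame pixel coordinates; NaN where a subaperture has no usable signal.
    """
    H, W = frame.shape
    rows, cols = grid
    sh, sw = H // rows, W // cols            # subaperture size in pixels
    out = np.full((rows, cols, 2), np.nan)

    for r in range(rows):
        for c in range(cols):
            sub = frame[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw].astype(float)
            sub = sub - sub.min()             # crude background removal
            sub[sub < threshold * sub.max()] = 0.0
            total = sub.sum()
            if total <= 0:
                continue                      # no signal in this subaperture
            ys, xs = np.mgrid[0:sh, 0:sw]
            out[r, c, 0] = (ys * sub).sum() / total + r * sh
            out[r, c, 1] = (xs * sub).sum() / total + c * sw
    return out

if __name__ == "__main__":
    # Synthetic example: a noise frame, just to show the output shape.
    rng = np.random.default_rng(0)
    frame = rng.poisson(2.0, size=(256, 256)).astype(float)
    print(subaperture_centroids(frame).shape)  # (16, 16, 2)
```

In the paper's pipeline, centroids of this kind from the frame-based camera supply the training/test targets, while the event camera's asynchronous log-intensity changes are processed by EBWFNet to estimate the same centroid positions at much higher rates.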