Jack Saywell, Max Carey, Nikolaos Dedes, Ilya Kuprov, Tim Freegarde
Identifier
DOI:10.1117/12.2598991
Abstract
The sensitivity of atom interferometers depends on the fidelity of the light pulses used as beamsplitters and mirrors. Atom interferometers typically employ pulses that effect π/2 and π Rabi rotations, whose fidelities are reduced by variations in atomic velocity and laser intensity. We have previously demonstrated the application of optimal control theory to design pulses that are more robust to such errors; however, if these variations change over timescales on the order of the interferometer duration, phase shifts can be introduced into the final fringe that potentially reduce the sensitivity. In this paper, we explain why care must be taken when optimising interferometer pulse sequences to ensure that phase shifts arising from inter-pulse variations are not significantly increased. We show that these phase shifts can in fact be minimised by choosing an appropriate measure of individual pulse fidelity.
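To make the distinction between fidelity measures concrete, the sketch below (an illustrative Python example, not the authors' code) propagates a two-level atom through a square π/2 pulse in the rotating frame. A Doppler-like detuning δ, assumed here to be a small fraction of the Rabi frequency Ω, barely disturbs the 50:50 beamsplitting ratio, yet it imprints a phase error on the atomic superposition that would shift the interferometer fringe; a purely population-based fidelity measure would therefore not register it. All parameter values and names are assumptions for illustration.

```python
# Minimal sketch: population error vs superposition phase error for a
# square pi/2 pulse on a detuned two-level atom (hbar = 1 throughout).
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def pulse(rabi, detuning, duration):
    """Rotating-frame propagator of a square pulse."""
    H = 0.5 * (rabi * sx + detuning * sz)
    return expm(-1j * H * duration)

omega = 2 * np.pi * 1.0       # nominal Rabi frequency (arbitrary units)
t_half = (np.pi / 2) / omega  # duration of an on-resonance pi/2 pulse

for eps in [0.0, 0.05, 0.1]:  # detuning as a fraction of omega
    # Apply the pulse to an atom starting in the ground state |g>.
    cg, ce = pulse(omega, eps * omega, t_half) @ np.array([1, 0], dtype=complex)
    pop_err = abs(ce)**2 - 0.5                 # deviation from a 50:50 split
    phase_err = np.angle(ce / cg) + np.pi / 2  # ideal relative phase is -pi/2
    print(f"delta = {eps:.2f}*Omega: population error = {pop_err:+.1e}, "
          f"superposition phase error = {phase_err:+.3f} rad")
```

Under these assumptions, a detuning of 0.1 Ω changes the excited-state population by only parts in 10⁵ while producing a phase error of roughly 0.1 rad, which is the kind of inter-pulse-variation-induced shift the abstract argues a well-chosen pulse-fidelity measure should suppress.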