Wavefront Curvature and Transverse Atomic Motion in Time-Resolved Atom Interferometry: Impact and Mitigation
Abstract
Time-resolved atom interferometry, as employed in applications such as gravitational wave detection and searches for ultra-light dark matter, requires precise control over systematic effects. In this work, we investigate phase noise arising from shot-to-shot fluctuations in the atoms' transverse motion in the presence of the wavefront curvature of the interferometer beam, and analyse its dependence on the laser-beam geometry in long-baseline, large-momentum-transfer atom interferometers. We use a semi-classical framework to derive analytical expressions for the effective phase perturbation in position-averaged measurements and validate them using Monte Carlo simulations. Applied to 100-m and 1-km atom gradiometers representative of next-generation experiments, the model shows that configurations maximising pulse efficiency also amplify curvature-induced phase noise, requiring micron-level control of the atom cloud's centre-of-mass position and sub-micron-per-second control of its centre-of-mass velocity to achieve sub-$10^{-5}$ rad phase stability. Alternative beam geometries can suppress this noise by up to two orders of magnitude, but at the cost of reduced pulse efficiency. To address this limitation, we propose a mitigation strategy based on position-resolved phase-shift readout, which empirically learns and corrects the wavefront-induced bias from measurable quantities such as the phase-shift gradient and the final cloud position. This approach restores high-sensitivity operation in the maximum-pulse-efficiency configuration without detailed beam characterisation, providing a practical route towards next-generation, time-resolved atom interferometers operating at the $10^{-5}$ rad noise level.
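For orientation only, the scale of the effect can be illustrated with a simplified, textbook-level estimate; the assumptions made here (a single beam with constant wavefront radius of curvature $R$, a three-pulse Mach-Zehnder sequence with pulse separation $T$, and a single transverse dimension) are ours for illustration and do not reproduce the geometry-dependent, large-momentum-transfer analysis of this work. An atom at transverse distance $r$ from the beam axis samples the curvature phase $k_{\mathrm{eff}}\,r^2/(2R)$, so with transverse positions $r_i = r_0 + v\,t_i$ at the three pulse times $t_i = 0, T, 2T$ the interferometer phase combination reads
\[
\delta\phi \;=\; \frac{k_{\mathrm{eff}}}{2R}\left(r_1^2 - 2\,r_2^2 + r_3^2\right) \;=\; \frac{k_{\mathrm{eff}}\, v^2 T^2}{R}.
\]
Averaging over a cloud with centre-of-mass transverse velocity $v_0$ and velocity spread $\sigma_v$ gives $\langle\delta\phi\rangle = k_{\mathrm{eff}} T^2 \left(v_0^2 + \sigma_v^2\right)/R$, so a shot-to-shot fluctuation $\delta v_0$ of the centre-of-mass velocity enters the phase at roughly $2\,k_{\mathrm{eff}} T^2 v_0\,\delta v_0 / R$. In this oversimplified constant-curvature case the initial position $r_0$ cancels; the position sensitivity quoted above arises once the wavefront curvature varies along the baseline, which is why the full geometry-dependent treatment is required.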