Enhancing Neural Network Backflow
Abstract
Accurately describing the ground state of strongly correlated systems is essential for understanding their emergent properties. Neural Network Backflow (NNBF) is a powerful variational ansatz that enhances mean-field wave functions by introducing configuration-dependent modifications to single-particle orbitals. Although NNBF is theoretically universal in the limit of large networks, we find that practical gains saturate with increasing network size. Instead, significant improvements can be achieved by using a multi-determinant ansatz. We explore efficient ways to generate these multi-determinant expansions without increasing the number of variational parameters. In particular, we study single-step Lanczos and symmetry projection techniques, benchmarking their performance against diffusion Monte Carlo and NNBF applied to alternative mean fields. On a doped periodic square Hubbard model near optimal doping, we find that a Lanczos step, diffusion Monte Carlo, and projection onto a symmetry sector all give similar improvements, achieving state-of-the-art energies at minimal cost. By directly optimizing the symmetry-projected states, we obtain significant further gains in energy. Using this technique, we report the lowest variational energies for this Hamiltonian on $4\times 16$ and $4 \times 8$ lattices, as well as accurate variance-extrapolated energies. We also show the evolution of spin, charge, and pair correlation functions as the quality of the variational ansatz improves.