Hybrid between biologically inspired and quantum-inspired many-body states
Abstract
Deep neural networks can represent very different sorts of functions, including complex quantum many-body states. Tensor networks can also represent these states; they have more structure and are easier to optimize. However, they can be prohibitively costly computationally in two or higher dimensions. Here, we propose a generalization of the perceptron, the perceptrain, which borrows features from the two different formalisms. We construct a variational many-body ansatz from a simple network of perceptrains. The network can be thought of as a neural network with a few distinct features inherited from tensor networks. These include efficient local optimization akin to the density matrix renormalization group (DMRG) algorithm, instead of optimizing all the parameters at once; the possibility to dynamically increase the number of parameters during the optimization; the possibility to compress the state to avoid overfitting; and a structure that remains quantum-inspired. We showcase the ansatz using a combination of Variational Monte Carlo (VMC) and Green's Function Monte Carlo (GFMC) on a $10\times 10$ transverse-field quantum Ising model with a long-range $1/r^6$ antiferromagnetic interaction. This model corresponds to the Rydberg cold-atom platform proposed for quantum annealing. We consistently find a very high relative accuracy for the ground-state energy, around $10^{-5}$ for VMC and $10^{-6}$ for GFMC, in all regimes of parameters, including in the vicinity of the quantum phase transition. We used very small perceptrain ranks ($\sim 2$--$5$), as opposed to the multiples of a thousand used in matrix product states. The optimization of the energy was robust with respect to the choice of initial conditions and hyper-parameters, in contrast to common experience with neural-network wave functions.
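For concreteness, the model described in the abstract can be written in a standard spin-$1/2$ form; the symbols $\Gamma$ (transverse field) and $J>0$ (antiferromagnetic coupling) are our labels here, and the paper's own notation may differ:
\[
H \;=\; -\Gamma \sum_{i} \sigma_i^x \;+\; J \sum_{i<j} \frac{\sigma_i^z \sigma_j^z}{r_{ij}^6},
\qquad J > 0,
\]
where $r_{ij}$ is the distance between sites $i$ and $j$ on the $10\times 10$ square lattice, and the positive sign of the $1/r_{ij}^6$ term encodes the antiferromagnetic interaction realized on the Rydberg cold-atom platform.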