Generative sampling with physics-informed kernels
Abstract
We construct a generative network for Monte-Carlo sampling in lattice field theories and beyond, in which the layerwise propagation is learned and optimised independently on each layer. The architecture builds on physics-informed renormalisation group flows: the propagation step from one layer to the next is encoded in a simple first-order partial differential equation for the renormalisation group kernel of that layer. The generative task thus reduces to solving, once, a set of independent linear differential equations for the kernels of the transformation. Since these equations are known analytically, the kernels can be refined iteratively. This allows us to tackle structurally the out-of-domain problems commonly encountered in generative models and opens the path to further optimisation. We demonstrate the practical feasibility of the architecture with simulations in scalar field theories.
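To make the layerwise construction concrete, the following Python sketch propagates a free scalar field on a one-dimensional periodic lattice through a sequence of layers whose kernels solve a linear first-order flow equation analytically. This is a minimal illustration only: the heat-kernel-type flow dK/dt = -(p^2 + m^2) K, the Ornstein-Uhlenbeck layer update, and all parameters (N, m2, dt, n_layers) are illustrative assumptions and not the renormalisation group kernel equation of the paper.

```python
import numpy as np

# Hypothetical setup: free scalar field on a 1d periodic lattice.
N = 64                                  # lattice sites (illustrative)
m2 = 0.5                                # mass squared (illustrative)
dt = 0.05                               # flow time per layer
n_layers = 200                          # number of layers (flow steps)

# Lattice momenta and dispersion omega(p) = p^2 + m^2.
p2 = 2.0 - 2.0 * np.cos(2.0 * np.pi * np.fft.fftfreq(N))
omega = p2 + m2

# The linear first-order flow dK/dt = -omega K (a stand-in for the
# paper's kernel equation) is solved analytically once, giving the
# per-layer kernel K = exp(-omega * dt) for every momentum mode.
K = np.exp(-omega * dt)

rng = np.random.default_rng(0)
phi = np.zeros(N)                       # start from a trivial configuration

for _ in range(n_layers):
    # Propagate the field through one layer with the analytic kernel,
    # adding noise tuned so that the free-field distribution with
    # propagator 1/omega is the fixed point of the flow (exact
    # Ornstein-Uhlenbeck update, independent per momentum mode).
    noise = np.fft.fft(rng.standard_normal(N))
    phi_p = K * np.fft.fft(phi) + np.sqrt((1.0 - K**2) / omega) * noise
    phi = np.fft.ifft(phi_p).real

# Check: in expectation the sampled two-point function approaches the
# exact free propagator 1/omega mode by mode (one sample shown here;
# averaging over many samples sharpens the comparison).
print("sampled |phi_p|^2 / N :", (np.abs(np.fft.fft(phi)) ** 2 / N)[:4])
print("exact 1/omega         :", (1.0 / omega)[:4])
```

Because each mode evolves independently and the kernel is known in closed form, the "learning" of a layer amounts to evaluating, and if needed refining, its kernel, rather than fitting it by gradient descent, which is the structural point the abstract makes.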