On the model attractor in the high-dimensional neural network dynamics of reservoir computing
Abstract
The theory of embedding and generalized synchronization in reservoir computing has recently been developed. Under ideal conditions, reservoir computing exhibits generalized synchronization during the learning process. These insights form a rigorous basis for understanding the ability of reservoir computing to reconstruct and predict complex dynamics. In this study, we clarify the dynamical-system structures of generalized synchronization and embedding by comparing the Lyapunov exponents of the high-dimensional neural network in the reservoir computing model with those of the actual system. Furthermore, we numerically calculate the Lyapunov exponents restricted to the tangent space of the inertial manifold of the high-dimensional neural network. Our results demonstrate that all Lyapunov exponents of the actual dynamics, including the negative ones, are successfully identified.
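As an illustrative sketch of the kind of computation the abstract describes, the following code estimates the leading Lyapunov exponents of an input-driven reservoir via the standard QR (Benettin-style) method. The update rule `r_{t+1} = tanh(W r_t + W_in u_t)`, the network size, and all parameter values are assumptions for illustration, not the paper's actual model; the paper's restriction to the tangent space of the inertial manifold is not reproduced here.

```python
import numpy as np

def reservoir_lyapunov(W, W_in, inputs, n_exp=3, discard=100):
    """Estimate the n_exp leading (conditional) Lyapunov exponents of the
    reservoir map r_{t+1} = tanh(W r_t + W_in u_t) along an input-driven
    trajectory, using repeated QR re-orthonormalization of tangent vectors."""
    N = W.shape[0]
    r = np.zeros(N)
    # orthonormal basis of the tangent subspace being tracked
    Q = np.linalg.qr(np.random.default_rng(0).standard_normal((N, n_exp)))[0]
    sums = np.zeros(n_exp)
    count = 0
    for t, u in enumerate(inputs):
        r = np.tanh(W @ r + W_in @ u)
        # Jacobian of the update with respect to the reservoir state r
        J = (1.0 - r**2)[:, None] * W
        Q, R = np.linalg.qr(J @ Q)
        if t >= discard:  # skip the transient before accumulating
            sums += np.log(np.abs(np.diag(R)))
            count += 1
    return sums / count

# Drive a small contracting reservoir (spectral radius well below 1,
# so the echo state property holds) with a sinusoidal input.
rng = np.random.default_rng(1)
N = 50
W = 0.5 * rng.standard_normal((N, N)) / np.sqrt(N)
W_in = rng.standard_normal((N, 1))
u = np.sin(0.1 * np.arange(2000))[:, None]
exps = reservoir_lyapunov(W, W_in, u)
```

With a contracting reservoir, all conditional Lyapunov exponents returned here are negative, consistent with generalized synchronization between the drive signal and the reservoir state.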