Fractional-Order Nesterov Dynamics for Convex Optimization
Abstract
We propose and analyze a class of second-order dynamical systems for continuous-time optimization that incorporate fractional-order gradient terms. The system is given by \begin{equation} \ddot{x}(t) + \frac{\alpha}{t}\dot{x}(t) + \nabla^{\theta} f(x(t)) = 0, \end{equation} where $\theta \in (1,2)$ and the fractional gradient $\nabla^{\theta}$ is interpreted in the sense of the Caputo, Riemann--Liouville, or Gr\"unwald--Letnikov derivative. This formulation couples the memory effects of fractional dynamics with higher-order damping mechanisms, thereby extending the classical Nesterov accelerated flow into the fractional domain. A particular focus of our analysis is the regime $\alpha \leq 3$, and especially the critical case $\alpha = 3$, where the classical Nesterov flow fails to guarantee convergence of trajectories. We show that in the fractional setting convergence can still be established: the fractional gradient terms provide a stabilizing effect that compensates for the borderline damping, demonstrating that fractional dynamics can overcome a fundamental limitation of classical second-order flows. We develop a convergence analysis framework for such systems by introducing fractional Opial-type lemmas and Lyapunov memory functionals. In the convex case, we establish weak convergence of the trajectories to a minimizer, as well as asymptotic decay of the function values. For strongly convex functions, we obtain explicit convergence rates that improve upon those of standard second-order flows.
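As a concrete illustration of the dynamics (a minimal numerical sketch, not the paper's analysis), the following Python snippet integrates the flow for the one-dimensional quadratic $f(x) = x^2/2$, for which the Caputo fractional derivative of order $\theta \in (1,2)$ with base point $0$ gives the closed form $\nabla^{\theta} f(x) = \operatorname{sign}(x)\,|x|^{2-\theta}/\Gamma(3-\theta)$ for $x > 0$, extended here to $x < 0$ by odd symmetry. The semi-implicit Euler scheme, the start time $t_0 = 1$ (avoiding the $1/t$ singularity at $t = 0$), and all parameter values are illustrative assumptions.

\begin{verbatim}
import numpy as np
from math import gamma

def frac_grad(x, theta):
    # Caputo fractional gradient of f(x) = x**2 / 2 (base point 0),
    # extended to x < 0 by odd symmetry; theta = 1 recovers f'(x) = x.
    return np.sign(x) * np.abs(x) ** (2.0 - theta) / gamma(3.0 - theta)

def simulate(theta=1.5, alpha=3.0, x0=2.0, t0=1.0, h=1e-3, T=50.0):
    # Semi-implicit Euler for  x'' + (alpha/t) x' + frac_grad(x) = 0:
    # update the velocity first, then the position with the new velocity.
    n = int((T - t0) / h)
    x, v, t = x0, 0.0, t0
    traj = np.empty(n)
    for k in range(n):
        v += h * (-(alpha / t) * v - frac_grad(x, theta))
        x += h * v
        t += h
        traj[k] = x
    return traj

if __name__ == "__main__":
    traj = simulate()                  # critical damping alpha = 3
    print("final iterate:", traj[-1])  # approaches the minimizer x* = 0
\end{verbatim}

Since $\Gamma(2) = 1$, setting $\theta = 1$ reduces \texttt{frac\_grad} to the ordinary gradient $x$, so the same script reproduces the classical critically damped Nesterov flow, which serves as a consistency check on the discretization.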