Mixed Fractional Information: Consistency of Dissipation Measures for Stable Laws
Abstract
Symmetric alpha-stable (S alpha S) distributions with alpha < 2 lack finite classical Fisher information. Building on Johnson's framework, we define Mixed Fractional Information (MFI) via the initial rate of relative entropy dissipation during interpolation between S alpha S laws with differing scales v and s. We present two equivalent formulations of MFI in this S alpha S-to-S alpha S setting. The first involves the derivative D'(v) of the relative entropy between the two S alpha S densities. The second uses the integral expectation E_gv[u(x,0) (pF_v(x) - pF_s(x))], involving the difference between the Fisher scores pF_v and pF_s and a specific MMSE-related score function u(x,0) derived from the interpolation dynamics. Our central contribution is a rigorous proof of the consistency identity D'(v) = (1/(alpha v)) E_gv[X (pF_v(X) - pF_s(X))]. This identity validates the equivalence of the two MFI formulations for S alpha S inputs, establishing MFI's internal coherence and directly linking entropy dissipation rates to score function differences. We further establish MFI's non-negativity (zero if and only if v = s), derive its closed-form expression for the Cauchy case (alpha = 1), and numerically validate the consistency identity. MFI provides a finite, coherent, and computable information-theoretic measure for comparing S alpha S distributions where classical Fisher information fails, connecting entropy dynamics to score functions and estimation concepts. This work lays a foundation for exploring potential fractional I-MMSE relations and new functional inequalities tailored to heavy-tailed systems.
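The abstract's consistency identity can be checked numerically in the Cauchy case (alpha = 1). Below is a minimal Python sketch of such a check, not the authors' validation code: it assumes the scale parametrization g_v(x) = v / (pi (x^2 + v^2)) with score pF_v(x) = -2x / (x^2 + v^2), the standard closed-form KL divergence between centered Cauchy laws D(v) = log((v+s)^2 / (4vs)) (hence D'(v) = (v-s)/(v(v+s))), and illustrative scale values v and s.

```python
# Hypothetical numerical check of D'(v) = (1/(alpha*v)) E_{g_v}[X (pF_v(X) - pF_s(X))]
# for the Cauchy case (alpha = 1), under the assumptions stated above.
import numpy as np
from scipy.integrate import quad

alpha = 1.0          # Cauchy case
v, s = 1.5, 0.7      # illustrative scales (assumed values)

def g(x, c):
    # Centered Cauchy density with scale c (assumed parametrization)
    return c / (np.pi * (x**2 + c**2))

def score(x, c):
    # Fisher score d/dx log g_c(x) for the Cauchy density above
    return -2.0 * x / (x**2 + c**2)

def kl(v_, s_):
    # Relative entropy D(g_v || g_s) by numerical quadrature
    integrand = lambda x: g(x, v_) * np.log(g(x, v_) / g(x, s_))
    return quad(integrand, -np.inf, np.inf)[0]

# Left-hand side: D'(v) via central finite differences of the KL divergence
h = 1e-4
lhs = (kl(v + h, s) - kl(v - h, s)) / (2.0 * h)

# Right-hand side: (1/(alpha*v)) * E_{g_v}[ X (pF_v(X) - pF_s(X)) ]
integrand = lambda x: g(x, v) * x * (score(x, v) - score(x, s))
rhs = quad(integrand, -np.inf, np.inf)[0] / (alpha * v)

# Closed form for the Cauchy case: D'(v) = (v - s) / (v (v + s))
closed = (v - s) / (v * (v + s))

print(f"LHS  D'(v)              = {lhs:.8f}")
print(f"RHS  score expectation  = {rhs:.8f}")
print(f"Closed form (alpha = 1) = {closed:.8f}")
```

With the assumed parametrization, all three quantities agree to quadrature and finite-difference accuracy, illustrating the kind of numerical validation the abstract refers to.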