Causal Emergence of Consciousness through Learned Multiscale Neural Dynamics in Mice
Abstract
Consciousness spans macroscopic experience and microscopic neuronal activity, yet linking these scales remains challenging. Prevailing theories, such as Integrated Information Theory, focus on a single scale, overlooking how causal power and its dynamics unfold across scales. Progress has been constrained by the scarcity of cross-scale data and by the difficulty of quantifying multiscale causality and dynamics. Here, we present a machine learning framework that infers multiscale causal variables and their dynamics from near-cellular-resolution calcium imaging of the mouse dorsal cortex. At lower levels, variables primarily aggregate input-driven information, whereas at higher levels they exert causal power through metastable or saddle-point dynamics during wakefulness, which collapse into localized, stochastic dynamics under anesthesia. A one-dimensional top-level conscious variable captures the majority of causal power, yet variables at other scales also contribute substantially, giving rise to high emergent complexity in the conscious state. Together, these findings provide a multiscale causal framework linking neural activity to conscious states.