A geometric model of synthetic filtrations via context-dependent time
Abstract
Classical filtrations in probability theory formalize the accumulation of information along a linear time axis: the past is unique, and the present evolves into an uncertain future. In many contexts, however, information is neither linear nor uniquely determined by a single history. In this paper we propose a geometric model of synthetic filtrations, in which the present may be formed by synthesizing multiple possible pasts. To achieve this, we introduce a new category {\Sigma}, an extension of the simplex category {\Delta}, whose objects encode context-dependent times. Synthetic filtrations are realized as contravariant functors {\Sigma} {\to} Prob, where Prob is the category of probability spaces with null-preserving maps. After giving a general definition of synthetic filtrations, we define Dirichlet filtrations as a concrete example, in which probability measures on simplices arise from Dirichlet distributions, highlighting the roles of parameter uncertainty and contextual uncertainty. Finally, we interpret Bayesian updating as a categorical update of a Dirichlet functor, showing how learning fits naturally into the synthetic filtration framework. This work combines categorical probability, simplicial geometry, and Bayesian statistics, and suggests new applications in finance, stochastic modeling, and subjective probability.
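The abstract's phrase "Bayesian updating as a categorical update of a Dirichlet functor" builds on the classical Dirichlet–categorical conjugacy. As a minimal sketch of that underlying mechanism (the function names and numbers below are illustrative, not taken from the paper), the posterior Dirichlet parameters are obtained by adding observed category counts to the prior parameters, and the Dirichlet mean gives a point on the probability simplex:

```python
import numpy as np

def dirichlet_update(alpha: np.ndarray, counts: np.ndarray) -> np.ndarray:
    """Conjugate update: posterior Dirichlet parameters after observing
    categorical counts. (Illustrative helper, not from the paper.)"""
    return alpha + counts

def dirichlet_mean(alpha: np.ndarray) -> np.ndarray:
    """Mean of Dirichlet(alpha): a point on the probability simplex."""
    return alpha / alpha.sum()

# Uniform prior on the 3-outcome simplex, then observe 10 draws: 6, 3, 1.
alpha = np.array([1.0, 1.0, 1.0])
posterior = dirichlet_update(alpha, np.array([6.0, 3.0, 1.0]))
print(posterior)                  # [7. 4. 2.]
print(dirichlet_mean(posterior))  # posterior mean = [7/13, 4/13, 2/13]
```

In the paper's framework this update would act functorially on the probability measures attached to simplices; the snippet shows only the parameter-level arithmetic that the construction is built on.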