Structural Theory of Information Backflow in Non-Markovian Relaxation: TC/TCL Formalism and Minimal Phase Diagrams
- URL: http://arxiv.org/abs/2602.09054v2
- Date: Wed, 11 Feb 2026 07:08:45 GMT
- Title: Structural Theory of Information Backflow in Non-Markovian Relaxation: TC/TCL Formalism and Minimal Phase Diagrams
- Authors: Koichi Nakagawa
- Abstract summary: We develop a structural theory of information backflow in minimal non-Markovian relaxation processes. The framework provides a constructive TC-to-TCL procedure for extracting effective rates and organizing memory-induced phenomena.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a structural theory of information backflow in minimal non-Markovian relaxation processes within the framework of nonequilibrium statistical mechanics. The approach is based on the time-convolution (TC) and time-convolutionless (TCL) projection-operator formalisms for reduced dynamics and on the doubling construction of non-equilibrium thermo field dynamics, which provides an embedding representation of dissipative evolution. We introduce a general backflow functional associated with a time-dependent information measure and derive generator-based sufficient conditions for the absence of backflow in terms of divisibility properties and effective relaxation rates. This allows a direct connection between memory kernels in generalized master equations and observable transient phenomena such as entropy overshoot and revival. Furthermore, we propose a decomposition of backflow into classical mixing and intrinsic contributions in the doubled representation, leading to a unified classification of transient regimes. Minimal classical and quantum two-state models are analyzed as analytically tractable examples, yielding explicit phase diagrams and recovering Mittag-Leffler-type fractional relaxation as a universal envelope of non-Markovian damping. The framework provides a constructive TC-to-TCL procedure for extracting effective rates and organizing memory-induced phenomena in a model-independent manner.
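The backflow criterion sketched in the abstract can be illustrated with a minimal, self-contained example (an illustration by this summary, not code from the paper): for a classical two-state model whose time-convolution (TC) master equation dp/dt = -∫₀ᵗ k(t-s) p(s) ds carries an exponential memory kernel k(t) = γλ e^{-λt}, the survival probability has the closed form p(t) = e^{-λt/2}[cos(μt) + (λ/2μ) sin(μt)] with μ = √(γλ - λ²/4) in the underdamped regime λ < 4γ. Intervals where the distinguishability |p(t)| grows then signal information backflow in the BLP sense. The parameter values below are arbitrary choices for illustration.

```python
import math

def survival(t, gamma=1.0, lam=1.0):
    """Closed-form solution of the two-state TC master equation with
    exponential memory kernel k(t) = gamma*lam*exp(-lam*t).
    Assumes the underdamped regime lam < 4*gamma."""
    mu = math.sqrt(gamma * lam - lam**2 / 4.0)
    return math.exp(-lam * t / 2.0) * (
        math.cos(mu * t) + lam / (2.0 * mu) * math.sin(mu * t)
    )

def backflow_intervals(ts, gamma=1.0, lam=1.0):
    """Return grid intervals on which the distinguishability d(t) = |p(t)|
    increases, i.e. where memory induces information backflow."""
    d = [abs(survival(t, gamma, lam)) for t in ts]
    return [(ts[i], ts[i + 1]) for i in range(len(ts) - 1) if d[i + 1] > d[i]]

if __name__ == "__main__":
    ts = [i * 0.01 for i in range(1000)]  # t in [0, 10)
    intervals = backflow_intervals(ts)
    # Oscillatory decay implies revivals of |p(t)|, hence nonempty backflow set.
    print("backflow detected:", len(intervals) > 0)
```

In the overdamped regime (λ ≥ 4γ) the same kernel yields monotone decay and the backflow set is empty, which is the kind of generator-level distinction the paper's sufficient conditions formalize.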
Related papers
- Thermodynamic Response Functions in Singular Bayesian Models [0.12183405753834557]
We formalize an observable algebra that quotients out non-identifiable directions, allowing structurally meaningful order parameters to be constructed in singular models. Our results suggest that thermodynamic response theory provides a natural organizing framework for interpreting complexity, predictive variability, and structural reorganization in singular Bayesian learning.
arXiv Detail & Related papers (2026-03-05T18:50:20Z) - The Entropic Signature of Class Speciation in Diffusion Models [12.212582407125089]
We show that tracking the class-conditional entropy of a latent semantic variable given a noisy state provides a reliable signature of transition regimes. We validate our method on EDM2-XS and Stable Diffusion 1.5, where class-conditional entropy consistently isolates the noise regimes critical for semantic structure formation.
arXiv Detail & Related papers (2026-02-10T10:56:46Z) - SIGMA: Scalable Spectral Insights for LLM Collapse [51.863164847253366]
We introduce SIGMA (Spectral Inequalities for Gram Matrix Analysis), a unified framework for model collapse. By deriving deterministic bounds on the Gram matrix's spectrum, SIGMA provides a mathematically grounded metric for tracking the contraction of the representation space. We demonstrate that SIGMA effectively captures the transition toward collapsed states, offering theoretical insight into the mechanics of collapse.
arXiv Detail & Related papers (2026-01-06T19:47:11Z) - HFNO: an interpretable data-driven decomposition strategy for turbulent flows [0.0]
We present a novel FNO-based architecture tailored for reduced-order modeling of turbulent fluid flows. The proposed architecture processes wavenumber bins in parallel, enabling approximation of dispersion relations and non-linear interactions. We evaluate the proposed model on a series of increasingly complex dynamical systems.
arXiv Detail & Related papers (2025-11-03T12:57:19Z) - A Free Probabilistic Framework for Denoising Diffusion Models: Entropy, Transport, and Reverse Processes [22.56299060022639]
This paper builds on Voiculescu's theory of free entropy and free Fisher information. We formulate diffusion and reverse processes governed by operator-valued dynamics. The resulting dynamics admit a gradient-flow structure in the noncommutative Wasserstein space.
arXiv Detail & Related papers (2025-10-26T18:03:54Z) - Fractal Flow: Hierarchical and Interpretable Normalizing Flow via Topic Modeling and Recursive Strategy [3.648417123399582]
We propose Fractal Flow, a novel normalizing flow architecture that enhances both expressiveness and interpretability. Experiments on MNIST, FashionMNIST, CIFAR-10, and geophysical data demonstrate that Fractal Flow achieves latent clustering, controllable generation, and superior estimation accuracy.
arXiv Detail & Related papers (2025-08-27T10:25:15Z) - Loss-Complexity Landscape and Model Structure Functions [53.92822954974537]
We develop a framework for dualizing the Kolmogorov structure function $h_x(\alpha)$. We establish a mathematical analogy between information-theoretic constructs and statistical mechanics. We explicitly prove the Legendre-Fenchel duality between the structure function and free energy.
arXiv Detail & Related papers (2025-07-17T21:31:45Z) - Information Must Flow: Recursive Bootstrapping for Information Bottleneck in Optimal Transport [5.234742752529437]
We present a unified framework that models cognition as the directed flow of information between high-entropy context and low-entropy content. Inference emerges as a cycle of bidirectional interactions, bottom-up contextual disambiguation paired with top-down content reconstruction. Building on this, we propose that language emerges as a symbolic transport system, externalizing latent content to synchronize inference cycles across individuals.
arXiv Detail & Related papers (2025-07-08T13:56:50Z) - Generalized Derangetropy Functionals for Modeling Cyclical Information Flow [11.095723123836965]
This paper introduces a framework for modeling cyclical and feedback-driven information flow through a generalized family of entropy-modulated transformations called derangetropy functionals. Unlike scalar and static entropy measures such as Shannon entropy, these functionals act directly on probability densities and provide a topographical representation of information structure across the support of the distribution.
arXiv Detail & Related papers (2025-04-20T13:09:21Z) - Temporal Model On Quantum Logic [0.0]
The framework formalizes the evolution of propositions over time using linear and branching temporal models. The hierarchical organization of memory is represented using directed acyclic graphs.
arXiv Detail & Related papers (2025-02-09T17:16:53Z) - Sequential Representation Learning via Static-Dynamic Conditional Disentanglement [58.19137637859017]
This paper explores self-supervised disentangled representation learning within sequential data, focusing on separating time-independent and time-varying factors in videos.
We propose a new model that breaks the usual independence assumption between those factors by explicitly accounting for the causal relationship between the static/dynamic variables.
Experiments show that the proposed approach outperforms previous complex state-of-the-art techniques in scenarios where the dynamics of a scene are influenced by its content.
arXiv Detail & Related papers (2024-08-10T17:04:39Z) - DA-Flow: Dual Attention Normalizing Flow for Skeleton-based Video Anomaly Detection [52.74152717667157]
We propose a lightweight module called the Dual Attention Module (DAM) for capturing cross-dimension interaction relationships in spatio-temporal skeletal data.
It employs a frame attention mechanism to identify the most significant frames and a skeleton attention mechanism to capture broader relationships across fixed partitions with minimal parameters and FLOPs.
arXiv Detail & Related papers (2024-06-05T06:18:03Z) - Loschmidt echo, emerging dual unitarity and scaling of generalized temporal entropies after quenches to the critical point [0.0]
We show how the Loschmidt echo of a product state after a quench can be predicted by using conformal field theories.<n>We are also able to predict and confirm an emerging dual-unitarity of the evolution at late times.
arXiv Detail & Related papers (2024-05-23T15:40:37Z) - A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error of overparameterized models that achieve effectively zero training error depends on the quality of the implicit regularization imposed by, e.g., the combination of model architecture and parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z) - Towards Robust and Adaptive Motion Forecasting: A Causal Representation Perspective [72.55093886515824]
We introduce a causal formalism of motion forecasting, which casts the problem as a dynamic process with three groups of latent variables.
We devise a modular architecture that factorizes the representations of invariant mechanisms and style confounders to approximate a causal graph.
Experiment results on synthetic and real datasets show that our three proposed components significantly improve the robustness and reusability of the learned motion representations.
arXiv Detail & Related papers (2021-11-29T18:59:09Z) - Generative Flows with Invertible Attentions [135.23766216657745]
We introduce two types of invertible attention mechanisms for generative flow models.
We exploit split-based attention mechanisms to learn the attention weights and input representations on every two splits of flow feature maps.
Our method provides invertible attention modules with tractable Jacobian determinants, enabling seamless integration at any position in flow-based models.
arXiv Detail & Related papers (2021-06-07T20:43:04Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of stochasticity in its success remains unclear.
We show that heavy tails commonly arise from multiplicative noise in the parameter dynamics due to minibatch variance.
A detailed analysis of key factors, including step size, batch size, and data, shows consistent behavior across state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.