BEDS: Bayesian Emergent Dissipative Structures: A Formal Framework for Continuous Inference Under Energy Constraints
- URL: http://arxiv.org/abs/2601.02329v2
- Date: Wed, 07 Jan 2026 08:44:10 GMT
- Title: BEDS: Bayesian Emergent Dissipative Structures: A Formal Framework for Continuous Inference Under Energy Constraints
- Authors: Laurent Caraffa
- Abstract summary: We introduce BEDS, a formal framework for analyzing inference systems that must maintain beliefs continuously under energy constraints. We prove a central result linking energy, precision, and dissipation. We propose the Gödel-Landauer-Prigogine conjecture, suggesting that closure pathologies across formal systems, computation, and thermodynamics share a common structure.
- Score: 0.6345523830122167
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce BEDS (Bayesian Emergent Dissipative Structures), a formal framework for analyzing inference systems that must maintain beliefs continuously under energy constraints. Unlike classical computational models that assume perfect memory and focus on one-shot computation, BEDS explicitly incorporates dissipation (information loss over time) as a fundamental constraint. We prove a central result linking energy, precision, and dissipation: maintaining a belief with precision $\tau$ against dissipation rate $\gamma$ requires power $P \geq \gamma k_{\rm B} T / 2$, with scaling $P \propto \gamma \cdot \tau$. This establishes a fundamental thermodynamic cost for continuous inference. We define three classes of problems -- BEDS-attainable, BEDS-maintainable, and BEDS-crystallizable -- and show these are distinct from classical decidability. We propose the Gödel-Landauer-Prigogine conjecture, suggesting that closure pathologies across formal systems, computation, and thermodynamics share a common structure.
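For a sense of scale, the short sketch below evaluates the stated power floor $P \geq \gamma k_{\rm B} T / 2$ for illustrative values of the dissipation rate and temperature, alongside the Landauer energy per bit $k_{\rm B} T \ln 2$. This is a minimal illustration assuming only the bound as quoted above; the numerical values and helper names are not taken from the paper, and since the abstract does not give the constant in $P \propto \gamma \cdot \tau$, only the product $\gamma \cdot \tau$ is reported.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def maintenance_power_floor(gamma: float, temperature: float) -> float:
    """Lower bound on the power (W) needed to hold a belief against a
    dissipation rate gamma (1/s), per the quoted bound P >= gamma * k_B * T / 2."""
    return gamma * K_B * temperature / 2.0


def landauer_energy_per_bit(temperature: float) -> float:
    """Minimum energy (J) to erase one bit at temperature T: k_B * T * ln 2."""
    return K_B * temperature * math.log(2)


if __name__ == "__main__":
    gamma = 10.0   # dissipation rate, 1/s (illustrative)
    T = 300.0      # temperature, K (room temperature)
    tau = 1.0e3    # belief precision (illustrative; the abstract says P scales with gamma * tau)

    print(f"Power floor P_min       : {maintenance_power_floor(gamma, T):.3e} W")
    print(f"Landauer energy per bit : {landauer_energy_per_bit(T):.3e} J")
    print(f"Scaling term gamma*tau  : {gamma * tau:.3e}")
```

At 300 K and $\gamma = 10\,\mathrm{s}^{-1}$ the floor is on the order of $10^{-20}$ W, which makes the abstract's point concrete: the cost per belief is tiny, but it grows linearly with the dissipation rate and with the precision maintained.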
Related papers
- Thermodynamic Limits of Physical Intelligence [0.3580891736370874]
Modern AI systems achieve remarkable capabilities at the cost of substantial energy consumption. We propose two bits-per-joule metrics under explicit accounting conventions to connect intelligence to physical efficiency. We show how a Landauer-scale closed-cycle benchmark for epiplexity acquisition follows as a corollary of a thermodynamic-learning inequality.
arXiv Detail & Related papers (2026-02-05T09:12:43Z) - Avoiding Premature Collapse: Adaptive Annealing for Entropy-Regularized Structural Inference [1.7523718031184992]
We identify a fundamental mechanism for this failure: Premature Mode Collapse. We propose Efficient Piecewise Hybrid Adaptive Stability Control (EPH-ASC), an adaptive scheduling algorithm that monitors the stability of the inference process.
arXiv Detail & Related papers (2026-01-30T14:47:18Z) - Dissipative Learning: A Framework for Viable Adaptive Systems [0.6345523830122167]
We introduce the BEDS (Bayesian Emergent Dissipative Structures) framework, modeling learning as the evolution of compressed belief states under dissipation constraints. A central contribution is the Optimality Theorem, showing that Fisher-Rao regularization, which measures change via information divergence rather than Euclidean distance, is the unique thermodynamically optimal regularization strategy (see the sketch after this list).
arXiv Detail & Related papers (2026-01-25T18:10:15Z) - The Procrustean Bed of Time Series: The Optimization Bias of Point-wise Loss [53.542743390809356]
This paper provides a first-principles analysis of the Expectation of Optimization Bias (EOB). Our analysis reveals a fundamental paradox: the more deterministic and structured the time series, the more severe the bias introduced by a point-wise loss function. We present a concrete solution that simultaneously achieves both principles via the DFT or DWT.
arXiv Detail & Related papers (2025-12-21T06:08:22Z) - Memory-Amortized Inference: A Topological Unification of Search, Closure, and Structure [6.0044467881527614]
We propose Memory-Amortized Inference (MAI), a formal framework that unifies learning and memory as phase transitions of a single geometric substrate. We show that cognition operates by converting high-complexity search into low-complexity lookup. This framework offers a rigorous explanation for the emergence of fast-thinking (intuition) from slow-thinking (reasoning).
arXiv Detail & Related papers (2025-11-28T16:28:24Z) - Quantum Circuit Reasoning Models: A Variational Framework for Differentiable Logical Inference [0.0]
This report introduces a novel class of reasoning architectures, termed Quantum Circuit Reasoning Models (QCRM). We show how logical rules can be encoded as unitary transformations over proposition-qubit states. We propose the Quantum Reasoning Layer (QRL) as a differentiable hybrid component for composable reasoning models applicable to scientific, biomedical, and chemical inference domains.
arXiv Detail & Related papers (2025-11-26T23:15:14Z) - Information Physics of Intelligence: Unifying Logical Depth and Entropy under Thermodynamic Constraints [7.411478588468014]
We propose a theoretical framework that treats information processing as an enabling mapping from ontological states to carrier states. We introduce a novel metric, Derivation Entropy, which quantifies the effective work required to compute a target state from a given logical depth. Our findings suggest that the minimization of Derivation Entropy is a governing principle for the evolution of both biological and artificial intelligence.
arXiv Detail & Related papers (2025-11-24T14:24:08Z) - Simulation-free Structure Learning for Stochastic Dynamics [39.468930729022546]
We present StructureFlow, a novel and principled simulation-free approach for jointly learning the structure and population dynamics of physical systems. We show that StructureFlow can learn the structure of underlying systems while simultaneously modeling their conditional population dynamics.
arXiv Detail & Related papers (2025-10-18T22:31:39Z) - Newton to Einstein: Axiom-Based Discovery via Game Design [55.30047000068118]
We propose a game design framework in which scientific inquiry is recast as a rule-evolving system. Unlike conventional ML approaches that operate within fixed assumptions, our method enables the discovery of new theoretical structures.
arXiv Detail & Related papers (2025-09-05T18:59:18Z) - ERIS: An Energy-Guided Feature Disentanglement Framework for Out-of-Distribution Time Series Classification [51.07970070817353]
An ideal time series classification (TSC) model should be able to capture invariant representations. Current methods are largely unguided, lacking the semantic direction required to isolate truly universal features. We propose an end-to-end Energy-Regularized Information for Shift-Robustness framework to enable guided and reliable feature disentanglement.
arXiv Detail & Related papers (2025-08-19T12:13:41Z) - On Context-Content Uncertainty Principle [5.234742752529437]
We develop a layered computational framework that derives operational principles from the Context-Content Uncertainty Principle (CCUP). At the base level, CCUP formalizes inference as directional entropy minimization, establishing a variational gradient that favors content-first structuring. We present formal equivalence theorems, a dependency lattice among principles, and computational simulations demonstrating the efficiency gains of CCUP-aligned inference.
arXiv Detail & Related papers (2025-06-25T17:21:19Z) - The Quantum Toll Framework: A Thermodynamic Model of Collapse and Coherence [0.0]
We present a thermodynamic rendering model in which the traditional quantum observer is reframed as a special case of a coherence-constrained interface. We show that the thermodynamic cost of observation includes not only information erasure but also the stabilization of rendered states. This model accounts for classical emergence, time asymmetry, and measurement without invoking consciousness or symbolic cognition.
arXiv Detail & Related papers (2025-05-10T04:49:27Z) - Towards a Generalized Theory of Observers [41.94295877935867]
We argue that observers serve as indispensable reference points for measurement, reference frames, and the emergence of meaning. We show how this formalism sheds new light on debates related to consciousness, quantum measurement, and computational boundaries.
arXiv Detail & Related papers (2025-04-22T19:35:55Z) - Learning Discrete Concepts in Latent Hierarchical Models [73.01229236386148]
Learning concepts from natural high-dimensional data holds potential for building human-aligned and interpretable machine learning models. We formalize concepts as discrete latent causal variables that are related via a hierarchical causal model. We substantiate our theoretical claims with synthetic data experiments.
arXiv Detail & Related papers (2024-06-01T18:01:03Z) - A Robustness Analysis of Blind Source Separation [91.3755431537592]
Blind source separation (BSS) aims to recover an unobserved signal from its mixture $X=f(S)$ under the condition that the transformation $f$ is invertible but unknown.
We present a general framework for analysing such violations and quantifying their impact on the blind recovery of $S$ from $X$.
We show that a generic BSS-solution in response to general deviations from its defining structural assumptions can be profitably analysed in the form of explicit continuity guarantees.
arXiv Detail & Related papers (2023-03-17T16:30:51Z) - New insights on the quantum-classical division in light of Collapse Models [63.942632088208505]
We argue that the division between quantum and classical behaviors is analogous to the division of thermodynamic phases.
A specific relationship between the collapse parameter $\lambda$ and the collapse length scale $r_C$ plays the role of the coexistence curve in usual thermodynamic phase diagrams.
arXiv Detail & Related papers (2022-10-19T14:51:21Z) - A Systems Theory of Transfer Learning [3.5281112495479245]
We use Mesarovician systems theory to define transfer learning as a relation on sets.
We then characterize the general nature of transfer learning as a mathematical construct.
Despite its formalism, our framework avoids the detailed mathematics of learning theory or machine learning solution methods.
arXiv Detail & Related papers (2021-07-02T17:25:42Z) - Variational Causal Networks: Approximate Bayesian Inference over Causal
Structures [132.74509389517203]
We introduce a parametric variational family modelled by an autoregressive distribution over the space of discrete DAGs.
In experiments, we demonstrate that the proposed variational posterior is able to provide a good approximation of the true posterior.
arXiv Detail & Related papers (2021-06-14T17:52:49Z) - Causal Expectation-Maximisation [70.45873402967297]
We show that causal inference is NP-hard even in models characterised by polytree-shaped graphs.
We introduce the causal EM algorithm to reconstruct the uncertainty about the latent variables from data about categorical manifest variables.
We argue that there appears to be an unnoticed limitation to the trending idea that counterfactual bounds can often be computed without knowledge of the structural equations.
arXiv Detail & Related papers (2020-11-04T10:25:13Z)
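As a small numerical illustration of the contrast drawn in the Dissipative Learning entry above, the sketch below compares the Fisher-Rao distance with the plain Euclidean distance between 1-D Gaussian beliefs. The restriction to one-dimensional Gaussians and the closed-form expression used here are assumptions made for illustration; they are not claimed to be the paper's construction.

```python
import math


def fisher_rao_gauss1d(mu1: float, sigma1: float, mu2: float, sigma2: float) -> float:
    """Fisher-Rao distance between 1-D Gaussian beliefs N(mu1, sigma1^2) and
    N(mu2, sigma2^2), using the hyperbolic (upper half-plane) form of the Fisher metric."""
    arg = 1.0 + ((mu2 - mu1) ** 2 / 2.0 + (sigma2 - sigma1) ** 2) / (2.0 * sigma1 * sigma2)
    return math.sqrt(2.0) * math.acosh(arg)


def euclidean_params(mu1: float, sigma1: float, mu2: float, sigma2: float) -> float:
    """Naive Euclidean distance in (mu, sigma) parameter space, for contrast."""
    return math.hypot(mu2 - mu1, sigma2 - sigma1)


if __name__ == "__main__":
    # The same mean shift (0 -> 0.5) carries very different amounts of information
    # depending on how sharp the current belief is ...
    print(fisher_rao_gauss1d(0.0, 1.0, 0.5, 1.0))   # broad belief: ~0.50
    print(fisher_rao_gauss1d(0.0, 0.1, 0.5, 0.1))   # sharp belief: ~3.77
    # ... while the Euclidean view treats both shifts as identical (0.5).
    print(euclidean_params(0.0, 1.0, 0.5, 1.0), euclidean_params(0.0, 0.1, 0.5, 0.1))
```

Measuring change this way penalizes updates to sharp (high-precision) beliefs more heavily than updates to diffuse ones, which is consistent with the energy-precision link stated in the main abstract.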