Thermodynamic Response Functions in Singular Bayesian Models
- URL: http://arxiv.org/abs/2603.05480v1
- Date: Thu, 05 Mar 2026 18:50:20 GMT
- Title: Thermodynamic Response Functions in Singular Bayesian Models
- Authors: Sean Plummer
- Abstract summary: We formalize an observable algebra that quotients out non-identifiable directions, allowing structurally meaningful order parameters to be constructed in singular models. Our results suggest that thermodynamic response theory provides a natural organizing framework for interpreting complexity, predictive variability, and structural reorganization in singular Bayesian learning.
- Score: 0.12183405753834557
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Singular statistical models (including mixtures, matrix factorization, and neural networks) violate regular asymptotics due to parameter non-identifiability and degenerate Fisher geometry. Although singular learning theory characterizes marginal likelihood behavior through invariants such as the real log canonical threshold (RLCT) and the singular fluctuation, these quantities remain difficult to interpret operationally. At the same time, widely used criteria such as WAIC and WBIC appear disconnected from the underlying singular geometry. We show that posterior tempering induces a one-parameter deformation of the posterior distribution whose associated observables generate a hierarchy of thermodynamic response functions. A universal covariance identity links derivatives of tempered expectations to posterior fluctuations, placing WAIC, WBIC, and the singular fluctuation within a unified response framework. Within this framework, classical quantities from singular learning theory acquire natural thermodynamic interpretations: the RLCT governs the leading free-energy slope, the singular fluctuation corresponds to the curvature of the tempered free energy, and WAIC measures predictive fluctuation. We formalize an observable algebra that quotients out non-identifiable directions, allowing structurally meaningful order parameters to be constructed in singular models. Across canonical singular examples (including symmetric Gaussian mixtures, reduced-rank regression, and overparameterized neural networks), we empirically demonstrate phase-transition-like behavior under tempering: order parameters collapse, susceptibilities peak, and complexity measures align with structural reorganization in posterior geometry. Our results suggest that thermodynamic response theory provides a natural organizing framework for interpreting complexity, predictive variability, and structural reorganization in singular Bayesian learning.
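The two structural identities the abstract leans on can be checked numerically in a toy setting. The sketch below is a hedged illustration only, not the paper's construction: a 1-D double-well "energy" stands in for n·L_n, and the prior, observable, and grid are all assumptions. It verifies (1) the covariance identity d/dβ E_β[O] = −Cov_β(O, E) for the tempered posterior, and (2) that the curvature of the tempered log-partition function equals the posterior variance of the energy, which is the generic identity behind reading the singular fluctuation as a specific-heat-like curvature.

```python
import numpy as np

# Toy 1-D setting (illustrative assumptions, not the paper's examples).
# Tempered posterior: p_beta(w) ∝ prior(w) * exp(-beta * E(w)),
# with E(w) playing the role of n * L_n(w).
w = np.linspace(-5.0, 5.0, 20001)   # parameter grid
prior = np.exp(-w**2 / 2)           # unnormalized Gaussian prior
energy = (w**2 - 1)**2              # double-well "loss"
obs = w**2                          # an observable O(w), invariant under w -> -w

def expect(f_vals, beta):
    """Tempered-posterior expectation E_beta[f], via a discrete grid sum."""
    p = prior * np.exp(-beta * energy)
    return np.sum(f_vals * p) / np.sum(p)

def log_Z(beta):
    """Log of the (discretized) tempered partition function."""
    dw = w[1] - w[0]
    return np.log(np.sum(prior * np.exp(-beta * energy)) * dw)

beta = 1.0

# (1) Covariance identity: d/d(beta) E_beta[O] = -Cov_beta(O, E).
h = 1e-4
lhs1 = (expect(obs, beta + h) - expect(obs, beta - h)) / (2 * h)
rhs1 = -(expect(obs * energy, beta)
         - expect(obs, beta) * expect(energy, beta))

# (2) Free-energy curvature: d^2/d(beta)^2 log Z(beta) = Var_beta(E).
h2 = 1e-3
lhs2 = (log_Z(beta + h2) - 2 * log_Z(beta) + log_Z(beta - h2)) / h2**2
rhs2 = expect(energy**2, beta) - expect(energy, beta)**2

print(abs(lhs1 - rhs1), abs(lhs2 - rhs2))  # both differences are tiny
```

Because both sides of each identity are computed against the same discrete measure, the identities hold exactly up to finite-difference error, so the agreement is to many digits rather than a loose numerical coincidence.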
Related papers
- Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation [56.361076943802594]
CanonFlow achieves state-of-the-art performance on the challenging GEOM-DRUG dataset, and the advantage remains large in few-step generation.
arXiv Detail & Related papers (2026-02-16T18:58:55Z) - Implicit bias as a Gauge correction: Theory and Inverse Design [2.9379512315137117]
A central problem in machine learning theory is to characterize how learning dynamics select particular solutions compatible with the training objective. We identify a general mechanism, in terms of an explicit correction of the learning dynamics, for the emergence of implicit biases. We compute the resulting induced bias for a range of dynamics, showing how several well-known results fit into a single unified framework.
arXiv Detail & Related papers (2026-01-10T15:33:09Z) - Random-Matrix-Induced Simplicity Bias in Over-parameterized Variational Quantum Circuits [72.0643009153473]
We show that expressive variational ansätze enter a Haar-like universality class in which both observable expectation values and parameter gradients concentrate exponentially with system size. As a consequence, the hypothesis class induced by such circuits collapses with high probability to a narrow family of near-constant functions. We further show that this collapse is not unavoidable: tensor-structured VQCs, including tensor-network-based and tensor-hypernetwork parameterizations, lie outside the Haar-like universality class.
arXiv Detail & Related papers (2026-01-05T08:04:33Z) - Thermodynamic Characterizations of Singular Bayesian Models: Specific Heat, Susceptibility, and Entropy Flow in Posterior Geometry [0.12183405753834557]
We show that the singular fluctuation admits a precise and natural interpretation as a specific heat. We also introduce a collection of related thermodynamic quantities, including entropy flow, prior susceptibility, and cross-susceptibility. Our results establish a concrete bridge between singular learning theory and statistical mechanics.
arXiv Detail & Related papers (2025-12-24T20:19:28Z) - Chaos, Entanglement and Measurement: Field-Theoretic Perspectives on Quantum Information Dynamics [0.0]
I study scrambling and pseudorandomness in the Brownian Sachdev-Ye-Kitaev (SYK) model. I construct a field theory for weakly measured SYK clusters. I develop a strong-disorder renormalization group for measurement-only SYK clusters.
arXiv Detail & Related papers (2025-12-11T10:04:30Z) - Mereological Quantum Phase Transitions [0.0]
We introduce the concept of mereological quantum phase transitions (m-QPTs). We illustrate this framework through analytical examples involving quantum coherence and operator entanglement.
arXiv Detail & Related papers (2025-10-07T19:12:31Z) - Scaling and renormalization in high-dimensional regression [72.59731158970894]
We present a unifying perspective on recent results on ridge regression. We use the basic tools of random matrix theory and free probability, aimed at readers with backgrounds in physics and deep learning. Our results extend and provide a unifying perspective on earlier models of scaling laws.
arXiv Detail & Related papers (2024-05-01T15:59:00Z) - Nonparametric Partial Disentanglement via Mechanism Sparsity: Sparse Actions, Interventions and Sparse Temporal Dependencies [58.179981892921056]
This work introduces a novel principle for disentanglement we call mechanism sparsity regularization.
We propose a representation learning method that induces disentanglement by simultaneously learning the latent factors.
We show that the latent factors can be recovered by regularizing the learned causal graph to be sparse.
arXiv Detail & Related papers (2024-01-10T02:38:21Z) - DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z) - Discovering Latent Causal Variables via Mechanism Sparsity: A New Principle for Nonlinear ICA [81.4991350761909]
Independent component analysis (ICA) refers to an ensemble of methods that formalize this goal and provide estimation procedures for practical application.
We show that the latent variables can be recovered up to a permutation if one regularizes the latent mechanisms to be sparse.
arXiv Detail & Related papers (2021-07-21T14:22:14Z) - Adding machine learning within Hamiltonians: Renormalization group transformations, symmetry breaking and restoration [0.0]
We include the predictive function of a neural network, designed for phase classification, as a conjugate variable coupled to an external field within the Hamiltonian of a system.
Results show that the field can induce an order-disorder phase transition by breaking or restoring the symmetry.
We conclude by discussing how the method provides an essential step toward bridging machine learning and physics.
arXiv Detail & Related papers (2020-09-30T18:44:18Z)