An exactly solvable ansatz for statistical mechanics models
- URL: http://arxiv.org/abs/2010.07423v1
- Date: Wed, 14 Oct 2020 22:30:22 GMT
- Title: An exactly solvable ansatz for statistical mechanics models
- Authors: Isaac H. Kim
- Abstract summary: We propose a family of "exactly solvable" probability distributions to approximate partition functions of two-dimensional statistical mechanics models.
The distributions lie strictly outside the mean-field framework, but their free energies can be computed in a time that scales linearly with the system size.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a family of "exactly solvable" probability distributions to
approximate partition functions of two-dimensional statistical mechanics
models. While these distributions lie strictly outside the mean-field
framework, their free energies can be computed in a time that scales linearly
with the system size. This construction is based on a simple but nontrivial
solution to the marginal problem. We formulate two non-linear constraints on
the set of locally consistent marginal probabilities that simultaneously (i)
ensure the existence of a consistent global probability distribution and (ii)
lead to an exact expression for the maximum global entropy.
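As context for the linear-time free-energy claim, below is a minimal sketch of the kind of computation involved: evaluating a variational free energy in time linear in the system size from purely local (single-site) marginals, using the standard naive mean-field product ansatz for a 2D Ising model. This baseline is NOT the ansatz of the paper, which lies strictly outside the mean-field framework; the function name, coupling layout, and toy parameters are illustrative assumptions.

```python
import numpy as np

def mean_field_free_energy(h, J_right, J_down, beta, n_iter=200, seed=0):
    """Naive mean-field variational free energy of a 2D Ising model with
    periodic boundaries.

    Baseline illustration only: a free energy computed in linear time from
    single-site marginals. This is NOT the ansatz of the paper above.
    h, J_right, J_down: (L, L) arrays of local fields and couplings to the
    right/down neighbors.
    """
    rng = np.random.default_rng(seed)
    m = 0.01 * rng.standard_normal(h.shape)  # magnetizations <s_i>, the local marginals
    for _ in range(n_iter):
        # self-consistent effective field on each site from the current marginals
        h_eff = (h
                 + J_right * np.roll(m, -1, axis=1) + np.roll(J_right * m, 1, axis=1)
                 + J_down * np.roll(m, -1, axis=0) + np.roll(J_down * m, 1, axis=0))
        m = np.tanh(beta * h_eff)

    # variational energy <E> under the product (mean-field) distribution
    energy = -(np.sum(J_right * m * np.roll(m, -1, axis=1))
               + np.sum(J_down * m * np.roll(m, -1, axis=0))
               + np.sum(h * m))
    # entropy of independent +/-1 spins with magnetizations m
    p = (1.0 + m) / 2.0
    eps = 1e-12
    entropy = -np.sum(p * np.log(p + eps) + (1.0 - p) * np.log(1.0 - p + eps))
    return energy - entropy / beta  # F_var = <E> - T*S, with k_B = 1

if __name__ == "__main__":
    L = 32
    J = np.ones((L, L))   # uniform ferromagnetic couplings
    h = np.zeros((L, L))  # zero external field
    print(mean_field_free_energy(h, J, J, beta=0.3))
```

The construction described in the abstract keeps this linear-time evaluability while encoding correlations that no such product distribution can represent.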
Related papers
- End-to-End Probabilistic Framework for Learning with Hard Constraints [47.10876360975842]
ProbHardE2E learns systems that incorporate operational/physical constraints as hard requirements. It enforces hard constraints by exploiting variance information in a novel way, and it can incorporate a range of non-linear constraints, increasing its modeling power and flexibility.
arXiv Detail & Related papers (2025-06-08T05:29:50Z) - Exact Recovery Guarantees for Parameterized Nonlinear System Identification Problem under Sparse Disturbances or Semi-Oblivious Attacks [16.705631360131886]
We study the problem of learning a nonlinear dynamical system by parameterizing its dynamics using basis functions.
We show that finite-time exact recovery is achieved with high probability, even when $p$ approaches 1.
arXiv Detail & Related papers (2024-08-30T22:12:57Z) - Indeterminate Probability Theory [18.320645632562663]
This paper proposes Indeterminate Probability Theory (IPT), an observer-centered framework in which experimental outcomes are represented as distributions combining ground truth with observation error. IPT is consistent with classical probability theory and subsumes the frequentist equation in the limit of vanishing observation error.
arXiv Detail & Related papers (2023-03-21T01:57:40Z) - A Short and General Duality Proof for Wasserstein Distributionally Robust Optimization [11.034091190797671]
We present a general duality result for Wasserstein distributionally robust optimization that holds for any Kantorovich transport cost, measurable loss function, and nominal probability distribution.
We demonstrate that the interchangeability principle holds if and only if certain measurable projection and weak measurable selection conditions are satisfied.
arXiv Detail & Related papers (2022-04-30T22:49:01Z) - Categorical Distributions of Maximum Entropy under Marginal Constraints [0.0]
The estimation of categorical distributions under marginal constraints is key for many machine-learning and data-driven approaches.
We provide a parameter-agnostic theoretical framework ensuring that a categorical distribution of Maximum Entropy under marginal constraints always exists (see the short sketch after this list).
arXiv Detail & Related papers (2022-04-07T12:42:58Z) - Probabilistic learning inference of boundary value problem with
uncertainties based on Kullback-Leibler divergence under implicit constraints [0.0]
We present a general methodology of probabilistic learning inference that allows estimating a posterior probability model for a boundary value problem from a prior probability model.
A statistical surrogate model of the implicit mapping, which represents the constraints, is introduced.
In the second part, an application is presented that illustrates the proposed theory and, as such, also contributes to the three-dimensional homogenization of heterogeneous linear elastic media.
arXiv Detail & Related papers (2022-02-10T16:00:10Z) - Non-Linear Spectral Dimensionality Reduction Under Uncertainty [107.01839211235583]
We propose a new dimensionality reduction framework, called NGEU, which leverages uncertainty information and directly extends several traditional approaches.
We show that the proposed NGEU formulation exhibits a global closed-form solution, and we analyze, based on the Rademacher complexity, how the underlying uncertainties theoretically affect the generalization ability of the framework.
arXiv Detail & Related papers (2022-02-09T19:01:33Z) - Robust Estimation for Nonparametric Families via Generative Adversarial
Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends these to robust mean estimation, second-moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z) - Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization [106.70006655990176]
Distributional optimization problems arise widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective function satisfies a functional version of the Polyak-Lojasiewicz (PL) condition (Polyak, 1963) and smoothness conditions, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z) - General stochastic separation theorems with optimal bounds [68.8204255655161]
The phenomenon of separability was revealed and used in machine learning to correct errors of Artificial Intelligence (AI) systems and to analyze AI instabilities.
Errors or clusters of errors can be separated from the rest of the data.
The ability to correct an AI system also opens up the possibility of an attack on it, and the high dimensionality induces vulnerabilities caused by the same separability.
arXiv Detail & Related papers (2020-10-11T13:12:41Z) - Robust Finite-State Controllers for Uncertain POMDPs [25.377873201375515]
Uncertain partially observable Markov decision processes (uPOMDPs) allow the probabilistic transition and observation functions of standard POMDPs to belong to an uncertainty set.
We develop an algorithm to compute finite-memory policies for uPOMDPs.
arXiv Detail & Related papers (2020-09-24T02:58:50Z) - Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
We exploit the structure of the problem to obtain a convex-concave saddle-point reformulation.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
arXiv Detail & Related papers (2020-06-11T19:40:38Z) - The empirical duality gap of constrained statistical learning [115.23598260228587]
We study constrained statistical learning problems, the unconstrained versions of which are at the core of virtually all modern information processing.
We propose to tackle the constrained statistical problem, overcoming its infinite dimensionality, unknown distributions, and constraints by leveraging finite-dimensional parameterizations, sample averages, and duality theory.
We demonstrate the effectiveness and usefulness of this constrained formulation in a fair learning application.
arXiv Detail & Related papers (2020-02-12T19:12:29Z) - Gaussian Variational State Estimation for Nonlinear State-Space Models [0.3222802562733786]
We consider the problem of state estimation, in the context of both filtering and smoothing, for nonlinear state-space models.
We develop an assumed Gaussian solution based on variational inference, which offers the key advantage of a flexible, but principled, mechanism for approximating the required distributions.
arXiv Detail & Related papers (2020-02-07T04:46:14Z)
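As a toy illustration of the maximum-entropy-under-marginal-constraints theme shared by the paper above and the categorical-distribution entry in this list (this is the short sketch referenced there): for two categorical variables with fixed single-variable marginals, the maximum-entropy joint consistent with those marginals is simply the product of the marginals. The numbers below are arbitrary examples, not taken from any of the listed papers.

```python
import numpy as np

def entropy(P):
    """Shannon entropy (nats) of a probability table."""
    P = P[P > 0]
    return -np.sum(P * np.log(P))

p = np.array([0.5, 0.3, 0.2])   # marginal of variable X
q = np.array([0.6, 0.4])        # marginal of variable Y

joint_maxent = np.outer(p, q)   # product joint: the maximum-entropy solution
assert np.allclose(joint_maxent.sum(axis=1), p)
assert np.allclose(joint_maxent.sum(axis=0), q)

# Any correlated joint with the same marginals has strictly lower entropy.
perturbation = 0.05 * np.array([[1.0, -1.0], [-1.0, 1.0], [0.0, 0.0]])
joint_correlated = joint_maxent + perturbation  # rows/columns still sum to p and q
assert np.allclose(joint_correlated.sum(axis=1), p)
assert np.allclose(joint_correlated.sum(axis=0), q)

print(entropy(joint_maxent), entropy(joint_correlated))  # first value is larger
```

The marginal problem addressed by the paper above is the harder, many-variable version of this question: which collections of local marginals admit a consistent global distribution, and what is that distribution's maximum entropy.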
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.