Estimation of Thermodynamic Observables in Lattice Field Theories with
Deep Generative Models
- URL: http://arxiv.org/abs/2007.07115v2
- Date: Tue, 5 Jan 2021 09:42:26 GMT
- Title: Estimation of Thermodynamic Observables in Lattice Field Theories with
Deep Generative Models
- Authors: Kim A. Nicoli, Christopher J. Anders, Lena Funcke, Tobias Hartung,
Karl Jansen, Pan Kessel, Shinichi Nakajima, Paolo Stornati
- Abstract summary: We show that generative models can be used to estimate the absolute value of the free energy.
We demonstrate the effectiveness of the proposed method for two-dimensional $\phi^4$ theory.
- Score: 4.84753320115456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we demonstrate that applying deep generative machine learning
models for lattice field theory is a promising route for solving problems where
Markov Chain Monte Carlo (MCMC) methods are problematic. More specifically, we
show that generative models can be used to estimate the absolute value of the
free energy, in contrast to existing MCMC-based methods, which are limited to
estimating free energy differences. We demonstrate the
effectiveness of the proposed method for two-dimensional $\phi^4$ theory and
compare it to MCMC-based methods in detailed numerical experiments.
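As a hedged illustration of the core idea (a minimal sketch, not the paper's implementation): if a generative model provides samples $x \sim q(x)$ with a tractable density, the partition function can be estimated by importance weighting, $Z = \mathbb{E}_q[e^{-S(x)}/q(x)]$, giving the absolute free energy $F = -\log Z$. The toy one-dimensional action and Gaussian "model" below are stand-ins for a lattice action and a trained normalizing flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "action": S(x) = x^2 / 2, so Z = sqrt(2*pi) and F = -log Z exactly.
def action(x):
    return 0.5 * x**2

# Stand-in for a trained generative model q(x) with tractable density
# (in the paper this role would be played by a normalizing flow).
sigma_q = 1.5

def sample_q(n):
    return rng.normal(0.0, sigma_q, size=n)

def log_q(x):
    return -0.5 * (x / sigma_q) ** 2 - np.log(sigma_q * np.sqrt(2 * np.pi))

# Importance-weighted estimate of Z = E_q[ exp(-S(x)) / q(x) ],
# computed stably in log space; the absolute free energy is F = -log Z.
x = sample_q(100_000)
log_w = -action(x) - log_q(x)
log_Z = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
F = -log_Z

print(F)  # close to the exact value -0.5 * log(2*pi)
```

Note that this estimator yields the absolute value of $F$, whereas MCMC gives only differences, since MCMC samples never expose the normalization of the Boltzmann weight.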
Related papers
- STANLEY: Stochastic Gradient Anisotropic Langevin Dynamics for Learning
Energy-Based Models [41.031470884141775]
We present an end-to-end learning algorithm for Energy-Based Models (EBMs).
We propose a novel high-dimensional sampling method based on an anisotropic stepsize and a gradient-informed covariance matrix.
The resulting method, STANLEY, is an optimization algorithm for training Energy-Based Models via our newly introduced MCMC method.
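As a hedged, generic illustration of the Langevin-type MCMC this entry builds on (plain unadjusted Langevin dynamics with an isotropic step, not the paper's anisotropic, gradient-informed variant):

```python
import numpy as np

rng = np.random.default_rng(1)

# Unadjusted Langevin dynamics for an energy E(x):
#   x_{t+1} = x_t - eps * grad E(x_t) + sqrt(2*eps) * noise
# For E(x) = x^2 / 2 the target is a standard normal.
def grad_energy(x):
    return x

def langevin_sample(n_steps=5000, eps=0.1):
    x = 0.0
    samples = []
    for _ in range(n_steps):
        x = x - eps * grad_energy(x) + np.sqrt(2 * eps) * rng.normal()
        samples.append(x)
    return np.array(samples)

s = langevin_sample()
tail = s[1000:]  # discard burn-in
print(tail.mean(), tail.var())  # roughly 0 and 1 for a standard normal
```

Anisotropic variants replace the scalar `eps` with a position-dependent preconditioning matrix, which is the direction the STANLEY abstract describes.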
arXiv Detail & Related papers (2023-10-19T11:55:16Z)
- Learning Energy-Based Prior Model with Diffusion-Amortized MCMC [89.95629196907082]
The common practice of learning latent-space EBMs with non-convergent short-run MCMC for prior and posterior sampling hinders further progress.
We introduce a simple but effective diffusion-based amortization method for long-run MCMC sampling and develop a novel learning algorithm for the latent space EBM based on it.
arXiv Detail & Related papers (2023-10-05T00:23:34Z)
- Energy Discrepancies: A Score-Independent Loss for Energy-Based Models [20.250792836049882]
We propose a novel loss function called Energy Discrepancy (ED) which does not rely on the computation of scores or expensive Markov chain Monte Carlo.
We show that ED approaches the explicit score matching and negative log-likelihood loss under different limits, effectively interpolating between both.
arXiv Detail & Related papers (2023-07-12T19:51:49Z)
- End-To-End Latent Variational Diffusion Models for Inverse Problems in High Energy Physics [61.44793171735013]
We introduce a novel unified architecture, termed latent variational diffusion models, which combines the latent learning of cutting-edge generative approaches with an end-to-end variational framework.
Our unified approach achieves a distribution-free distance to the truth over 20 times smaller than that of the non-latent state-of-the-art baseline.
arXiv Detail & Related papers (2023-05-17T17:43:10Z)
- Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions [51.52945471576731]
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z)
- Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z)
- Particle Dynamics for Learning EBMs [83.59335980576637]
Energy-based modeling is a promising approach to unsupervised learning, which yields many downstream applications from a single model.
The main difficulty in learning energy-based models with the "contrastive approaches" is the generation of samples from the current energy function at each iteration.
This paper proposes an alternative approach to getting these samples and avoiding crude MCMC sampling from the current model.
arXiv Detail & Related papers (2021-11-26T23:41:07Z)
- Machine Learning of Thermodynamic Observables in the Presence of Mode Collapse [5.096726017663865]
Deep generative models allow for the direct estimation of the free energy at a given point in parameter space.
In this contribution, we will review this novel machine-learning-based estimation method.
arXiv Detail & Related papers (2021-11-22T15:59:08Z)
- No MCMC for me: Amortized sampling for fast and stable training of energy-based models [62.1234885852552]
Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty.
We present a simple method for training EBMs at scale using an entropy-regularized generator to amortize the MCMC sampling.
Next, we apply our estimator to the recently proposed Joint Energy Model (JEM), where we match the original performance with faster and more stable training.
arXiv Detail & Related papers (2020-10-08T19:17:20Z)
- Demystifying Orthogonal Monte Carlo and Beyond [20.745014324028386]
Orthogonal Monte Carlo (OMC) is a very effective sampling algorithm imposing structural geometric conditions (orthogonality) on samples for variance reduction.
We shed new light on the theoretical principles behind OMC, applying theory of negatively dependent random variables to obtain several new concentration results.
We propose a novel extension of the method, called Near-Orthogonal Monte Carlo (NOMC), leveraging number theory techniques and particle algorithms.
arXiv Detail & Related papers (2020-05-27T18:44:38Z)
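As a hedged sketch of the orthogonality idea behind OMC (a generic construction, not the paper's NOMC extension): rows of a random orthogonal matrix, rescaled to chi-distributed norms, are each marginally Gaussian yet exactly pairwise orthogonal, which is the structural condition OMC imposes for variance reduction.

```python
import numpy as np

rng = np.random.default_rng(2)

def orthogonal_gaussian(d):
    """d coupled samples that are marginally N(0, I_d) but pairwise orthogonal."""
    # Random orthogonal matrix via QR decomposition of a Gaussian matrix.
    G = rng.normal(size=(d, d))
    Q, _ = np.linalg.qr(G)
    # Rescale each row to a chi-distributed norm, so rows are marginally
    # standard Gaussian vectors rather than unit vectors.
    norms = np.sqrt(rng.chisquare(d, size=d))
    return Q * norms[:, None]

X = orthogonal_gaussian(8)
# Off-diagonal Gram entries vanish: the rows are exactly orthogonal.
gram = X @ X.T
print(np.max(np.abs(gram - np.diag(np.diag(gram)))))  # ~0 up to float error
```

Averaging a test function over such coupled rows typically has lower variance than over i.i.d. Gaussian draws, which is the effect the OMC paper analyzes theoretically.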
This list is automatically generated from the titles and abstracts of the papers on this site.