Stochastic normalizing flows as non-equilibrium transformations
- URL: http://arxiv.org/abs/2201.08862v1
- Date: Fri, 21 Jan 2022 19:00:18 GMT
- Title: Stochastic normalizing flows as non-equilibrium transformations
- Authors: Michele Caselle, Elia Cellini, Alessandro Nada, Marco Panero
- Abstract summary: Normalizing flows provide a promising route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
- Score: 62.997667081978825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are a class of deep generative models that provide a
promising route to sample lattice field theories more efficiently than
conventional Monte Carlo simulations. In this work we show that the theoretical
framework of stochastic normalizing flows, in which neural-network layers are
combined with Monte Carlo updates, is the same one that underlies
out-of-equilibrium simulations based on Jarzynski's equality, which have been
recently deployed to compute free-energy differences in lattice gauge theories.
We lay out a strategy to optimize the efficiency of this extended class of
generative models and present examples of applications.
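As an illustration of the Jarzynski-based strategy the abstract alludes to, here is a minimal toy sketch (not from the paper; the Gaussian targets, protocol, and all variable names are illustrative assumptions) that estimates a free-energy difference by accumulating non-equilibrium work along an annealing protocol, with Metropolis updates playing the role of the stochastic layers:

```python
import numpy as np

rng = np.random.default_rng(0)

def U(x, lam):
    # Interpolating potential: a Gaussian whose stiffness grows from 1 to 2
    return 0.5 * (1.0 + lam) * x**2

n_walkers, n_steps = 10_000, 100
lams = np.linspace(0.0, 1.0, n_steps + 1)

x = rng.standard_normal(n_walkers)   # exact samples from pi_0 = N(0, 1)
work = np.zeros(n_walkers)

for lam_old, lam_new in zip(lams[:-1], lams[1:]):
    # Non-equilibrium work done by switching the protocol parameter
    work += U(x, lam_new) - U(x, lam_old)
    # One Metropolis update targeting pi_{lam_new} (the stochastic layer)
    prop = x + 0.5 * rng.standard_normal(n_walkers)
    acc = rng.random(n_walkers) < np.exp(U(x, lam_new) - U(prop, lam_new))
    x = np.where(acc, prop, x)

# Jarzynski's equality: exp(-dF) = <exp(-W)> over forward trajectories
dF_est = -np.log(np.mean(np.exp(-work)))
dF_exact = 0.5 * np.log(2.0)   # -ln(Z_1/Z_0) for these two Gaussians
```

With a sufficiently slow protocol the exponential average converges to the exact free-energy difference; in a stochastic normalizing flow the deterministic neural-network layers would additionally contribute their log-Jacobians to the work.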
Related papers
- Efficient Training of Energy-Based Models Using Jarzynski Equality [13.636994997309307]
Energy-based models (EBMs) are generative models inspired by statistical physics.
Computing their gradient with respect to the model parameters requires sampling from the model distribution.
Here we show how results for nonequilibrium thermodynamics based on Jarzynski equality can be used to perform this computation efficiently.
arXiv Detail & Related papers (2023-05-30T21:07:52Z)
- Locality-constrained autoregressive cum conditional normalizing flow for lattice field theory simulations [0.0]
The locality of the action integral leads to simplifications of the input domain of conditional normalizing flows.
We find that l-ACNF models outperform an equivalent normalizing flow model on the full lattice, with autocorrelation times improved by orders of magnitude.
arXiv Detail & Related papers (2023-04-04T13:55:51Z) - Distributed Bayesian Learning of Dynamic States [65.7870637855531]
The proposed algorithm performs distributed Bayesian filtering for finite-state hidden Markov models.
It can be used for sequential state estimation, as well as for modeling opinion formation over social networks under dynamic environments.
arXiv Detail & Related papers (2022-12-05T19:40:17Z) - Aspects of scaling and scalability for flow-based sampling of lattice
QCD [137.23107300589385]
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
It remains to be determined whether they can be applied to state-of-the-art lattice quantum chromodynamics calculations.
arXiv Detail & Related papers (2022-11-14T17:07:37Z) - Gauge-equivariant flow models for sampling in lattice field theories
with pseudofermions [51.52945471576731]
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z) - Learning Lattice Quantum Field Theories with Equivariant Continuous
Flows [10.124564216461858]
We propose a novel machine learning method for sampling from the high-dimensional probability distributions of Lattice Field Theories.
We test our model on the $\phi^4$ theory, showing that it systematically outperforms previously proposed flow-based methods in sampling efficiency.
arXiv Detail & Related papers (2022-07-01T09:20:05Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Flow-based sampling for multimodal distributions in lattice field theory [7.0631812650826085]
We present a set of methods to construct flow models for targets with multiple separated modes.
We demonstrate the application of these methods to modeling two-dimensional real scalar field theory.
arXiv Detail & Related papers (2021-07-01T20:22:10Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Learning CHARME models with neural networks [1.5362025549031046]
We consider a model called CHARME (Conditional Heteroscedastic Autoregressive Mixture of Experts).
As an application, we develop a learning theory for the NN-based autoregressive functions of the model.
arXiv Detail & Related papers (2020-02-08T21:51:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all of the above) and is not responsible for any consequences of its use.