Stochastic normalizing flows as non-equilibrium transformations
- URL: http://arxiv.org/abs/2201.08862v1
- Date: Fri, 21 Jan 2022 19:00:18 GMT
- Title: Stochastic normalizing flows as non-equilibrium transformations
- Authors: Michele Caselle, Elia Cellini, Alessandro Nada, Marco Panero
- Abstract summary: We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
- Score: 62.997667081978825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are a class of deep generative models that provide a
promising route to sample lattice field theories more efficiently than
conventional Monte Carlo simulations. In this work we show that the theoretical
framework of stochastic normalizing flows, in which neural-network layers are
combined with Monte Carlo updates, is the same framework that underlies
out-of-equilibrium simulations based on Jarzynski's equality, which have been
recently deployed to compute free-energy differences in lattice gauge theories.
We lay out a strategy to optimize the efficiency of this extended class of
generative models and present examples of applications.
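The connection rests on Jarzynski's equality, $\exp(-\Delta F) = \langle \exp(-W) \rangle$, which relates a free-energy difference to the work accumulated along driven, out-of-equilibrium trajectories. A minimal illustrative sketch in Python (a toy 1D Gaussian annealing protocol, not the lattice setup of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def jarzynski_dF(n_traj=20_000, n_steps=50, s0=1.0, s1=0.5):
    """Estimate dF = -log <exp(-W)> for annealing U_t(x) = x^2 / (2 sigma_t^2)."""
    sigmas = np.linspace(s0, s1, n_steps + 1)
    x = rng.normal(0.0, s0, size=n_traj)    # start in equilibrium at sigma_0
    work = np.zeros(n_traj)
    for k in range(n_steps):
        # switch the protocol parameter: accumulate the instantaneous work
        work += x**2 / 2 * (1 / sigmas[k + 1]**2 - 1 / sigmas[k]**2)
        # partial relaxation: one Metropolis step at the new parameter
        prop = x + rng.normal(0.0, 0.3, size=n_traj)
        dU = (prop**2 - x**2) / (2 * sigmas[k + 1]**2)
        accept = rng.random(n_traj) < np.exp(-np.maximum(dU, 0.0))
        x = np.where(accept, prop, x)
    m = work.min()                          # log-mean-exp for stability
    return m - np.log(np.mean(np.exp(-(work - m))))

est = jarzynski_dF()
print(est)  # exact answer: -log(s1/s0) = log 2 ~ 0.693
```

In a stochastic normalizing flow, the deterministic "switch" step would be replaced by a learned neural-network layer, with the Metropolis updates interleaved exactly as above; the same work-based weights then correct the generated samples.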
Related papers
- Self-Boost via Optimal Retraining: An Analysis via Approximate Message Passing [58.52119063742121]
Retraining a model using its own predictions together with the original, potentially noisy labels is a well-known strategy for improving the model performance. This paper addresses the question of how to optimally combine the model's predictions and the provided labels. Our main contribution is the derivation of the Bayes optimal aggregator function to combine the current model's predictions and the given labels.
arXiv Detail & Related papers (2025-05-21T07:16:44Z) - A theoretical framework for overfitting in energy-based modeling [5.1337384597700995]
We investigate the impact of limited data on training pairwise energy-based models for inverse problems aimed at identifying interaction networks. We show that optimal points for early stopping arise from the interplay between the training timescales and the initial conditions of training. We propose a generalization to arbitrary energy-based models by deriving the neural tangent kernel dynamics of the score function under score matching.
arXiv Detail & Related papers (2025-01-31T14:21:02Z) - Stochastic normalizing flows for Effective String Theory [44.99833362998488]
Effective String Theory (EST) is a powerful tool used to study confinement in pure gauge theories.
Flow-based samplers have been applied as an efficient numerical method to study EST regularized on the lattice.
arXiv Detail & Related papers (2024-12-26T07:58:09Z) - Scaling of Stochastic Normalizing Flows in $\mathrm{SU}(3)$ lattice gauge theory [44.99833362998488]
Non-equilibrium Markov Chain Monte Carlo simulations provide a well-understood framework based on Jarzynski's equality to sample from a target probability distribution.
Out-of-equilibrium evolutions share the same framework as flow-based approaches, and they can be naturally combined into a novel architecture called Stochastic Normalizing Flows (SNFs).
We present the first implementation of SNFs for $\mathrm{SU}(3)$ lattice gauge theory in 4 dimensions, defined by introducing gauge-equivariant layers between out-of-equilibrium Monte Carlo updates.
arXiv Detail & Related papers (2024-11-29T19:01:05Z) - Efficient Training of Energy-Based Models Using Jarzynski Equality [13.636994997309307]
Energy-based models (EBMs) are generative models inspired by statistical physics.
The computation of its gradient with respect to the model parameters requires sampling the model distribution.
Here we show how results for nonequilibrium thermodynamics based on Jarzynski equality can be used to perform this computation efficiently.
arXiv Detail & Related papers (2023-05-30T21:07:52Z) - Locality-constrained autoregressive cum conditional normalizing flow for
lattice field theory simulations [0.0]
The locality of the action integral leads to simplifications in the input domain of conditional normalizing flows.
We find that the autocorrelation times of l-ACNF models outperform an equivalent normalizing flow model on the full lattice by orders of magnitude.
arXiv Detail & Related papers (2023-04-04T13:55:51Z) - Distributed Bayesian Learning of Dynamic States [65.7870637855531]
The proposed algorithm is a distributed Bayesian filtering task for finite-state hidden Markov models.
It can be used for sequential state estimation, as well as for modeling opinion formation over social networks under dynamic environments.
arXiv Detail & Related papers (2022-12-05T19:40:17Z) - Aspects of scaling and scalability for flow-based sampling of lattice
QCD [137.23107300589385]
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
It remains to be determined whether they can be applied to state-of-the-art lattice quantum chromodynamics calculations.
arXiv Detail & Related papers (2022-11-14T17:07:37Z) - Gauge-equivariant flow models for sampling in lattice field theories
with pseudofermions [51.52945471576731]
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z) - Learning Lattice Quantum Field Theories with Equivariant Continuous
Flows [10.124564216461858]
We propose a novel machine learning method for sampling from the high-dimensional probability distributions of Lattice Field Theories.
We test our model on the $\phi^4$ theory, showing that it systematically outperforms previously proposed flow-based methods in sampling efficiency.
arXiv Detail & Related papers (2022-07-01T09:20:05Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Flow-based sampling for multimodal distributions in lattice field theory [7.0631812650826085]
We present a set of methods to construct flow models for targets with multiple separated modes.
We demonstrate the application of these methods to modeling two-dimensional real scalar field theory.
arXiv Detail & Related papers (2021-07-01T20:22:10Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Learning CHARME models with neural networks [1.5362025549031046]
We consider a model called CHARME (Conditional Heteroscedastic Autoregressive Mixture of Experts)
As an application, we develop a learning theory for the NN-based autoregressive functions of the model.
arXiv Detail & Related papers (2020-02-08T21:51:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.