Locality-constrained autoregressive cum conditional normalizing flow for
lattice field theory simulations
- URL: http://arxiv.org/abs/2304.01798v1
- Date: Tue, 4 Apr 2023 13:55:51 GMT
- Authors: Dinesh P. R.
- Abstract summary: Locality of the action integral leads to simplifications of the input domain of conditional normalizing flows.
We find that l-ACNF models outperform an equivalent normalizing flow model on the full lattice by orders of magnitude in autocorrelation time.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Normalizing flow-based sampling methods have been successful in tackling
computational challenges traditionally associated with simulating lattice
quantum field theories. Further works have incorporated gauge and translational
invariance of the action integral in the underlying neural networks, which have
led to efficient training and inference in those models. In this paper, we
incorporate the locality of the action integral, which simplifies the input
domain of conditional normalizing flows that sample constant-time
sub-lattices in an autoregressive process, dubbed the local-Autoregressive
Conditional Normalizing Flow (l-ACNF). We find that l-ACNF models achieve
autocorrelation times orders of magnitude shorter than those of an
equivalent normalizing flow model on the full lattice when sampling
$\phi^{4}$ theory on a 2-dimensional lattice.
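The abstract describes sampling the lattice one constant-time slice at a time, with each slice drawn from a flow conditioned on the previous slice. A minimal sketch of that autoregressive scheme is below; the `conditioner` function is a hypothetical stand-in for the paper's neural network, and the single affine transform is an assumption, not the actual l-ACNF architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def conditioner(prev_slice):
    # Hypothetical stand-in for a learned network: maps the previous
    # time slice to shift/log-scale parameters of an affine flow.
    # Locality means these parameters depend only on nearby sites.
    shift = 0.1 * prev_slice
    log_scale = np.zeros_like(prev_slice)
    return shift, log_scale

def sample_lattice(T=8, L=16):
    """Autoregressively sample a T x L lattice, one time slice at a time."""
    phi = np.zeros((T, L))
    phi[0] = rng.normal(size=L)  # initial slice from the base density
    for t in range(1, T):
        z = rng.normal(size=L)   # base sample for this slice
        shift, log_scale = conditioner(phi[t - 1])
        phi[t] = z * np.exp(log_scale) + shift  # conditional affine flow
    return phi

phi = sample_lattice()
print(phi.shape)  # (8, 16)
```

Because each conditional flow only sees one sub-lattice, its input domain is much smaller than the full lattice, which is the simplification the abstract refers to.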
Related papers
- Reflected Flow Matching [36.38883647601013]
Continuous normalizing flows (CNFs) learn an ordinary differential equation to transform prior samples into data.
Flow matching (FM) has emerged as a simulation-free approach for training CNFs by regressing a velocity model towards the conditional velocity field.
We propose reflected flow matching (RFM) to train the velocity model in reflected CNFs by matching the conditional velocity fields in a simulation-free manner.
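The flow-matching idea referenced above regresses a velocity model toward a conditional velocity field without simulating the ODE. A minimal sketch of constructing one such training target is below, assuming the common linear interpolation path between a Gaussian prior sample and a data sample (this illustrates plain conditional flow matching, not the reflected variant proposed in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def fm_training_pair(x1, t=None):
    """One conditional flow-matching target on a linear path.

    x1: a data sample; the prior sample x0 is standard normal.
    Returns the interpolated point x_t and the conditional velocity
    target u_t that a velocity model v(x_t, t) would be regressed onto.
    """
    x0 = rng.normal(size=x1.shape)  # prior sample
    if t is None:
        t = rng.uniform()           # random time in [0, 1]
    x_t = (1.0 - t) * x0 + t * x1   # point on the conditional path
    u_t = x1 - x0                   # conditional velocity field
    return x_t, u_t

x1 = rng.normal(size=3)
x_t, u_t = fm_training_pair(x1, t=0.5)
print(x_t.shape, u_t.shape)  # (3,) (3,)
```

Training then minimizes the squared error between the model's predicted velocity at (x_t, t) and u_t, averaged over data, prior samples, and times.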
arXiv Detail & Related papers (2024-05-26T14:09:43Z)
- Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories [6.222204646855336]
We study the consequences of mode-collapse of normalizing flows in the context of lattice field theory.
We propose a metric to quantify the degree of mode-collapse and derive a bound on the resulting bias.
arXiv Detail & Related papers (2023-02-27T19:00:22Z)
- Aspects of scaling and scalability for flow-based sampling of lattice QCD [137.23107300589385]
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
It remains to be determined whether they can be applied to state-of-the-art lattice quantum chromodynamics calculations.
arXiv Detail & Related papers (2022-11-14T17:07:37Z)
- Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions [51.52945471576731]
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNFs).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the mapping from the base density to the output space is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.