Sampling the lattice Nambu-Goto string using Continuous Normalizing Flows
- URL: http://arxiv.org/abs/2307.01107v2
- Date: Mon, 12 Feb 2024 10:03:28 GMT
- Title: Sampling the lattice Nambu-Goto string using Continuous Normalizing Flows
- Authors: Michele Caselle, Elia Cellini and Alessandro Nada
- Abstract summary: Effective String Theory (EST) represents a powerful non-perturbative approach to describe confinement in Yang-Mills theory.
We show that by using a new class of deep generative models it is possible to obtain reliable numerical estimates of EST predictions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Effective String Theory (EST) represents a powerful non-perturbative approach
to describe confinement in Yang-Mills theory that models the confining flux
tube as a thin vibrating string. EST calculations are usually performed using
the zeta-function regularization; however, there are situations (for instance
the study of the shape of the flux tube, or of higher-order corrections
beyond the Nambu-Goto EST) that involve observables too complex to be
addressed in this way. In this paper we propose a numerical approach based on
recent advances in machine learning methods to circumvent this problem. Using
as a laboratory the Nambu-Goto string, we show that by using a new class of
deep generative models called Continuous Normalizing Flows it is possible to
obtain reliable numerical estimates of EST predictions.
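As a minimal illustration of the sampling mechanism behind Continuous Normalizing Flows (this sketch is not from the paper: it assumes a fixed linear vector field f(z) = A z in place of a trained neural network, and plain Euler integration):

```python
import numpy as np

# Minimal sketch of a Continuous Normalizing Flow (CNF). Base samples
# z(0) ~ N(0, I) are transported by integrating dz/dt = f(z), while the
# log-density evolves by the instantaneous change of variables
#   d log p / dt = -tr(df/dz),
# which for the linear field f(z) = A z is simply -tr(A).

rng = np.random.default_rng(0)
A = np.array([[0.1, -0.3],
              [0.3,  0.1]])          # illustrative vector field (not trained)

def sample_cnf(n_samples, n_steps=100, t1=1.0):
    dt = t1 / n_steps
    z = rng.standard_normal((n_samples, 2))          # base distribution N(0, I)
    logp = -0.5 * np.sum(z**2, axis=1) - np.log(2 * np.pi)  # base log-density (d=2)
    for _ in range(n_steps):
        z = z + dt * z @ A.T                          # Euler step of dz/dt = A z
        logp = logp - dt * np.trace(A)                # accumulate -tr(df/dz) dt
    return z, logp

z, logp = sample_cnf(10_000)
```

In the setting of the paper, the vector field would be a neural network trained so that the transported density matches the Nambu-Goto path-integral weight; here the matrix A and the integration schedule are purely illustrative.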
Related papers
- Numerical determination of the width and shape of the effective string using Stochastic Normalizing Flows
Flow-based architectures have proved to be an efficient tool for numerical simulations of Effective String Theories regularized on the lattice.
In this work we use Stochastic Normalizing Flows, a deep-learning architecture based on non-equilibrium Monte Carlo simulations, to study different effective string models.
arXiv Detail & Related papers (2024-09-24T09:59:44Z)
- Stable generative modeling using Schrödinger bridges
We propose a generative model combining Schrödinger bridges and Langevin dynamics.
Our framework can be naturally extended to generate conditional samples and to Bayesian inference problems.
arXiv Detail & Related papers (2024-01-09T06:15:45Z)
- A Metalearned Neural Circuit for Nonparametric Bayesian Inference
Most applications of machine learning to classification assume a closed set of balanced classes.
This is at odds with the real world, where class occurrence statistics often follow a long-tailed power-law distribution.
We present a method for extracting the inductive bias from a nonparametric Bayesian model and transferring it to an artificial neural network.
arXiv Detail & Related papers (2023-11-24T16:43:17Z)
- Aspects of scaling and scalability for flow-based sampling of lattice QCD
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
It remains to be determined whether they can be applied to state-of-the-art lattice quantum chromodynamics calculations.
arXiv Detail & Related papers (2022-11-14T17:07:37Z)
- Flow-based sampling in the lattice Schwinger model at criticality
Flow-based algorithms may provide efficient sampling of field distributions for lattice field theory applications.
We provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.
arXiv Detail & Related papers (2022-02-23T19:00:00Z)
- Robust Bayesian Inference for Simulator-based Models via the MMD Posterior Bootstrap
We propose a novel algorithm based on the posterior bootstrap and maximum mean discrepancy estimators.
This leads to a highly-parallelisable Bayesian inference algorithm with strong properties.
The approach is then assessed on a range of examples including a g-and-k distribution and a toggle-switch model.
arXiv Detail & Related papers (2022-02-09T22:12:19Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for resolving such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
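The "decoupled sample paths" entry above refers to pathwise sampling of a Gaussian process posterior. A simplified sketch using Matheron's rule with an exact RBF kernel (the kernel, data, and all names here are illustrative assumptions, not taken from that paper): a single joint prior draw is corrected by the observed data, (f | y)(x*) = f(x*) + K(x*, X) K(X, X)^{-1} (y - f(X)).

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(a, b, ell=0.2):
    """Squared-exponential kernel between two 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell**2)

X = np.linspace(0, 1, 5)              # training inputs (illustrative)
y = np.sin(2 * np.pi * X)             # noiseless observations
Xs = np.linspace(0, 1, 50)            # test inputs

# One joint prior draw over (X, Xs) via a Cholesky factor
Z = np.concatenate([X, Xs])
K = rbf(Z, Z) + 1e-8 * np.eye(len(Z))          # small jitter for stability
f = np.linalg.cholesky(K) @ rng.standard_normal(len(Z))
fX, fXs = f[:len(X)], f[len(X):]

# Matheron update: prior sample at test points + data-driven correction
Kxx = rbf(X, X) + 1e-8 * np.eye(len(X))
update = rbf(Xs, X) @ np.linalg.solve(Kxx, y - fX)
posterior_sample = fXs + update
```

Because the prior draw and the data correction decouple, many posterior sample paths can be generated from cheap prior draws without re-solving the conditioning step from scratch; the entry above develops an efficient approximate version of this idea.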
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.