Stochastic normalizing flows for Effective String Theory
- URL: http://arxiv.org/abs/2412.19109v2
- Date: Wed, 08 Jan 2025 09:48:02 GMT
- Authors: Michele Caselle, Elia Cellini, Alessandro Nada
- Abstract: Effective String Theory (EST) is a powerful tool used to study confinement in pure gauge theories by modeling the confining flux tube connecting a static quark-antiquark pair as a thin vibrating string. Recently, flow-based samplers have been applied as an efficient numerical method to study EST regularized on the lattice, opening the route to study observables previously inaccessible to standard analytical methods. Flow-based samplers are a class of algorithms based on Normalizing Flows (NFs), deep generative models recently proposed as a promising alternative to traditional Markov Chain Monte Carlo methods in lattice field theory calculations. By combining NF layers with out-of-equilibrium stochastic updates, we obtain Stochastic Normalizing Flows (SNFs), a scalable class of machine learning algorithms that can be explained in terms of stochastic thermodynamics. In this contribution, we outline EST and SNFs, and report some numerical results for the shape of the flux tube.
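As a rough illustration of the SNF idea described in the abstract (alternating deterministic NF layers with out-of-equilibrium stochastic updates while accumulating a generalized work used for reweighting), the following minimal Python sketch works with a single scalar degree of freedom. The toy quartic action, the affine layer, and the Langevin step are illustrative assumptions, not the architecture used in the paper, and the heat term of the full stochastic-thermodynamics bookkeeping is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def action(phi, coupling=1.0):
    """Toy quartic action S(phi) for a single scalar degree of freedom."""
    return 0.5 * phi**2 + coupling * phi**4

def flow_layer(phi, scale, shift):
    """Deterministic affine NF layer; returns new field and log|det Jacobian|."""
    return scale * phi + shift, np.log(np.abs(scale)) * np.ones_like(phi)

def langevin_update(phi, step, coupling=1.0):
    """Out-of-equilibrium stochastic update targeting exp(-S)."""
    grad = phi + 4.0 * coupling * phi**3   # dS/dphi
    noise = rng.normal(size=phi.shape)
    return phi - step * grad + np.sqrt(2.0 * step) * noise

# One SNF pass: prior sample -> NF layer -> stochastic update,
# accumulating the log-weight used for importance reweighting.
phi0 = rng.normal(size=10_000)             # samples from a Gaussian prior
logq0 = -0.5 * phi0**2                     # unnormalized log-prior density
phi1, logdet = flow_layer(phi0, scale=0.8, shift=0.0)
phi2 = langevin_update(phi1, step=0.01)
# Generalized work up to the omitted heat term of the Langevin step:
work = action(phi2) - (logq0 - logdet)
weights = np.exp(-(work - work.min()))     # importance weights, stabilized
ess = weights.sum()**2 / (weights**2).sum()  # effective sample size
```

In a full SNF several such flow-plus-update blocks are stacked, and the accumulated work enters the exact reweighting via a Jarzynski-type identity.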
Related papers
- Numerical determination of the width and shape of the effective string using Stochastic Normalizing Flows
Flow-based architectures have proved to be an efficient tool for numerical simulations of Effective String Theories regularized on the lattice.
In this work we use Stochastic Normalizing Flows, a state-of-the-art deep learning architecture combining normalizing flows with non-equilibrium Monte Carlo simulations, to study different effective string models.
arXiv Detail & Related papers (2024-09-24T09:59:44Z)
- Sampling the lattice Nambu-Goto string using Continuous Normalizing Flows
EST represents a powerful non-perturbative approach to describe confinement in Yang-Mills theory.
We show that by using a new class of deep generative models it is possible to obtain reliable numerical estimates of EST predictions.
arXiv Detail & Related papers (2023-07-03T15:34:36Z)
- Locality-constrained autoregressive cum conditional normalizing flow for lattice field theory simulations
The local action integral leads to simplifications of the input domain of conditional normalizing flows.
We find that l-ACNF models outperform an equivalent normalizing flow model on the full lattice by orders of magnitude in autocorrelation time.
arXiv Detail & Related papers (2023-04-04T13:55:51Z)
- Aspects of scaling and scalability for flow-based sampling of lattice QCD
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
It remains to be determined whether they can be applied to state-of-the-art lattice quantum chromodynamics calculations.
arXiv Detail & Related papers (2022-11-14T17:07:37Z)
- Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z)
- Stochastic normalizing flows as non-equilibrium transformations
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z)
- Learning Likelihoods with Conditional Normalizing Flows
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the mapping from the base density to the output space is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
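The conditional mechanism in the last entry, where the base-to-output mapping depends on an input x, can be sketched in a few lines of Python. The tanh conditioner below is a hypothetical stand-in for the neural network a real CNF would use; the point is only that scale and shift depend on x, so p(y|x) follows from the change-of-variables formula.

```python
import numpy as np

def conditioner(x):
    """Toy conditioner mapping x to affine parameters (scale, shift).
    A real CNF would parameterize this with a neural network."""
    return 1.0 + 0.5 * np.tanh(x), 0.3 * x

def log_prob(y, x):
    """log p(y|x) via the change-of-variables formula with a
    standard-normal base density: z = (y - shift) / scale."""
    scale, shift = conditioner(x)
    z = (y - shift) / scale
    return -0.5 * z**2 - 0.5 * np.log(2.0 * np.pi) - np.log(scale)

def sample(x, rng):
    """Draw y ~ p(y|x) by pushing base noise through the forward map."""
    scale, shift = conditioner(x)
    return scale * rng.normal(size=np.shape(x)) + shift
```

Because the affine map is invertible for every x, log_prob can be evaluated exactly and sample draws from the same conditional density.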
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.