Numerical determination of the width and shape of the effective string using Stochastic Normalizing Flows
- URL: http://arxiv.org/abs/2409.15937v1
- Date: Tue, 24 Sep 2024 09:59:44 GMT
- Title: Numerical determination of the width and shape of the effective string using Stochastic Normalizing Flows
- Authors: Michele Caselle, Elia Cellini, Alessandro Nada
- Abstract summary: Flow-based architectures have proved to be an efficient tool for numerical simulations of Effective String Theories regularized on the lattice.
In this work we use Stochastic Normalizing Flows, a state-of-the-art deep-learning architecture based on non-equilibrium Monte Carlo simulations, to study different effective string models.
- Score: 44.99833362998488
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Flow-based architectures have recently proved to be an efficient tool for numerical simulations of Effective String Theories regularized on the lattice that otherwise cannot be efficiently sampled by standard Monte Carlo methods. In this work we use Stochastic Normalizing Flows, a state-of-the-art deep-learning architecture based on non-equilibrium Monte Carlo simulations, to study different effective string models. After testing the reliability of this approach through a comparison with exact results for the Nambu-Gotō model, we discuss results on observables that are challenging to study analytically, such as the width of the string and the shape of the flux density. Furthermore, we perform a novel numerical study of Effective String Theories with terms beyond the Nambu-Gotō action, including a broader discussion on their significance for lattice gauge theories. These results establish the reliability and feasibility of flow-based samplers for Effective String Theories and pave the way for future applications on more complex models.
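As a rough, self-contained illustration of the non-equilibrium Monte Carlo reweighting that Stochastic Normalizing Flows are built on, the sketch below anneals a one-dimensional Gaussian prior into a shifted, narrower Gaussian target. Everything in it is a simplifying assumption made for illustration: the toy actions, the fixed affine maps standing in for trained coupling layers, and the chosen step counts. Only the bookkeeping (interleaved deterministic and stochastic layers, accumulation of the generalized work W, Jarzynski-style reweighting of observables) mirrors the SNF construction; none of it is the authors' lattice EST code.

```python
# Toy sketch of the non-equilibrium Monte Carlo idea behind Stochastic
# Normalizing Flows (SNFs). NOT the authors' lattice EST code: the target
# is a 1D Gaussian and the "flow" layers are fixed affine maps standing in
# for trained coupling layers.
import numpy as np

rng = np.random.default_rng(0)

def action(x, t):
    """Interpolating action S_t between the prior (t=0) and the target (t=1)."""
    s_prior = 0.5 * x**2                   # prior: standard normal
    s_target = 0.5 * ((x - 2.0) / 0.5)**2  # target: N(mean=2.0, sigma=0.5)
    return (1.0 - t) * s_prior + t * s_target

n_samples, n_steps = 10_000, 20
x = rng.normal(size=n_samples)   # exact draws from the prior
work = np.zeros(n_samples)       # accumulated generalized work W

for k in range(1, n_steps + 1):
    t_prev, t = (k - 1) / n_steps, k / n_steps
    # Deterministic layer: fixed affine map (log-Jacobian = log a).
    a, b = 0.97, 0.1
    x_new = a * x + b
    # Work increment: switch the action from S_{t_prev} to S_t through the
    # map, minus the log-Jacobian (Jarzynski/Crooks bookkeeping).
    work += action(x_new, t) - action(x, t_prev) - np.log(a)
    x = x_new
    # Stochastic layer: one Metropolis step in equilibrium with S_t;
    # it satisfies detailed balance and does not contribute to W.
    prop = x + 0.5 * rng.normal(size=n_samples)
    accept = rng.random(n_samples) < np.exp(action(x, t) - action(prop, t))
    x = np.where(accept, prop, x)

# Jarzynski-style reweighting: <O>_target = <O e^{-W}> / <e^{-W}>.
w = np.exp(-(work - work.min()))  # constant shift for numerical stability
print("reweighted <x> =", np.sum(x * w) / np.sum(w))  # should be close to 2.0
```

In an actual SNF application the affine maps would be replaced by trained flow layers and the toy actions by the discretized effective string action, while the reweighting of observables by exp(-W) stays the same.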
Related papers
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Sampling the lattice Nambu-Goto string using Continuous Normalizing Flows [49.1574468325115]
Effective String Theory (EST) represents a powerful non-perturbative approach to describe confinement in Yang-Mills theory.
We show that by using a new class of deep generative models it is possible to obtain reliable numerical estimates of EST predictions.
arXiv Detail & Related papers (2023-07-03T15:34:36Z)
- Conditional Measurement Density Estimation in Sequential Monte Carlo via Normalizing Flow [12.161649672131286]
We propose to learn expressive and valid probability densities in measurement models through conditional normalizing flows.
We show that the proposed approach leads to improved estimation performance and faster training convergence in a visual tracking experiment.
arXiv Detail & Related papers (2022-03-16T13:35:16Z)
- Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z)
- Slice Sampling for General Completely Random Measures [74.24975039689893]
We present a novel Markov chain Monte Carlo algorithm for posterior inference that adaptively sets the truncation level using auxiliary slice variables.
The efficacy of the proposed algorithm is evaluated on several popular nonparametric models.
arXiv Detail & Related papers (2020-06-24T17:53:53Z)
- Amortized Bayesian model comparison with evidential deep learning [0.12314765641075436]
We propose a novel method for performing Bayesian model comparison using specialized deep learning architectures.
Our method is purely simulation-based and circumvents the step of explicitly fitting all alternative models under consideration to each observed dataset.
We show that our method achieves excellent results in terms of accuracy, calibration, and efficiency across the examples considered in this work.
arXiv Detail & Related papers (2020-04-22T15:15:46Z)