Machine Learning and Variational Algorithms for Lattice Field Theory
- URL: http://arxiv.org/abs/2106.01975v1
- Date: Thu, 3 Jun 2021 16:37:05 GMT
- Title: Machine Learning and Variational Algorithms for Lattice Field Theory
- Authors: Gurtej Kanwar
- Abstract summary: In lattice quantum field theory studies, parameters defining the lattice theory must be tuned toward criticality to access continuum physics.
We introduce an approach to "deform" Monte Carlo estimators based on contour deformations applied to the domain of the path integral.
We demonstrate that flow-based MCMC can mitigate critical slowing down and observifolds can exponentially reduce variance in proof-of-principle applications.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In lattice quantum field theory studies, parameters defining the lattice
theory must be tuned toward criticality to access continuum physics. Commonly
used Markov chain Monte Carlo (MCMC) methods suffer from critical slowing down
in this limit, restricting the precision of continuum extrapolations. Further
difficulties arise when measuring correlation functions of operators widely
separated in spacetime: for most correlation functions, an exponentially severe
signal-to-noise problem is encountered as the operators are taken to be widely
separated. This dissertation details two new techniques to address these
issues. First, we define a novel MCMC algorithm based on generative flow-based
models. Such models utilize machine learning methods to describe efficient
approximate samplers for distributions of interest. Independently drawn
flow-based samples are then used as proposals in an asymptotically exact
Metropolis-Hastings Markov chain. We address incorporating symmetries of
interest, including translational and gauge symmetries. We secondly introduce
an approach to "deform" Monte Carlo estimators based on contour deformations
applied to the domain of the path integral. The deformed estimators associated
with an observable give equivalent unbiased measurements of that observable,
but generically have different variances. We define families of deformed
manifolds for lattice gauge theories and introduce methods to efficiently
optimize the choice of manifold (the "observifold"), minimizing the deformed
observable variance. Finally, we demonstrate that flow-based MCMC can mitigate
critical slowing down and observifolds can exponentially reduce variance in
proof-of-principle applications to scalar $\phi^4$ theory and $\mathrm{U}(1)$
and $\mathrm{SU}(N)$ lattice gauge theories.
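The flow-based MCMC step described in the abstract (independent samples from a trained flow used as proposals in an asymptotically exact Metropolis-Hastings chain) can be sketched in a few lines. This is a toy illustration, not the dissertation's code: a fixed Gaussian (the hypothetical `flow_sample`/`flow_logq` pair) stands in for a trained normalizing flow, and a 1D double-well density stands in for a lattice theory.

```python
import math
import random

def target_logp(x):
    # Unnormalized log-density of a toy 1D "theory": a double well
    # with minima of the potential at x = +/-1.
    return -(x * x - 1.0) ** 2

# Parameters of the stand-in proposal (a trained flow would replace this).
MU, SIGMA = 0.0, 1.2

def flow_sample(rng):
    # Draw an independent proposal; a real flow would push noise
    # through its learned transformation here.
    return rng.gauss(MU, SIGMA)

def flow_logq(x):
    # Exact log-density of the proposal, q(x); flows provide this
    # via the change-of-variables formula.
    return -0.5 * ((x - MU) / SIGMA) ** 2 - math.log(SIGMA * math.sqrt(2 * math.pi))

def flow_mcmc(n_steps, rng):
    """Independence Metropolis-Hastings: flow proposals + accept/reject
    keep the chain asymptotically exact even if q only approximates p."""
    x = flow_sample(rng)
    chain, accepted = [], 0
    for _ in range(n_steps):
        x_new = flow_sample(rng)
        # Acceptance ratio for independent proposals:
        # a = p(x') q(x) / (p(x) q(x'))
        log_a = (target_logp(x_new) - target_logp(x)) + (flow_logq(x) - flow_logq(x_new))
        if rng.random() < math.exp(min(0.0, log_a)):
            x, accepted = x_new, accepted + 1
        chain.append(x)
    return chain, accepted / n_steps

chain, rate = flow_mcmc(10_000, random.Random(0))
```

Because proposals are drawn independently of the current state, a well-trained flow decorrelates the chain in a single step, which is the mechanism by which this construction mitigates critical slowing down.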
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Weighted Riesz Particles [0.0]
We consider the target distribution as a mapping where the infinite-dimensional space of the parameters consists of a number of deterministic submanifolds.
We study the properties of these points, called Riesz particles, and embed them into sequential MCMC.
We find that this yields higher acceptance rates with fewer evaluations.
arXiv Detail & Related papers (2023-12-01T14:36:46Z)
- Object based Bayesian full-waveform inversion for shear elastography [0.0]
We develop a computational framework to quantify uncertainty in shear elastography imaging of anomalies in tissues.
We find the posterior probability of parameter fields representing the geometry of the anomalies and their shear moduli.
We demonstrate the approach on synthetic two dimensional tests with smooth and irregular shapes.
arXiv Detail & Related papers (2023-05-11T08:25:25Z)
- Log-density gradient covariance and automatic metric tensors for Riemann manifold Monte Carlo methods [0.0]
The metric tensor is built from symmetric positive semidefinite log-density covariance gradient matrices.
The proposed methodology is highly automatic and allows for exploitation of any sparsity associated with the model in question.
arXiv Detail & Related papers (2022-11-03T12:22:20Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Nonparametric mixture MLEs under Gaussian-smoothed optimal transport distance [0.39373541926236766]
We adapt the GOT framework instead of its unsmoothed counterpart to approximate the true data generating distribution.
A key step in our analysis is the establishment of a new Jackson-type approximation bound of Gaussian-convoluted Lipschitz functions.
This insight bridges existing techniques of analyzing the nonparametric MLEs and the new GOT framework.
arXiv Detail & Related papers (2021-12-04T20:05:58Z)
- Partial Counterfactual Identification from Observational and Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- On MCMC for variationally sparse Gaussian processes: A pseudo-marginal approach [0.76146285961466]
Gaussian processes (GPs) are frequently used in machine learning and statistics to construct powerful models.
We propose a pseudo-marginal (PM) scheme that offers exact inference as well as computational gains, using doubly stochastic estimators for the likelihood on large datasets.
arXiv Detail & Related papers (2021-03-04T20:48:29Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.