Multi-Lattice Sampling of Quantum Field Theories via Neural Operator-based Flows
 - URL: http://arxiv.org/abs/2401.00828v4
 - Date: Thu, 07 Nov 2024 08:29:41 GMT
 - Title: Multi-Lattice Sampling of Quantum Field Theories via Neural Operator-based Flows
 - Authors: Bálint Máté, François Fleuret
 - Abstract summary: We consider the problem of sampling lattice field configurations from the Boltzmann distribution corresponding to some action.
We propose to approximate a time-dependent neural operator whose time integral provides a mapping between the functional distributions of the free and target theories.
 - Score: 22.333897842462342
 - License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
 - Abstract: We consider the problem of sampling lattice field configurations from the Boltzmann distribution corresponding to some action. Since such densities arise as approximations of an underlying functional density, we frame the task as an instance of operator learning. We propose to approximate a time-dependent neural operator whose time integral provides a mapping between the functional distributions of the free and target theories. Once a particular lattice is chosen, the neural operator can be discretized to a finite-dimensional, time-dependent vector field which in turn induces a continuous normalizing flow between finite-dimensional distributions over the chosen lattice. This flow can then be trained to be a diffeomorphism between the discretized free and target theories on the chosen lattice and, by construction, can be evaluated on different discretizations of spacetime. We experimentally validate the proposal on the 2-dimensional $\phi^4$-theory to explore to what extent such operator-based flow architectures generalize to lattice sizes they were not trained on, and show that pretraining on smaller lattices can lead to a speedup over training directly on the target lattice size.
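As a rough illustration of the proposal, the following is a minimal PyTorch sketch under our own assumptions, not the authors' implementation; the architecture, the Euler integrator, the action convention, and all hyperparameters are placeholders. The point it demonstrates is the one made in the abstract: a time-dependent vector field built entirely from local (periodic) convolutions is a discretization of a neural operator, so the same weights define a flow on lattices of any size.

```python
import torch
import torch.nn as nn

class TimeDependentConvField(nn.Module):
    """v_theta(t, phi): periodic local convolutions only, so the same
    weights can be evaluated on lattices of any size."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, hidden, 3, padding=1, padding_mode="circular"),
            nn.GELU(),
            nn.Conv2d(hidden, hidden, 3, padding=1, padding_mode="circular"),
            nn.GELU(),
            nn.Conv2d(hidden, 1, 3, padding=1, padding_mode="circular"),
        )

    def forward(self, t, phi):
        # broadcast the scalar time t as an extra input channel
        t_channel = torch.full_like(phi, float(t))
        return self.net(torch.cat([phi, t_channel], dim=1))

def flow(field, phi0, n_steps=50):
    """Euler integration of dphi/dt = v_theta(t, phi) from t=0 to t=1."""
    phi, dt = phi0, 1.0 / n_steps
    for k in range(n_steps):
        phi = phi + dt * field(k * dt, phi)
    return phi

def phi4_action(phi, m2=-4.0, lam=8.0):
    """One common convention for the discretized 2d phi^4 action with
    periodic boundaries; m2 and lam are placeholder couplings."""
    kinetic = sum(((phi - torch.roll(phi, 1, dims=d)) ** 2).sum(dim=(1, 2, 3))
                  for d in (2, 3))
    return 0.5 * kinetic + (0.5 * m2 * phi ** 2 + lam * phi ** 4).sum(dim=(1, 2, 3))

# Weights transfer across discretizations: a field trained on 8x8
# configurations can be applied unchanged to 16x16 inputs.
field = TimeDependentConvField()
samples_8 = flow(field, torch.randn(4, 1, 8, 8))    # placeholder base draws
samples_16 = flow(field, torch.randn(4, 1, 16, 16))
```

Training this as a continuous normalizing flow would additionally require tracking the divergence of the vector field along each trajectory to obtain sample log-densities; that bookkeeping is omitted here.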
 
       
      
        Related papers
        - Spectral quantum algorithm for passive scalar transport in shear flows [0.0]
The mixing of scalar substances in fluid flows by stirring and diffusion is ubiquitous in natural flows, chemical engineering, and microfluidic drug delivery. We present a spectral quantum algorithm for scalar mixing by solving the advection-diffusion equation in a quantum computational fluid dynamics framework. This evaluation shows that spectral accuracy allows comparably large time steps even though the operator splitting limits the temporal order.
arXiv  Detail & Related papers  (2025-05-15T10:09:52Z) - Riemannian Neural Geodesic Interpolant [15.653104625330062]
Stochastic interpolants are efficient generative models that bridge two arbitrary probability density functions in finite time.
These models are primarily developed in Euclidean space, and are therefore limited in their application to many distribution learning problems.
We introduce the Riemannian Neural Geodesic Interpolant (RNGI) model, which interpolates between two probability densities on a Riemannian manifold.
arXiv  Detail & Related papers  (2025-04-22T09:28:29Z) - On the Contractivity of Stochastic Interpolation Flow [1.90365714903665]
We investigate stochastic interpolation, a recently introduced framework for high-dimensional sampling which bears many similarities to diffusion modeling.
We show that for a Gaussian base distribution and a strongly log-concave target distribution, the flow map is Lipschitz with a sharp constant which matches that of Caffarelli's theorem for optimal transport maps.
We are further able to construct Lipschitz transport maps between non-Gaussian distributions, generalizing some recent constructions in the literature on transport methods for establishing functional inequalities.
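For reference, the benchmark invoked in this blurb is Caffarelli's contraction theorem; a standard statement (our paraphrase, not quoted from the paper) reads:

```latex
% Optimal transport from a standard Gaussian to a strongly log-concave
% target is Lipschitz, with a constant set by the convexity parameter:
\[
  \mu = \mathcal{N}(0, I_d), \qquad
  \nu \propto e^{-V} \ \text{with}\ \nabla^2 V \succeq \beta I_d
  \quad \Longrightarrow \quad
  \operatorname{Lip}(T) \le \beta^{-1/2},
\]
% where $T$ is the Brenier (optimal transport) map pushing $\mu$ to $\nu$.
```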
arXiv  Detail & Related papers  (2025-04-14T19:10:22Z) - Sampling and estimation on manifolds using the Langevin diffusion [45.57801520690309]
Two estimators of linear functionals of $\mu_\phi$ based on the discretized Markov process are considered.
Error bounds are derived for sampling and estimation using a discretization of an intrinsically defined Langevin diffusion.
arXiv  Detail & Related papers  (2023-12-22T18:01:11Z) - Random Smoothing Regularization in Kernel Gradient Descent Learning [24.383121157277007]
We present a framework for random smoothing regularization that can adaptively learn a wide range of ground truth functions belonging to the classical Sobolev spaces.
Our estimator can adapt to the structural assumptions of the underlying data and avoid the curse of dimensionality.
arXiv  Detail & Related papers  (2023-05-05T13:37:34Z) - Normalizing flows for lattice gauge theory in arbitrary space-time dimension [135.04925500053622]
Applications of normalizing flows to the sampling of field configurations in lattice gauge theory have so far been explored almost exclusively in two space-time dimensions.
We discuss masked autoregressive transformations with tractable and unbiased Jacobian determinants, a key ingredient for scalable and exact flow-based sampling algorithms.
For concreteness, results from a proof-of-principle application to SU(3) gauge theory in four space-time dimensions are reported.
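To make "tractable and unbiased Jacobian determinants" concrete, below is a generic checkerboard-masked affine coupling layer for a scalar field; this is an illustrative PyTorch sketch, not the paper's gauge-equivariant construction, and the layer sizes are placeholders. Because only the unmasked sites are transformed, each by an elementwise affine map, the log-determinant is an exact sum rather than the determinant of a dense matrix.

```python
import torch
import torch.nn as nn

class MaskedAffineCoupling(nn.Module):
    def __init__(self, lattice_size, hidden=32):
        super().__init__()
        xs = torch.arange(lattice_size)
        self.register_buffer("mask", ((xs[:, None] + xs[None, :]) % 2).float())
        self.net = nn.Sequential(
            nn.Conv2d(1, hidden, 3, padding=1, padding_mode="circular"),
            nn.GELU(),
            nn.Conv2d(hidden, 2, 3, padding=1, padding_mode="circular"),
        )

    def forward(self, phi):
        frozen = phi * self.mask                  # conditioning half stays fixed
        s, t = self.net(frozen).chunk(2, dim=1)   # scale and shift fields
        s = torch.tanh(s) * (1 - self.mask)       # act only on the free half
        t = t * (1 - self.mask)
        out = frozen + (1 - self.mask) * (phi * torch.exp(s) + t)
        log_det = s.sum(dim=(1, 2, 3))            # exact log|det J|, O(volume)
        return out, log_det

layer = MaskedAffineCoupling(lattice_size=8)
out, log_det = layer(torch.randn(4, 1, 8, 8))
```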
arXiv  Detail & Related papers  (2023-05-03T19:54:04Z) - Classifying topological neural network quantum states via diffusion maps [0.0]
We discuss and demonstrate an unsupervised machine-learning procedure to detect topological order in quantum many-body systems.
We use a restricted Boltzmann machine to define a variational ansatz for the low-energy spectrum.
We show that for the diffusion map, the required similarity measure of quantum states can be defined in terms of the network parameters.
arXiv  Detail & Related papers  (2023-01-06T19:00:21Z) - Designing Universal Causal Deep Learning Models: The Case of Infinite-Dimensional Dynamical Systems from Stochastic Analysis [7.373617024876726]
Several non-linear operators in analysis depend on a temporal structure which is not leveraged by contemporary neural operators.
This paper introduces a deep learning model-design framework that takes suitable infinite-dimensional linear metric spaces as inputs.
We show that our framework can uniformly approximate, on compact sets and across arbitrary finite-time horizons, Hölder or smooth trace-class operators.
arXiv  Detail & Related papers  (2022-10-24T14:43:03Z) - Neural Conservation Laws: A Divergence-Free Perspective [36.668126758052814]
We propose building divergence-free neural networks through the concept of differential forms.
We prove these models are universal and so can be used to represent any divergence-free vector field.
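A minimal two-dimensional instance of this construction (our sketch, not the paper's general d-dimensional model): the 90-degree-rotated gradient of any scalar potential network is divergence-free by the symmetry of second derivatives.

```python
import torch
import torch.nn as nn

# scalar potential psi: R^2 -> R (architecture is a placeholder)
psi = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))

def divergence_free_field(x):
    x = x.requires_grad_(True)
    grad = torch.autograd.grad(psi(x).sum(), x, create_graph=True)[0]
    # v = (dpsi/dy, -dpsi/dx), so div v = psi_yx - psi_xy = 0 identically
    return torch.stack([grad[:, 1], -grad[:, 0]], dim=1)

v = divergence_free_field(torch.randn(8, 2))
```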
arXiv  Detail & Related papers  (2022-10-04T17:01:53Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv  Detail & Related papers  (2022-02-23T06:11:49Z) - Continuous normalizing flows on manifolds [0.342658286826597]
We describe how the recently introduced Neural ODEs and continuous normalizing flows can be extended to arbitrary smooth manifolds.
We propose a general methodology for parameterizing vector fields on these spaces and demonstrate how gradient-based learning can be performed.
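One standard way to realize such a parameterization, sketched here for the sphere S^2 embedded in R^3 (an illustrative assumption; the paper's methodology is more general): evaluate an ambient-space network and project its output onto the tangent space at each point.

```python
import torch
import torch.nn as nn

# ambient network R^3 -> R^3 (architecture is a placeholder)
ambient = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 3))

def tangent_field(x):
    """x: (B, 3) unit vectors on S^2; returns v with <v, x> = 0."""
    v = ambient(x)
    return v - (v * x).sum(dim=1, keepdim=True) * x  # strip normal component

x = torch.nn.functional.normalize(torch.randn(5, 3), dim=1)
print((tangent_field(x) * x).sum(dim=1))  # ~0: the field stays tangent
```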
arXiv  Detail & Related papers  (2021-03-14T15:35:19Z) - A Convergence Theory Towards Practical Over-parameterized Deep Neural Networks [56.084798078072396]
We take a step towards closing the gap between theory and practice by significantly improving the known theoretical bounds on both the network width and the convergence time.
We show that convergence to a global minimum is guaranteed for networks whose width is quadratic in the sample size and linear in their depth, at a time logarithmic in both.
Our analysis and convergence bounds are derived via the construction of a surrogate network with fixed activation patterns that can be transformed at any time to an equivalent ReLU network of a reasonable size.
arXiv  Detail & Related papers  (2021-01-12T00:40:45Z) - Scaling limits of lattice quantum fields by wavelets [62.997667081978825]
The renormalization group is considered as an inductive system of scaling maps between lattice field algebras.
We show that the inductive limit of free lattice ground states exists and the limit state extends to the familiar massive continuum free field.
arXiv  Detail & Related papers  (2020-10-21T16:30:06Z) - Closed-Form Factorization of Latent Semantics in GANs [65.42778970898534]
A rich set of interpretable dimensions has been shown to emerge in the latent space of Generative Adversarial Networks (GANs) trained for synthesizing images.
In this work, we examine the internal representation learned by GANs to reveal the underlying variation factors in an unsupervised manner.
We propose a closed-form factorization algorithm for latent semantic discovery by directly decomposing the pre-trained weights.
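The closed-form step can be stated in a few lines; the sketch below is our paraphrase under the assumption that semantic directions are taken as the top eigenvectors of A^T A for a pre-trained projection weight A (the matrix here is a random stand-in).

```python
import torch

A = torch.randn(512, 128)                       # stand-in for a pre-trained weight
eigvals, eigvecs = torch.linalg.eigh(A.T @ A)   # ascending eigenvalues
directions = eigvecs.flip(dims=[1])[:, :5]      # top-5 latent directions
```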
arXiv  Detail & Related papers  (2020-07-13T18:05:36Z) - Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
This work generalizes neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to state-of-the-art solvers.
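Below is a minimal sketch of the kernel integral operator such networks stack (names and sizes are our placeholders, not the paper's API): the output at mesh point x_i averages a learned matrix-valued kernel kappa(x_i, x_j) applied to the features u(x_j), so the same layer applies to any mesh.

```python
import torch
import torch.nn as nn

class KernelIntegralLayer(nn.Module):
    def __init__(self, width, hidden=64):
        super().__init__()
        # kappa maps a coordinate pair (x_i, x_j) to a width x width matrix
        self.kappa = nn.Sequential(
            nn.Linear(4, hidden), nn.GELU(), nn.Linear(hidden, width * width)
        )
        self.width = width

    def forward(self, coords, u):
        # coords: (N, 2) mesh points, u: (N, width) features on the mesh
        N = coords.shape[0]
        pairs = torch.cat(
            [coords[:, None, :].expand(N, N, 2),
             coords[None, :, :].expand(N, N, 2)], dim=-1)
        K = self.kappa(pairs).view(N, N, self.width, self.width)
        # Monte Carlo quadrature: average the kernel action over mesh points j
        return torch.einsum("ijkl,jl->ik", K, u) / N

layer = KernelIntegralLayer(width=16)
out = layer(torch.rand(50, 2), torch.randn(50, 16))  # works for any mesh size N
```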
arXiv  Detail & Related papers  (2020-03-07T01:56:20Z) 
This list is automatically generated from the titles and abstracts of the papers on this site.
       
     