Group-Equivariant Diffusion Models for Lattice Field Theory
- URL: http://arxiv.org/abs/2510.26081v1
- Date: Thu, 30 Oct 2025 02:34:01 GMT
- Title: Group-Equivariant Diffusion Models for Lattice Field Theory
- Authors: Octavio Vega, Javad Komijani, Aida El-Khadra, Marina Marinkovic
- Abstract summary: Near the critical point, Markov Chain Monte Carlo simulations of lattice quantum field theories become increasingly inefficient. In this work, we investigate score-based symmetry-preserving diffusion models as an alternative strategy to sample two-dimensional $\phi^4$ and ${\rm U}(1)$ lattice field theories.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Near the critical point, Markov Chain Monte Carlo (MCMC) simulations of lattice quantum field theories (LQFT) become increasingly inefficient due to critical slowing down. In this work, we investigate score-based symmetry-preserving diffusion models as an alternative strategy to sample two-dimensional $\phi^4$ and ${\rm U}(1)$ lattice field theories. We develop score networks that are equivariant to a range of group transformations, including global $\mathbb{Z}_2$ reflections, local ${\rm U}(1)$ rotations, and periodic translations $\mathbb{T}$. The score networks are trained using an augmented training scheme, which significantly improves sample quality in the simulated field theories. We also demonstrate empirically that our symmetry-aware models outperform generic score networks in sample quality, expressivity, and effective sample size.
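The equivariance constraint described in the abstract can be illustrated with a minimal numpy sketch (all names and the toy "network" below are hypothetical stand-ins, not the paper's architecture): symmetrizing an arbitrary score function over the global $\mathbb{Z}_2$ reflection $\phi \to -\phi$ enforces $s(-\phi) = -s(\phi)$, and building the network from periodic shifts keeps it equivariant under lattice translations as well.

```python
import numpy as np

rng = np.random.default_rng(0)

def base_score(phi, w):
    # toy non-equivariant score: a periodic "convolution" plus a local
    # even term that deliberately breaks Z2 equivariance
    conv = sum(w[k] * np.roll(phi, k, axis=-1) for k in range(len(w)))
    return conv + phi**2

def z2_equivariant_score(phi, w):
    # symmetrization over the group orbit {phi, -phi} enforces
    # s(-phi) = -s(phi), i.e. equivariance under global Z2 reflection
    return 0.5 * (base_score(phi, w) - base_score(-phi, w))

w = rng.normal(size=3)
phi = rng.normal(size=16)          # 1D toy "lattice" configuration

s = z2_equivariant_score(phi, w)
assert np.allclose(z2_equivariant_score(-phi, w), -s)

# np.roll respects periodic boundary conditions, so the score is also
# translation-equivariant: s(T phi) = T s(phi)
shift = 5
assert np.allclose(z2_equivariant_score(np.roll(phi, shift), w),
                   np.roll(s, shift))
```

The same orbit-averaging idea extends to larger groups (e.g. local ${\rm U}(1)$ rotations) by averaging or projecting the network output over the relevant group action.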
Related papers
- Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation [56.361076943802594]
CanonFlow achieves state-of-the-art performance on the challenging GEOM-DRUG dataset, and the advantage remains large in few-step generation.
arXiv Detail & Related papers (2026-02-16T18:58:55Z) - Field digitization scaling in a $\mathbb{Z}_N \subset U(1)$ symmetric model [0.0]
We propose to analyze field digitization (FD) by interpreting the parameter $N$ as a coupling in the renormalization group sense. Using effective field theory, we derive generalized scaling hypotheses involving the FD parameter $N$. We analytically prove that our calculations for the 2D classical-statistical $\mathbb{Z}_N$ clock model are directly related to the quantum physics in the ground state of a (2+1)D $\mathbb{Z}_N$ lattice gauge theory.
arXiv Detail & Related papers (2025-07-30T18:00:02Z) - Efficient Diffusion Models for Symmetric Manifolds [25.99200001269046]
We introduce a framework for designing efficient diffusion models on $d$-dimensional symmetric spaces. The manifold symmetries ensure the diffusion satisfies an "average-case" Lipschitz condition. Our model outperforms prior methods in training speed and improves sample quality on synthetic datasets.
arXiv Detail & Related papers (2025-05-27T18:12:29Z) - Flow matching achieves almost minimax optimal convergence [50.38891696297888]
Flow matching (FM) has gained significant attention as a simulation-free generative model.
This paper discusses the convergence properties of FM for large sample size under the $p$-Wasserstein distance.
We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models.
arXiv Detail & Related papers (2024-05-31T14:54:51Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Discrete Diffusion Modeling by Estimating the Ratios of the Data Distribution [67.9215891673174]
We propose score entropy as a novel loss that naturally extends score matching to discrete spaces.
We test our Score Entropy Discrete Diffusion models on standard language modeling tasks.
arXiv Detail & Related papers (2023-10-25T17:59:12Z) - Diffusion Models as Stochastic Quantization in Lattice Field Theory [7.221319972004889]
We establish a direct connection between generative diffusion models (DMs) and stochastic quantization (SQ).
The DM is realized by approximating the reversal of a process dictated by the Langevin equation, generating samples from a prior distribution to effectively mimic the target distribution.
We demonstrate that DMs can notably reduce autocorrelation times in the Markov chain, especially in the critical region where standard Markov Chain Monte-Carlo algorithms experience critical slowing down.
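The Langevin connection summarized above can be sketched in a few lines (everything here is an illustrative assumption, not the paper's setup: a 0D double-well "action" stands in for a lattice theory). In stochastic quantization, Langevin dynamics with drift $-\partial S/\partial\phi$ has $e^{-S}$ as its stationary distribution; a diffusion model learns to reverse such a noising process.

```python
import numpy as np

rng = np.random.default_rng(1)

def action_grad(phi):
    # gradient of a toy 0D action S(phi) = (phi^2 - 1)^2, a double well
    # with minima at phi = +/-1 (stand-in for a lattice phi^4 action)
    return 4.0 * phi * (phi**2 - 1.0)

def langevin_sample(n_steps=5000, dt=1e-2, n_chains=1024):
    # Euler-Maruyama discretization of the Langevin equation
    #   d phi = -dS/dphi dt + sqrt(2 dt) * xi,  xi ~ N(0, 1),
    # whose stationary distribution is proportional to exp(-S)
    phi = rng.normal(size=n_chains)
    for _ in range(n_steps):
        phi += -action_grad(phi) * dt \
               + np.sqrt(2.0 * dt) * rng.normal(size=phi.shape)
    return phi

samples = langevin_sample()
# samples should cluster around the two wells at phi = +/-1
assert abs(np.mean(np.abs(samples)) - 1.0) < 0.3
```

The small discretization bias of the explicit step is the kind of systematic effect that exact MCMC accept/reject steps (or a learned reverse process with reweighting) are meant to control.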
arXiv Detail & Related papers (2023-09-29T09:26:59Z) - Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that stochastic normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z) - Machine Learning and Variational Algorithms for Lattice Field Theory [1.198562319289569]
In lattice quantum field theory studies, parameters defining the lattice theory must be tuned toward criticality to access continuum physics.
We introduce an approach to "deform" Monte Carlo estimators based on contour deformations applied to the domain of the path integral.
We demonstrate that flow-based MCMC can mitigate critical slowing down and observifolds can exponentially reduce variance in proof-of-principle applications.
arXiv Detail & Related papers (2021-06-03T16:37:05Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, normalizing flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.