SESaMo: Symmetry-Enforcing Stochastic Modulation for Normalizing Flows
- URL: http://arxiv.org/abs/2505.19619v2
- Date: Thu, 05 Jun 2025 15:24:20 GMT
- Title: SESaMo: Symmetry-Enforcing Stochastic Modulation for Normalizing Flows
- Authors: Janik Kreit, Dominic Schuh, Kim A. Nicoli, Lena Funcke
- Abstract summary: This paper introduces Symmetry-Enforcing Stochastic Modulation (SESaMo). SESaMo enables the incorporation of inductive biases (e.g., symmetries) into normalizing flows through a novel technique called stochastic modulation. Our numerical experiments benchmark SESaMo in different scenarios, including an 8-Gaussian mixture model and physically relevant field theories.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Deep generative models have recently garnered significant attention across various fields, from physics to chemistry, where sampling from unnormalized Boltzmann-like distributions represents a fundamental challenge. In particular, autoregressive models and normalizing flows have become prominent due to their appealing ability to yield closed-form probability densities. Moreover, it is well-established that incorporating prior knowledge - such as symmetries - into deep neural networks can substantially improve training performance. In this context, recent advances have focused on developing symmetry-equivariant generative models, achieving remarkable results. Building upon these foundations, this paper introduces Symmetry-Enforcing Stochastic Modulation (SESaMo). Similar to equivariant normalizing flows, SESaMo enables the incorporation of inductive biases (e.g., symmetries) into normalizing flows through a novel technique called stochastic modulation. This approach enhances the flexibility of the generative model, allowing it to effectively learn a variety of exact and broken symmetries. Our numerical experiments benchmark SESaMo in different scenarios, including an 8-Gaussian mixture model and physically relevant field theories, such as the $\phi^4$ theory and the Hubbard model.
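The core idea of baking a symmetry into a model's density can be illustrated with a toy example. The sketch below is emphatically not the paper's stochastic-modulation algorithm; it only shows the simplest way to make a learned one-dimensional density exactly Z2-invariant (x -> -x) by averaging over the group orbit. All names and parameters here are illustrative assumptions.

```python
import math

# Toy sketch (NOT SESaMo itself): enforcing an exact Z2 symmetry
# (x -> -x) on a "learned" density by averaging over the group orbit.

def base_density(x, mu=1.5, sigma=0.5):
    """An asymmetric stand-in for a learned density: one off-center Gaussian."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def symmetrized_density(x):
    """Average over the Z2 orbit {x, -x}; the result is even by construction,
    and still integrates to one because each term does."""
    return 0.5 * (base_density(x) + base_density(-x))
```

The same orbit-averaging idea generalizes to any finite symmetry group, at the cost of one density evaluation per group element; methods like the one in the paper aim to achieve symmetry without paying that full cost.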
Related papers
- Principled model selection for stochastic dynamics [0.0]
PASTIS is a principled method combining likelihood-estimation statistics with extreme value theory to suppress superfluous parameters. It reliably identifies minimal models, even with low sampling rates or measurement error. It extends to partial differential equations and applies to ecological networks and reaction-diffusion dynamics.
arXiv Detail & Related papers (2025-01-17T18:23:16Z)
- Simulating the Hubbard Model with Equivariant Normalizing Flows [0.0]
Normalizing flows have been successfully applied to accurately learn Boltzmann distributions. We present a proof-of-concept demonstration that normalizing flows can be used to learn the Boltzmann distribution for the Hubbard model.
arXiv Detail & Related papers (2025-01-13T14:40:42Z)
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
We present a unifying perspective on recent results on ridge regression. We use the basic tools of random matrix theory and free probability, aimed at readers with backgrounds in physics and deep learning. Our results extend and provide a unifying perspective on earlier models of scaling laws.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Time-changed normalizing flows for accurate SDE modeling [5.402030962296633]
We propose a novel transformation of dynamic normalizing flows, based on time deformation of a Brownian motion.
This approach enables us to effectively model some SDEs that cannot be modeled otherwise.
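The time-deformation idea mentioned above can be sketched in a few lines: a time-changed Brownian motion B(tau(t)) has Gaussian increments whose variance is tau(t1) - tau(t0), so a single Brownian driver can mimic processes with time-varying volatility. This is only a minimal illustration under an assumed deformation tau, not the paper's flow architecture.

```python
import math
import random

def tau(t):
    """Hypothetical monotone time deformation: the internal clock runs
    quadratically fast, so volatility grows with t."""
    return t ** 2

def sample_path(n_steps=100, T=1.0, seed=0):
    """Simulate B(tau(t)) on [0, T]: each increment is Gaussian with
    variance tau(t1) - tau(t0), the deformed-clock elapsed time."""
    rng = random.Random(seed)
    path = [0.0]
    for i in range(n_steps):
        t0, t1 = i * T / n_steps, (i + 1) * T / n_steps
        var = tau(t1) - tau(t0)  # nonnegative since tau is increasing
        path.append(path[-1] + rng.gauss(0.0, math.sqrt(var)))
    return path

path = sample_path()
```

Because tau is monotone, the construction stays a valid martingale time change; choosing a learnable tau is what gives such models their extra flexibility.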
arXiv Detail & Related papers (2023-12-22T13:57:29Z)
- Uncertainty-aware Surrogate Models for Airfoil Flow Simulations with Denoising Diffusion Probabilistic Models [26.178192913986344]
We make a first attempt to use denoising diffusion probabilistic models (DDPMs) to train an uncertainty-aware surrogate model for turbulence simulations.
Our results show DDPMs can successfully capture the whole distribution of solutions and, as a consequence, accurately estimate the uncertainty of the simulations.
We also evaluate an emerging generative modeling variant, flow matching, in comparison to regular diffusion models.
arXiv Detail & Related papers (2023-12-08T19:04:17Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- GeoDiff: a Geometric Diffusion Model for Molecular Conformation Generation [102.85440102147267]
We propose a novel generative model named GeoDiff for molecular conformation prediction.
We show that GeoDiff is superior or comparable to existing state-of-the-art approaches.
arXiv Detail & Related papers (2022-03-06T09:47:01Z)
- Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z)
- Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities [1.7188280334580197]
Normalizing flows are exact-likelihood generative neural networks which transform samples from a simple prior distribution to samples of the probability distribution of interest.
Recent work showed that such generative models can be utilized in statistical mechanics to sample equilibrium states of many-body systems in physics and chemistry.
arXiv Detail & Related papers (2020-06-03T17:54:26Z)
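The exact-likelihood property that the flow-based papers above rely on comes from the change-of-variables formula: if x = f(z) for an invertible f and a prior p(z), then log p(x) = log p(z) - log |det df/dz|. A minimal sketch with a one-dimensional affine flow over a standard-normal prior (a toy case, not any specific published architecture; the parameters A and B are assumed for illustration):

```python
import math

A, B = 2.0, 1.0  # assumed flow parameters: x = A*z + B

def log_prior(z):
    """Log-density of the standard normal prior."""
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def flow_log_density(x):
    """Exact model log-density via change of variables:
    invert the flow, then subtract the log |Jacobian| = log |A|."""
    z = (x - B) / A
    return log_prior(z) - math.log(abs(A))
```

Here the result matches the closed-form density of N(B, A^2), which is exactly what makes flows trainable by maximum likelihood.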
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.