GANs and Closures: Micro-Macro Consistency in Multiscale Modeling
- URL: http://arxiv.org/abs/2208.10715v4
- Date: Sun, 10 Dec 2023 00:31:49 GMT
- Title: GANs and Closures: Micro-Macro Consistency in Multiscale Modeling
- Authors: Ellis R. Crabtree, Juan M. Bello-Rivas, Andrew L. Ferguson, Ioannis G.
Kevrekidis
- Abstract summary: We present an approach that couples physics-based simulations and biasing methods for sampling conditional distributions with Machine Learning-based conditional generative adversarial networks.
We show that this framework can improve sampling in multiscale SDE dynamical systems, and that it shows promise even as system complexity increases.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sampling the phase space of molecular systems -- and, more generally, of
complex systems effectively modeled by stochastic differential equations -- is
a crucial modeling step in many fields, from protein folding to materials
discovery. These problems are often multiscale in nature: they can be described
in terms of low-dimensional effective free energy surfaces parametrized by a
small number of "slow" reaction coordinates; the remaining "fast" degrees of
freedom populate an equilibrium measure on the reaction coordinate values.
Sampling procedures for such problems are used to estimate effective free
energy differences as well as ensemble averages with respect to the conditional
equilibrium distributions; these latter averages lead to closures for effective
reduced dynamic models. Over the years, enhanced sampling techniques coupled
with molecular simulation have been developed. An intriguing analogy arises
with the field of Machine Learning (ML), where Generative Adversarial Networks
can produce high dimensional samples from low dimensional probability
distributions. This sample generation returns plausible high dimensional space
realizations of a model state, from information about its low-dimensional
representation. In this work, we present an approach that couples physics-based
simulations and biasing methods for sampling conditional distributions with
ML-based conditional generative adversarial networks for the same task. The
"coarse descriptors" on which we condition the fine scale realizations can
either be known a priori, or learned through nonlinear dimensionality
reduction. We suggest that this may bring out the best features of both
approaches: we demonstrate that a framework coupling cGANs with
physics-based enhanced sampling techniques can improve sampling in multiscale
SDE dynamical systems, and even shows promise for systems of increasing
complexity.
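As a concrete illustration of the cGAN half of this coupling, the following is a minimal PyTorch sketch of a conditional generator and discriminator in which fine-scale configurations x are generated conditioned on a coarse descriptor s. All dimensions, architectures, and hyperparameters here are illustrative assumptions, not the authors' setup.

```python
# Minimal cGAN sketch: generate fine-scale states x conditioned on a
# coarse descriptor s. Dimensions and architectures are assumptions.
import torch
import torch.nn as nn

Z_DIM, S_DIM, X_DIM = 8, 1, 10  # noise, descriptor, fine-scale dimensions

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(Z_DIM + S_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, X_DIM),
        )

    def forward(self, z, s):
        # Conditioning by concatenating noise with the coarse descriptor.
        return self.net(torch.cat([z, s], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(X_DIM + S_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x, s):
        return self.net(torch.cat([x, s], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(x_real, s):
    """One adversarial update on fine-scale samples x_real at descriptor s."""
    n = x_real.size(0)
    ones, zeros = torch.ones(n, 1), torch.zeros(n, 1)

    # Discriminator: distinguish real from generated, both conditioned on s.
    x_fake = G(torch.randn(n, Z_DIM), s)
    loss_d = bce(D(x_real, s), ones) + bce(D(x_fake.detach(), s), zeros)
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator: produce samples the discriminator accepts at the same s.
    loss_g = bce(D(G(torch.randn(n, Z_DIM), s), s), ones)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

Once such a model is trained on (x, s) pairs harvested from biased simulations, fresh fine-scale realizations at a new descriptor value s_new come from G(torch.randn(n, Z_DIM), s_new), which is the conditional-sampling step the abstract describes.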
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved into one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM) alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in its matching objective (a toy sketch of this alternation follows this entry).
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
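As a reading aid, here is a heavily simplified toy sketch of the sample-then-match alternation summarized above. It is an illustrative paraphrase under strong assumptions (a 1-D double-well energy, unadjusted Langevin steps, exact energy gradients), not the iDEM algorithm itself.

```python
# Toy alternation: (I) sample with the current learned score,
# (II) regress the model score onto the target score -grad E.
import torch
import torch.nn as nn

def target_energy(x):
    # Hypothetical Boltzmann energy: a 1-D double well.
    return ((x ** 2 - 1.0) ** 2).sum(dim=1)

score = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(score.parameters(), lr=1e-3)

for outer in range(200):
    # (I) Reach high-model-density regions via Langevin steps on the score.
    x = torch.randn(256, 1)
    with torch.no_grad():
        for _ in range(20):
            x = x + 0.01 * score(x) + (2 * 0.01) ** 0.5 * torch.randn_like(x)

    # (II) Matching objective at the sampled points.
    x.requires_grad_(True)
    grad_e = torch.autograd.grad(target_energy(x).sum(), x)[0]
    loss = ((score(x.detach()) + grad_e.detach()) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```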
- Momentum Particle Maximum Likelihood [2.4561590439700076]
We propose an analogous dynamical-systems-inspired approach to minimizing the free energy functional.
By discretizing the system, we obtain a practical algorithm for maximum likelihood estimation in latent variable models.
The algorithm outperforms existing particle methods in numerical experiments and compares favourably with other MLE algorithms.
arXiv Detail & Related papers (2023-12-12T14:53:18Z)
- Micro-Macro Consistency in Multiscale Modeling: Score-Based Model Assisted Sampling of Fast/Slow Dynamical Systems [0.0]
In the study of physics-based multi-time-scale dynamical systems, techniques have been developed for enhancing sampling.
In the field of Machine Learning, a generic goal of generative models is to sample from a target density, after training on empirical samples from this density.
In this work, we show that score-based generative models (SGMs) can be used in such a coupling framework to improve sampling in multiscale dynamical systems (a toy score-matching sketch follows this entry).
arXiv Detail & Related papers (2023-12-10T00:46:37Z)
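For readers unfamiliar with SGMs, the following is a minimal single-noise-level denoising score matching sketch; the noise scale, architecture, and toy data are assumptions and do not reflect the paper's setup.

```python
# Learn the score of a density from empirical samples, then draw new
# samples with unadjusted Langevin dynamics on the learned score.
import torch
import torch.nn as nn

sigma = 0.1  # perturbation scale (assumption)
score = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(score.parameters(), lr=1e-3)

# Toy bimodal "empirical" data standing in for simulation output.
data = torch.cat([torch.randn(500, 1) - 2.0, torch.randn(500, 1) + 2.0])

for step in range(2000):
    noise = sigma * torch.randn_like(data)
    # DSM regression target: the perturbation-kernel score -noise / sigma^2.
    loss = ((score(data + noise) + noise / sigma ** 2) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

eps = 1e-3
x = torch.randn(1000, 1)
with torch.no_grad():
    for _ in range(1000):
        x = x + eps * score(x) + (2 * eps) ** 0.5 * torch.randn_like(x)
```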
- Weighted Riesz Particles [0.0]
We consider the target distribution as a mapping in which the infinite-dimensional parameter space consists of a number of deterministic submanifolds.
We study the properties of the resulting points, called Riesz particles, and embed them into sequential MCMC.
We find higher acceptance rates with fewer evaluations.
arXiv Detail & Related papers (2023-12-01T14:36:46Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce Gaussian Mixture Solvers (GMS), a novel class of SDE-based solvers for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that, under these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework that combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data (a toy sketch of this surrogate-plus-autodiff pattern follows this entry).
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
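As a reading aid for the surrogate-plus-autodiff pattern above, here is a toy sketch under stated assumptions: a hypothetical scalar-parameter "simulator", a small surrogate network, and synthetic "experimental" data. It is not the paper's framework.

```python
# Train a surrogate once to mimic a simulator, then recover an unknown
# parameter by gradient descent through the frozen surrogate.
import torch
import torch.nn as nn

def simulator(theta):
    # Hypothetical differentiable stand-in for a model-Hamiltonian
    # simulation: maps parameters (n, 1) to 16-point "spectra" (n, 16).
    grid = torch.linspace(0.0, 1.0, 16)
    return torch.sin(6.0 * theta * grid)

# 1) Train the surrogate on simulated (parameter, spectrum) pairs.
surrogate = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 16))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for step in range(3000):
    theta = torch.rand(128, 1)
    loss = ((surrogate(theta) - simulator(theta)) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# 2) Freeze the surrogate; recover the parameter behind noisy observations.
for p in surrogate.parameters():
    p.requires_grad_(False)
observed = simulator(torch.tensor([[0.37]]))[0] + 0.01 * torch.randn(16)
theta_hat = torch.tensor([[0.5]], requires_grad=True)
opt_theta = torch.optim.Adam([theta_hat], lr=1e-2)
for step in range(500):
    fit_loss = ((surrogate(theta_hat)[0] - observed) ** 2).mean()
    opt_theta.zero_grad(); fit_loss.backward(); opt_theta.step()
```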