Micro-Macro Consistency in Multiscale Modeling: Score-Based Model
Assisted Sampling of Fast/Slow Dynamical Systems
- URL: http://arxiv.org/abs/2312.05715v2
- Date: Wed, 27 Dec 2023 17:18:38 GMT
- Title: Micro-Macro Consistency in Multiscale Modeling: Score-Based Model
Assisted Sampling of Fast/Slow Dynamical Systems
- Authors: Ellis R. Crabtree, Juan M. Bello-Rivas, Ioannis G. Kevrekidis
- Abstract summary: In the study of physics-based multi-time-scale dynamical systems, techniques have been developed for enhancing sampling.
In the field of Machine Learning, a generic goal of generative models is to sample from a target density, after training on empirical samples from this density.
In this work, we show that SGMs can be used in such a coupling framework to improve sampling in multiscale dynamical systems.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A valuable step in the modeling of multiscale dynamical systems in fields
such as computational chemistry, biology, materials science and more, is the
representative sampling of the phase space over long timescales of interest;
this task is not, however, without challenges. For example, the long term
behavior of a system with many degrees of freedom often cannot be efficiently
computationally explored by direct dynamical simulation; such systems can often
become trapped in local free energy minima. In the study of physics-based
multi-time-scale dynamical systems, techniques have been developed for
enhancing sampling in order to accelerate exploration beyond free energy
barriers. On the other hand, in the field of Machine Learning, a generic goal
of generative models is to sample from a target density, after training on
empirical samples from this density. Score based generative models (SGMs) have
demonstrated state-of-the-art capabilities in generating plausible data from
target training distributions. Conditional implementations of such generative
models have been shown to exhibit significant parallels with long-established
-- and physics based -- solutions to enhanced sampling. These physics-based
methods can then be enhanced through coupling with the ML generative models,
complementing the strengths and mitigating the weaknesses of each technique. In
this work, we show that SGMs can be used in such a coupling framework to
improve sampling in multiscale dynamical systems.
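To make the coupling idea concrete, below is a minimal, self-contained sketch (in PyTorch) of a conditional score-based generative model in the spirit described above: a small network is trained by denoising score matching to approximate the score of the fast-variable distribution conditioned on a slow (macro) variable, and new fast configurations consistent with a prescribed slow value are then drawn with annealed Langevin dynamics. This is an illustrative assumption of how such a component could look, not the authors' implementation; the class names, network architecture, noise schedule, and toy data are all hypothetical.

```python
# Hedged sketch: conditional score-based generative model for sampling fast
# variables x given a slow variable y. Illustrative only, not the paper's code.
import torch
import torch.nn as nn


class ConditionalScoreNet(nn.Module):
    """s_theta(x, y, sigma): approximates the score of the noised conditional density p_sigma(x | y)."""

    def __init__(self, x_dim=2, y_dim=1, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, x_dim),
        )

    def forward(self, x, y, sigma):
        # condition on the slow variable y and the noise level sigma
        return self.net(torch.cat([x, y, sigma], dim=-1)) / sigma


def dsm_loss(model, x, y, sigmas):
    """Denoising score matching averaged over a ladder of noise levels."""
    idx = torch.randint(len(sigmas), (x.shape[0], 1))
    sigma = sigmas[idx]
    noise = torch.randn_like(x)
    x_noisy = x + sigma * noise
    target = -noise / sigma                      # exact score of the Gaussian perturbation kernel
    pred = model(x_noisy, y, sigma)
    return ((sigma ** 2) * (pred - target) ** 2).sum(dim=-1).mean()


@torch.no_grad()
def sample_conditional(model, y, sigmas, n_steps=50, eps=2e-5):
    """Annealed Langevin dynamics: draw fast variables x consistent with slow values y."""
    x = torch.randn(y.shape[0], 2)
    for sigma in sigmas:                         # anneal from coarse to fine noise
        step = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(n_steps):
            score = model(x, y, sigma * torch.ones(y.shape[0], 1))
            x = x + step * score + torch.sqrt(2 * step) * torch.randn_like(x)
    return x


if __name__ == "__main__":
    # Toy stand-in for data harvested from short bursts of micro-scale simulation:
    # fast variables x lie near a circle parameterized by a slow variable y.
    y = torch.rand(512, 1)
    x = torch.cat([torch.cos(6.283 * y), torch.sin(6.283 * y)], dim=-1) + 0.05 * torch.randn(512, 2)

    model = ConditionalScoreNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    sigmas = torch.exp(torch.linspace(0.0, -4.0, 10))   # geometric ladder, ~1.0 down to ~0.018
    for _ in range(2000):
        opt.zero_grad()
        dsm_loss(model, x, y, sigmas).backward()
        opt.step()

    # Generate micro-states conditioned on a target value of the slow variable;
    # in a coupling framework these could re-initialize the physics-based simulator.
    new_x = sample_conditional(model, torch.full((8, 1), 0.25), sigmas)
```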
Related papers
- Synthetic location trajectory generation using categorical diffusion
models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved into one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM) alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2-5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Action Matching: Learning Stochastic Dynamics from Samples [10.46643972142224]
Action Matching is a method for learning a rich family of dynamics using only independent samples from its time evolution.
We derive a tractable training objective, which does not rely on explicit assumptions about the underlying dynamics.
Inspired by connections with optimal transport, we derive extensions of Action Matching to learn differential equations and dynamics involving creation and destruction of probability mass.
arXiv Detail & Related papers (2022-10-13T01:49:48Z)
- GANs and Closures: Micro-Macro Consistency in Multiscale Modeling [0.0]
We present an approach that couples physics-based simulations and biasing methods for sampling conditional distributions with Machine Learning-based conditional generative adversarial networks.
We show that this framework can improve the sampling of multiscale SDE dynamical systems, and that it even shows promise for systems of increasing complexity.
arXiv Detail & Related papers (2022-08-23T03:45:39Z)
- Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z)
- Learning continuous models for continuous physics [94.42705784823997]
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies to validate models for science and engineering applications.
arXiv Detail & Related papers (2022-02-17T07:56:46Z)
- Using scientific machine learning for experimental bifurcation analysis of dynamic systems [2.204918347869259]
This study focuses on training universal differential equation (UDE) models for physical nonlinear dynamical systems with limit cycles.
We consider examples where training data is generated by numerical simulations, whereas we also employ the proposed modelling concept to physical experiments.
We use both neural networks and Gaussian processes as universal approximators alongside the mechanistic models to give a critical assessment of the accuracy and robustness of the UDE modelling approach.
arXiv Detail & Related papers (2021-10-22T15:43:03Z)
- Reservoir Computing with Diverse Timescales for Prediction of Multiscale Dynamics [5.172455794487599]
We propose a reservoir computing model with diverse timescales by using a recurrent network of heterogeneous leaky integrator neurons.
In prediction tasks with fast-slow chaotic dynamical systems, we demonstrate that the proposed model has a higher potential than the existing standard model.
Our analysis reveals that the timescales required for producing each component of target dynamics are appropriately and flexibly selected from the reservoir dynamics by model training.
arXiv Detail & Related papers (2021-08-21T06:52:21Z)
- Abstraction of Markov Population Dynamics via Generative Adversarial Nets [2.1485350418225244]
A strategy to reduce computational load is to abstract the population model, replacing it with a simpler model, faster to simulate.
Here we pursue this idea, building on previous works and constructing a generator capable of producing trajectories in continuous space and discrete time.
This generator is learned automatically from simulations of the original model in a Generative Adversarial setting.
arXiv Detail & Related papers (2021-06-24T12:57:49Z)
- Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep Bayesian active learning framework for stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP-based models.
The results demonstrate that STNP outperforms the baselines in the learning setting and that LIG achieves state-of-the-art performance for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.