Learning effective stochastic differential equations from microscopic
simulations: combining stochastic numerics and deep learning
- URL: http://arxiv.org/abs/2106.09004v1
- Date: Thu, 10 Jun 2021 13:00:18 GMT
- Title: Learning effective stochastic differential equations from microscopic
simulations: combining stochastic numerics and deep learning
- Authors: Felix Dietrich and Alexei Makeev and George Kevrekidis and Nikolaos
Evangelou and Tom Bertalan and Sebastian Reich and Ioannis G. Kevrekidis
- Abstract summary: We approximate drift and diffusivity functions in effective SDE through neural networks.
Our approach does not require long trajectories, works on scattered snapshot data, and is designed to naturally handle different time steps per snapshot.
- Score: 0.46180371154032895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We identify effective stochastic differential equations (SDE) for coarse
observables of fine-grained particle- or agent-based simulations; these SDE
then provide coarse surrogate models of the fine scale dynamics. We approximate
the drift and diffusivity functions in these effective SDE through neural
networks, which can be thought of as effective stochastic ResNets. The loss
function is inspired by, and embodies, the structure of established stochastic
numerical integrators (here, Euler-Maruyama and Milstein); our approximations
can thus benefit from error analysis of these underlying numerical schemes.
They also lend themselves naturally to "physics-informed" gray-box
identification when approximate coarse models, such as mean field equations,
are available. Our approach does not require long trajectories, works on
scattered snapshot data, and is designed to naturally handle different time
steps per snapshot. We consider both the case where the coarse collective
observables are known in advance, as well as the case where they must be found
in a data-driven manner.
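To make the loss construction concrete, below is a minimal sketch (in PyTorch) of an Euler-Maruyama-structured loss of the kind the abstract describes, assuming a one-dimensional coarse observable with diagonal diffusivity; the names drift_net, diff_net, and em_nll are hypothetical and not taken from the authors' code. One Euler-Maruyama step implies the Gaussian transition x_{t+h} | x_t ~ N(x_t + f(x_t) h, g(x_t)^2 h), whose negative log-likelihood over snapshot pairs serves as the training loss; the per-snapshot step h enters explicitly, which is how scattered snapshots with different time steps can be handled.

```python
# A minimal sketch, assuming a 1D observable and diagonal diffusivity;
# drift_net, diff_net, and em_nll are hypothetical names, not the
# authors' implementation. One Euler-Maruyama step gives
#   x_{t+h} | x_t ~ N(x_t + f(x_t) * h, g(x_t)^2 * h),
# and we minimize the corresponding Gaussian negative log-likelihood.
import torch
import torch.nn as nn

def em_nll(drift, diff, x0, x1, h):
    """Negative log-likelihood of one Euler-Maruyama step."""
    mean = x0 + drift * h            # EM predictor mean
    var = diff.pow(2) * h + 1e-8     # EM predictor variance (jittered)
    return 0.5 * (torch.log(2 * torch.pi * var)
                  + (x1 - mean).pow(2) / var).mean()

drift_net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
diff_net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1),
                         nn.Softplus())  # Softplus keeps diffusivity > 0

params = list(drift_net.parameters()) + list(diff_net.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

# Toy snapshot pairs from an Ornstein-Uhlenbeck process, with a different
# time step h per snapshot (no long trajectories needed).
x0 = torch.randn(512, 1)
h = 0.001 + 0.01 * torch.rand(512, 1)
x1 = x0 - x0 * h + 0.5 * torch.sqrt(h) * torch.randn_like(x0)

for _ in range(200):
    opt.zero_grad()
    loss = em_nll(drift_net(x0), diff_net(x0), x0, x1, h)
    loss.backward()
    opt.step()
```

A Milstein-structured loss would modify this transition density using the derivative of the diffusivity, and the gray-box variant mentioned above would amount to writing the drift as a known mean-field term plus a neural correction.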
Related papers
- A Training-Free Conditional Diffusion Model for Learning Stochastic Dynamical Systems [10.820654486318336]
This study introduces a training-free conditional diffusion model for learning unknown stochastic differential equations (SDEs) from data.
The proposed approach addresses key challenges in computational efficiency and accuracy for modeling SDEs.
The learned models exhibit significant improvements in predicting both short-term and long-term behaviors of unknown systems.
arXiv Detail & Related papers (2024-10-04T03:07:36Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Noise in the reverse process improves the approximation capabilities of diffusion models [27.65800389807353]
In Score-based Generative Modeling (SGMs), the state-of-the-art in generative modeling, stochastic reverse processes are known to perform better than their deterministic counterparts.
This paper delves into the heart of this phenomenon, comparing neural ordinary differential equations (ODEs) and neural stochastic differential equations (SDEs) as reverse processes.
We analyze the ability of neural SDEs to approximate trajectories of the Fokker-Planck equation, revealing the advantages of stochasticity.
arXiv Detail & Related papers (2023-12-13T02:39:10Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called Gaussian Mixture Solvers (GMS) for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Tipping Points of Evolving Epidemiological Networks: Machine Learning-Assisted, Data-Driven Effective Modeling [0.0]
We study the tipping point collective dynamics of an adaptive susceptible-infected-susceptible (SIS) epidemiological network in a data-driven, machine learning-assisted manner.
We identify an effective stochastic differential equation (eSDE) in terms of physically meaningful coarse mean-field variables.
We study the statistics of rare events both through repeated brute force simulations and by using established mathematical/computational tools.
arXiv Detail & Related papers (2023-11-01T19:33:03Z)
- HyperSINDy: Deep Generative Modeling of Nonlinear Stochastic Governing Equations [5.279268784803583]
We introduce HyperSINDy, a framework for modeling dynamics via a deep generative model of sparse governing equations from data.
Once trained, HyperSINDy generates dynamics via a differential equation whose coefficients are driven by white noise.
In experiments, HyperSINDy recovers ground truth governing equations, with learned stochasticity scaling to match that of the data.
arXiv Detail & Related papers (2023-10-07T14:41:59Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
PINNs suffer from training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Learning stochastic dynamical systems with neural networks mimicking the Euler-Maruyama scheme [14.436723124352817]
We propose a data-driven approach where the parameters of the SDE are represented by a neural network with a built-in SDE integration scheme.
The algorithm is applied to geometric Brownian motion and a stochastic version of the Lorenz-63 model.
arXiv Detail & Related papers (2021-05-18T11:41:34Z)
- Stochasticity in Neural ODEs: An Empirical Study [68.8204255655161]
Regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
We show that data augmentation during training improves the performance of both deterministic and stochastic versions of the same model.
However, the improvements obtained by data augmentation completely eliminate the empirical gains of stochastic regularization, making the difference in performance between neural ODEs and neural SDEs negligible.
arXiv Detail & Related papers (2020-02-22T22:12:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.