Neural SDEs as Infinite-Dimensional GANs
- URL: http://arxiv.org/abs/2102.03657v1
- Date: Sat, 6 Feb 2021 19:59:15 GMT
- Title: Neural SDEs as Infinite-Dimensional GANs
- Authors: Patrick Kidger and James Foster and Xuechen Li and Harald Oberhauser and Terry Lyons
- Abstract summary: We show that the current classical approach to fitting SDEs may be approached as a special case of (Wasserstein) GANs.
We obtain Neural SDEs as (in modern machine learning parlance) continuous-time generative time series models.
- Score: 18.07683058213448
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stochastic differential equations (SDEs) are a staple of mathematical
modelling of temporal dynamics. However, a fundamental limitation has been that
such models have typically been relatively inflexible, which recent work
introducing Neural SDEs has sought to solve. Here, we show that the current
classical approach to fitting SDEs may be approached as a special case of
(Wasserstein) GANs, and in doing so the neural and classical regimes may be
brought together. The input noise is Brownian motion, the output samples are
time-evolving paths produced by a numerical solver, and by parameterising a
discriminator as a Neural Controlled Differential Equation (CDE), we obtain
Neural SDEs as (in modern machine learning parlance) continuous-time generative
time series models. Unlike previous work on this problem, this is a direct
extension of the classical approach without reference to either prespecified
statistics or density functions. Arbitrary drift and diffusions are admissible,
and since the Wasserstein loss has a unique global minimum, in the infinite-data
limit *any* SDE may be learnt.
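To make the generator side of the abstract concrete, here is a minimal NumPy sketch, not the authors' implementation (which builds on dedicated differentiable SDE solvers): tiny randomly initialised MLPs stand in for the trained drift and diffusion networks, and an Euler-Maruyama loop maps Brownian-motion increments to a time-evolving sample path. All names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny fixed random MLPs standing in for the learned drift f and diffusion g
# of the SDE  dX_t = f(t, X_t) dt + g(t, X_t) dW_t.
W1 = rng.normal(size=(8, 2)) * 0.5
b1 = np.zeros(8)
wf = rng.normal(size=8) * 0.5   # drift head
wg = rng.normal(size=8) * 0.5   # diffusion head

def mlp(head, t, x):
    h = np.tanh(W1 @ np.array([t, x]) + b1)
    return head @ h

def sample_path(x0=0.0, T=1.0, n_steps=100):
    """The generator: Euler-Maruyama maps Brownian increments to one path."""
    dt = T / n_steps
    xs = [x0]
    for i in range(n_steps):
        t, x = i * dt, xs[-1]
        dW = rng.normal(scale=np.sqrt(dt))  # Brownian-motion input noise
        xs.append(x + mlp(wf, t, x) * dt + mlp(wg, t, x) * dW)
    return np.array(xs)

path = sample_path()
```

In the paper's GAN framing, many such paths would be drawn and fed to a Neural CDE discriminator; here the path is simply returned.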
Related papers
- Differentially Private Gradient Flow based on the Sliced Wasserstein Distance [59.1056830438845]
We introduce a novel differentially private generative modeling approach based on a gradient flow in the space of probability measures.
Experiments show that our proposed model can generate higher-fidelity data at a low privacy budget.
arXiv Detail & Related papers (2023-12-13T15:47:30Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- GNRK: Graph Neural Runge-Kutta method for solving partial differential equations [0.0]
This study introduces a novel approach called Graph Neural Runge-Kutta (GNRK).
GNRK integrates graph neural network modules with a recurrent structure inspired by the classical solvers.
It demonstrates the capability to address general PDEs, irrespective of initial conditions or PDE coefficients.
arXiv Detail & Related papers (2023-10-01T08:52:46Z)
- Latent SDEs on Homogeneous Spaces [9.361372513858043]
We consider the problem of variational Bayesian inference in a latent variable model where a (possibly complex) observed geometric process is governed by the solution of a latent stochastic differential equation (SDE).
Experiments demonstrate that a latent SDE of the proposed type can be learned efficiently by means of an existing one-step Euler-Maruyama scheme.
arXiv Detail & Related papers (2023-06-28T14:18:52Z)
- Time and State Dependent Neural Delay Differential Equations [0.5249805590164901]
Delayed terms are encountered in the governing equations of a large class of problems ranging from physics and engineering to medicine and economics.
We introduce Neural State-Dependent DDE, a framework that can model multiple and state- and time-dependent delays.
We show that our method is competitive and outperforms other continuous-class models on a wide variety of delayed dynamical systems.
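For intuition about the delayed terms these models target, a forward-Euler integration of the simple constant-delay DDE x'(t) = -x(t - tau), with constant history x(t) = 1 for t <= 0, can be sketched as follows. This is a generic numerical illustration, not the Neural State-Dependent DDE framework itself:

```python
import numpy as np

def integrate_dde(tau=1.0, T=5.0, dt=0.01):
    """Forward-Euler integration of the scalar DDE  x'(t) = -x(t - tau),
    with constant history x(t) = 1 for t <= 0."""
    n = round(T / dt)
    delay_steps = round(tau / dt)
    xs = np.empty(n + 1)
    xs[0] = 1.0
    for i in range(n):
        # Delayed state: fall back to the history function before t = 0.
        x_delayed = 1.0 if i < delay_steps else xs[i - delay_steps]
        xs[i + 1] = xs[i] + dt * (-x_delayed)
    return xs

xs = integrate_dde()
```

On the initial interval 0 <= t <= tau the exact solution is x(t) = 1 - t, which the Euler scheme reproduces; beyond that the delayed feedback produces the damped oscillations characteristic of this equation. State-dependent delays would make `delay_steps` a function of `xs[i]`, which is exactly what makes such systems hard and what the paper's framework learns.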
arXiv Detail & Related papers (2023-06-26T09:35:56Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
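The Laplace-domain idea above rests on a standard fact: decaying and oscillating time-domain behaviour corresponds to a sum of complex exponentials, one per pole of the Laplace transform. A small NumPy check makes this concrete (illustrative only, unrelated to the Neural Laplace code):

```python
import numpy as np

t = np.linspace(0.0, 3.0, 200)

# A damped oscillation in the time domain ...
signal = np.exp(-t) * np.cos(5.0 * t)

# ... is exactly a sum of two complex exponentials, i.e. two Laplace-domain
# poles at s = -1 +/- 5j, each with coefficient 1/2 (Euler's formula).
poles = np.array([-1.0 + 5.0j, -1.0 - 5.0j])
coeffs = np.array([0.5, 0.5])
reconstruction = (coeffs[:, None] * np.exp(poles[:, None] * t)).sum(axis=0).real

# The two representations agree up to floating-point rounding.
max_error = np.max(np.abs(signal - reconstruction))
```

Neural Laplace learns such pole/coefficient representations rather than stepping through time, which is why discontinuities and long history-dependence pose no special difficulty.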
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on a variety of SPDEs, including the dynamic Phi^4_1 model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- NeuralPDE: Modelling Dynamical Systems from Data [0.44259821861543996]
We propose NeuralPDE, a model which combines convolutional neural networks (CNNs) with differentiable ODE solvers to model dynamical systems.
We show that the Method of Lines used in standard PDE solvers can be represented using convolutions which makes CNNs the natural choice to parametrize arbitrary PDE dynamics.
Our model can be applied to any data without requiring any prior knowledge about the governing PDE.
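The claim that the Method of Lines can be represented with convolutions is easy to verify directly: a second-derivative finite-difference stencil is exactly a 1-D convolution with a fixed kernel. A short NumPy sketch of the idea (an illustration, not the NeuralPDE code):

```python
import numpy as np

# Discretise u(x) on a grid; the Method of Lines reduces the PDE term
# d^2u/dx^2 to a finite-difference stencil, which is a 1-D convolution
# with the fixed kernel [1, -2, 1] / dx^2.
dx = 0.01
x = np.arange(0.0, 1.0, dx)
u = np.sin(2 * np.pi * x)

kernel = np.array([1.0, -2.0, 1.0]) / dx**2
lap_conv = np.convolve(u, kernel, mode="valid")   # convolution view
lap_fd = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # stencil view
```

The two interior-point arrays coincide; replacing the fixed kernel with a learnable one (a CNN layer) is what lets NeuralPDE parametrize arbitrary PDE dynamics.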
arXiv Detail & Related papers (2021-11-15T10:59:52Z)
- Neural Stochastic Partial Differential Equations [1.2183405753834562]
We introduce the Neural SPDE model providing an extension to two important classes of physics-inspired neural architectures.
On the one hand, it extends all the popular neural -- ordinary, controlled, rough -- differential equation models in that it is capable of processing incoming information.
On the other hand, it extends Neural Operators -- recent generalizations of neural networks modelling mappings between functional spaces -- in that it can be used to learn complex SPDE solution operators.
arXiv Detail & Related papers (2021-10-19T20:35:37Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.