Neural Stochastic Differential Equations with Change Points: A
Generative Adversarial Approach
- URL: http://arxiv.org/abs/2312.13152v2
- Date: Mon, 22 Jan 2024 15:04:57 GMT
- Title: Neural Stochastic Differential Equations with Change Points: A
Generative Adversarial Approach
- Authors: Zhongchang Sun, Yousef El-Laham, Svitlana Vyetrenko
- Abstract summary: We propose a change point detection algorithm for time series modeled as neural SDEs.
The proposed method jointly learns the unknown change points and the parameters of distinct neural SDE models corresponding to each change point.
- Score: 5.408169633844698
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stochastic differential equations (SDEs) have been widely used to model real
world random phenomena. Existing works mainly focus on the case where the time
series is modeled by a single SDE, which might be restrictive for modeling time
series with distributional shift. In this work, we propose a change point
detection algorithm for time series modeled as neural SDEs. Given a time series
dataset, the proposed method jointly learns the unknown change points and the
parameters of distinct neural SDE models corresponding to each change point.
Specifically, the SDEs are learned under the framework of generative
adversarial networks (GANs) and the change points are detected based on the
output of the GAN discriminator in a forward pass. At each step of the proposed
algorithm, the change points and the SDE model parameters are updated in an
alternating fashion. Numerical results on both synthetic and real datasets are
provided to validate the performance of our algorithm in comparison to
classical change point detection benchmarks, standard GAN-based neural SDEs,
and other state-of-the-art deep generative models for time series data.
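To make the alternating scheme concrete, here is a deliberately minimal sketch in plain Python. It is not the paper's method: the neural SDE generators are replaced by per-segment sample means and the GAN discriminator's score by squared error. Only the alternating structure (refit the segment models, then rescan for the change point) mirrors the algorithm, and all function names are hypothetical.

```python
import random

def fit_segment(xs):
    # Stand-in for training a neural SDE on one segment:
    # here we simply fit the segment mean (a deliberate simplification).
    return sum(xs) / len(xs)

def segment_loss(xs, model):
    # Stand-in for the GAN discriminator's score on a segment.
    return sum((x - model) ** 2 for x in xs)

def alternating_change_point(series, n_iters=20):
    """Alternate between (1) fitting one model per segment and
    (2) moving the change point to minimize the total loss."""
    tau = len(series) // 2  # initial guess for the change point
    for _ in range(n_iters):
        m1 = fit_segment(series[:tau])
        m2 = fit_segment(series[tau:])
        # Scan all admissible change points under the current models.
        best = min(
            range(1, len(series) - 1),
            key=lambda t: segment_loss(series[:t], m1) + segment_loss(series[t:], m2),
        )
        if best == tau:  # converged: the split no longer moves
            break
        tau = best
    return tau

random.seed(0)
series = [random.gauss(0.0, 0.3) for _ in range(50)] + \
         [random.gauss(2.0, 0.3) for _ in range(50)]
print(alternating_change_point(series))  # should land near index 50
```

The same coordinate-descent structure carries over when the segment models are neural SDEs and the loss comes from a discriminator's forward pass, as in the paper.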
Related papers
- Variational Neural Stochastic Differential Equations with Change Points [4.692174333076032]
We explore modeling change points in time-series data using neural stochastic differential equations (neural SDEs).
We propose a novel model formulation and training procedure based on the variational autoencoder (VAE) framework for modeling time-series as a neural SDE.
We present an empirical evaluation that demonstrates the expressive power of our proposed model, showing that it can effectively model both classical parametric SDEs and some real datasets with distribution shifts.
arXiv Detail & Related papers (2024-11-01T14:46:17Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce Gaussian Mixture Solvers (GMS), a novel class of SDE-based solvers for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - Deep Learning-based surrogate models for parametrized PDEs: handling
geometric variability through graph neural networks [0.0]
This work explores the potential usage of graph neural networks (GNNs) for the simulation of time-dependent PDEs.
We propose a systematic strategy to build surrogate models based on a data-driven time-stepping scheme.
We show that GNNs can provide a valid alternative to traditional surrogate models in terms of computational efficiency and generalization to new scenarios.
arXiv Detail & Related papers (2023-08-03T08:14:28Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Latent Neural Stochastic Differential Equations for Change Point
Detection [0.6445605125467574]
We present a novel change point detection algorithm based on latent neural stochastic differential equations (SDEs).
Our method learns a non-linear deep learning transformation of the process into a latent space and estimates an SDE that describes its evolution over time.
The algorithm uses the likelihood ratio of the learned processes at different timestamps to find change points of the process.
arXiv Detail & Related papers (2022-08-22T13:53:13Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Lower Bounds (ELBOs) for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Learning stochastic dynamical systems with neural networks mimicking the
Euler-Maruyama scheme [14.436723124352817]
We propose a data-driven approach where the parameters of the SDE are represented by a neural network with a built-in SDE integration scheme.
The algorithm is applied to geometric Brownian motion and a version of the Lorenz-63 model.
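As an illustration of the scheme that entry describes, the following stand-alone sketch integrates an SDE with the Euler-Maruyama method. It uses the closed-form drift and diffusion of geometric Brownian motion where the paper's approach would substitute neural networks; the function names are hypothetical.

```python
import math
import random

def euler_maruyama(x0, drift, diffusion, dt, n_steps, rng):
    """Simulate dX = drift(X) dt + diffusion(X) dW with the
    Euler-Maruyama scheme. In the paper's setting, drift and
    diffusion would be neural networks rather than closed forms."""
    path = [x0]
    for _ in range(n_steps):
        x = path[-1]
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment over dt
        path.append(x + drift(x) * dt + diffusion(x) * dW)
    return path

# Geometric Brownian motion: dX = mu*X dt + sigma*X dW.
mu, sigma = 0.05, 0.2
rng = random.Random(0)
path = euler_maruyama(1.0, lambda x: mu * x, lambda x: sigma * x,
                      dt=0.01, n_steps=100, rng=rng)
print(path[-1])  # terminal value of one simulated trajectory
```

Training then amounts to fitting the drift and diffusion networks so that simulated paths match observed data, with the integration scheme baked into the model.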
arXiv Detail & Related papers (2021-05-18T11:41:34Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential
Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Score-Based Generative Modeling through Stochastic Differential
Equations [114.39209003111723]
We present a stochastic differential equation (SDE) that transforms a complex data distribution into a known prior distribution by slowly injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
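The forward/reverse construction can be demonstrated in one dimension where the score is available in closed form. The sketch below (all names hypothetical) takes a point mass at x0 as the "data distribution", uses the analytic score of the variance-preserving (OU) forward SDE, and integrates the deterministic probability-flow ODE backwards from a prior sample; in the paper the score would instead be estimated by a trained neural network, and sampling could equally use the reverse-time SDE.

```python
import math

# Forward SDE dX = -0.5*X dt + dW with data concentrated at x0:
# the time-t marginal is N(alpha(t)*x0, 1 - alpha(t)^2), so the
# score of the noised distribution has a closed form.
def alpha(t):
    return math.exp(-0.5 * t)

def score(x, t, x0):
    var = 1.0 - alpha(t) ** 2
    return -(x - alpha(t) * x0) / var

def probability_flow_sample(z, x0, T=5.0, eps=0.05, dt=1e-3):
    """Integrate the probability-flow ODE
    dx/dt = -0.5*x - 0.5*score(x, t) backwards from t=T
    (starting at a prior sample z) down to t=eps."""
    x, t = z, T
    while t > eps:
        x -= dt * (-0.5 * x - 0.5 * score(x, t, x0))
        t -= dt
    return x

x = probability_flow_sample(z=0.0, x0=3.0)
print(x)  # ends close to the data point 3.0
```

With a learned score network in place of the analytic score, the same backward integration turns prior noise into samples from the data distribution.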
arXiv Detail & Related papers (2020-11-26T19:39:10Z) - Identifying Latent Stochastic Differential Equations [29.103393300261587]
We present a method for learning latent stochastic differential equations (SDEs) from high-dimensional time series data.
The proposed method learns the mapping from ambient to latent space, and the underlying SDE coefficients, through a self-supervised learning approach.
We validate the method through several simulated video processing tasks, where the underlying SDE is known, and through real world datasets.
arXiv Detail & Related papers (2020-07-12T19:46:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.