Modeling Temporal Data as Continuous Functions with Stochastic Process
Diffusion
- URL: http://arxiv.org/abs/2211.02590v2
- Date: Fri, 19 May 2023 11:34:58 GMT
- Title: Modeling Temporal Data as Continuous Functions with Stochastic Process
Diffusion
- Authors: Marin Biloš, Kashif Rasul, Anderson Schneider, Yuriy Nevmyvaka,
Stephan Günnemann
- Abstract summary: Temporal data can be viewed as discretized measurements of the underlying function.
To build a generative model for such data we have to model the process that governs it.
We propose a solution by defining the denoising diffusion model in the function space.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal data such as time series can be viewed as discretized measurements
of the underlying function. To build a generative model for such data we have
to model the stochastic process that governs it. We propose a solution by
defining the denoising diffusion model in the function space which also allows
us to naturally handle irregularly-sampled observations. The forward process
gradually adds noise to functions, preserving their continuity, while the
learned reverse process removes the noise and returns functions as new samples.
To this end, we define suitable noise sources and introduce novel denoising and
score-matching models. We show how our method can be used for multivariate
probabilistic forecasting and imputation, and how our model can be interpreted
as a neural process.
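The forward process the abstract describes (gradually adding noise to functions while preserving their continuity) can be illustrated with a short sketch. Below is a minimal NumPy illustration, not the authors' implementation: it perturbs an irregularly sampled function with Gaussian-process noise drawn from an RBF kernel, so the noised values remain a continuous function of time. The kernel choice, length scale, and noise-schedule value `alpha_bar` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(t, length_scale=0.1):
    # Covariance of a Gaussian process over time points t;
    # correlated noise keeps the perturbed function continuous.
    d = t[:, None] - t[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def forward_diffuse(x0, t, alpha_bar, rng):
    # One forward-diffusion step at noise level alpha_bar:
    # x_t = sqrt(ab) * x_0 + sqrt(1 - ab) * eps, with eps ~ GP(0, K)
    # instead of the i.i.d. white noise of a standard DDPM.
    K = rbf_kernel(t) + 1e-6 * np.eye(len(t))  # jitter for numerical stability
    eps = rng.multivariate_normal(np.zeros(len(t)), K)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, size=50))   # irregularly sampled time points
x0 = np.sin(2 * np.pi * t)                    # "clean" function values
xt = forward_diffuse(x0, t, alpha_bar=0.5, rng=rng)
```

Because the noise is correlated across time points, nearby observations receive similar perturbations; with white noise instead, the noised sequence would no longer resemble samples of a continuous function.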
Related papers
- One Noise to Rule Them All: Learning a Unified Model of Spatially-Varying Noise Patterns [33.293193191683145]
We present a single generative model which can learn to generate multiple types of noise as well as blend between them.
We also present an application of our model to improving inverse procedural material design.
arXiv Detail & Related papers (2024-04-25T02:23:11Z)
- Logistic-beta processes for dependent random probabilities with beta marginals [58.91121576998588]
We propose a novel process called the logistic-beta process, whose logistic transformation yields a process with common beta marginals.
It can model dependence on both discrete and continuous domains, such as space or time, and has a flexible dependence structure through correlation kernels.
We illustrate the benefits through nonparametric binary regression and conditional density estimation examples, both in simulation studies and in a pregnancy outcome application.
arXiv Detail & Related papers (2024-02-10T21:41:32Z)
- ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, "Denoising Diffusion Probabilistic Models" (DDPMs), for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive; it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Learning Summary Statistics for Bayesian Inference with Autoencoders [58.720142291102135]
We use the inner dimension of deep neural network based Autoencoders as summary statistics.
To create an incentive for the encoder to encode all the parameter-related information but not the noise, we give the decoder access to explicit or implicit information that has been used to generate the training data.
arXiv Detail & Related papers (2022-01-28T12:00:31Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting [4.1573460459258245]
We use diffusion probabilistic models, a class of latent variable models closely connected to score matching and energy-based methods.
Our model learns gradients by optimizing a variational bound on the data likelihood and at inference time converts white noise into a sample of the distribution of interest.
arXiv Detail & Related papers (2021-01-28T15:46:10Z)
- Automatic Differentiation to Simultaneously Identify Nonlinear Dynamics and Extract Noise Probability Distributions from Data [4.996878640124385]
SINDy is a framework for the discovery of parsimonious dynamic models and equations from time-series data.
We develop a variant of the SINDy algorithm that integrates automatic differentiation and recent time-stepping constraints following Rudy et al.
We show the method can identify a diversity of probability distributions including Gaussian, uniform, Gamma, and Rayleigh.
arXiv Detail & Related papers (2020-09-12T23:52:25Z)
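As a point of reference for the SINDy framework mentioned above, here is a minimal sparse-regression sketch in NumPy. It uses plain sequential thresholding, not the paper's automatic-differentiation variant; the candidate library, threshold value, and toy system dx/dt = -2x are illustrative assumptions.

```python
import numpy as np

# Toy trajectory of the system dx/dt = -2x, sampled on a uniform grid.
t = np.linspace(0.0, 2.0, 200)
x = np.exp(-2.0 * t)
dx = np.gradient(x, t)  # numerical derivative estimate

# Library of candidate terms: [1, x, x^2].
Theta = np.column_stack([np.ones_like(x), x, x**2])

# Initial least-squares fit, then sequential thresholding:
# zero out small coefficients and refit on the remaining terms.
xi, *_ = np.linalg.lstsq(Theta, dx, rcond=None)
for _ in range(10):
    small = np.abs(xi) < 0.1
    xi[small] = 0.0
    big = ~small
    xi[big] = np.linalg.lstsq(Theta[:, big], dx, rcond=None)[0]

# xi should be close to [0, -2, 0], recovering dx/dt = -2x.
```

The thresholding loop is what makes the discovered model parsimonious: only the single term that actually drives the dynamics survives, which is the core idea SINDy builds on.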
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.