Variational Inference for SDEs Driven by Fractional Noise
- URL: http://arxiv.org/abs/2310.12975v1
- Date: Thu, 19 Oct 2023 17:59:21 GMT
- Title: Variational Inference for SDEs Driven by Fractional Noise
- Authors: Rembert Daems and Manfred Opper and Guillaume Crevecoeur and Tolga Birdal
- Abstract summary: We present a novel variational framework for performing inference in (neural) stochastic differential equations (SDEs) driven by Markov-approximate fractional Brownian motion (fBM).
We propose the use of neural networks to learn the drift, diffusion and control terms within our variational posterior, leading to the variational training of neural-SDEs.
- Score: 16.434973057669676
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a novel variational framework for performing inference in (neural)
stochastic differential equations (SDEs) driven by Markov-approximate
fractional Brownian motion (fBM). SDEs offer a versatile tool for modeling
real-world continuous-time dynamic systems with inherent noise and randomness.
Combining SDEs with the powerful inference capabilities of variational methods
enables the learning of representative function distributions through
stochastic gradient descent. However, conventional SDEs typically assume the
underlying noise to follow a Brownian motion (BM), which hinders their ability
to capture long-term dependencies. In contrast, fractional Brownian motion
(fBM) extends BM to encompass non-Markovian dynamics, but existing methods for
inferring fBM parameters are either computationally demanding or statistically
inefficient. In this paper, building upon the Markov approximation of fBM, we
derive the evidence lower bound essential for efficient variational inference
of posterior path measures, drawing from the well-established field of
stochastic analysis. Additionally, we provide a closed-form expression to
determine optimal approximation coefficients. Furthermore, we propose the use
of neural networks to learn the drift, diffusion and control terms within our
variational posterior, leading to the variational training of neural-SDEs. In
this framework, we also optimize the Hurst index, governing the nature of our
fractional noise. Beyond validation on synthetic data, we contribute a novel
architecture for variational latent video prediction, an approach that, to the
best of our knowledge, enables the first variational neural-SDE application to
video perception.
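The paper's central device, the Markov approximation of fBM, can be pictured concretely: the power-law kernel of (type II) fractional Brownian motion is approximated by a sum of exponentials, which turns the non-Markovian process into a weighted combination of Ornstein-Uhlenbeck states driven by one shared Brownian motion. The sketch below illustrates this idea under simplifying assumptions; the geometric grid of mean-reversion speeds and the least-squares fit for the weights are stand-ins for the paper's closed-form optimal coefficients, and normalizing constants are dropped.

```python
# Minimal sketch of Markov-approximate fBM (type II), not the paper's code.
# Idea: approximate the kernel K(u) = u**(H - 0.5) by sum_k w_k * exp(-gamma_k * u),
# so  B_H(t) ~ sum_k w_k * Y_k(t)  with OU states  dY_k = -gamma_k * Y_k dt + dW_t,
# all driven by the same Brownian motion W.
import numpy as np

def omega_weights(gammas, H, u_grid):
    """Least-squares stand-in for the paper's closed-form coefficients."""
    A = np.exp(-np.outer(u_grid, gammas))          # (len(u_grid), K)
    target = u_grid ** (H - 0.5)                   # fBM kernel, constants dropped
    w, *_ = np.linalg.lstsq(A, target, rcond=None)
    return w

def markov_fbm_path(H=0.7, T=1.0, n_steps=1000, K=8, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    gammas = np.geomspace(1e-2, 1e2, K)            # geometrically spaced speeds
    w = omega_weights(gammas, H, np.linspace(dt, T, 200))
    Y = np.zeros(K)                                # OU states, shared noise
    path = np.zeros(n_steps + 1)
    for i in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))          # one shared BM increment
        Y += -gammas * Y * dt + dW                 # Euler-Maruyama OU update
        path[i + 1] = w @ Y                        # Markov-approximate fBM value
    return path
```

Because the augmented state (Y_1, ..., Y_K) is jointly Markov, standard SDE machinery applies to it, which is what makes a variational treatment with learned drift, diffusion and control terms tractable.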
Related papers
- Fully Bayesian Differential Gaussian Processes through Stochastic Differential Equations [7.439555720106548]
We propose a fully Bayesian approach that treats the kernel hyperparameters as random variables and constructs coupled stochastic differential equations (SDEs) to learn their posterior distribution and that of inducing points.
Our approach provides a time-varying, comprehensive, and realistic posterior approximation through coupling variables using SDE methods.
Our work opens up exciting research avenues for advancing Bayesian inference and offers a powerful modeling tool for continuous-time Gaussian processes.
arXiv Detail & Related papers (2024-08-12T11:41:07Z)
- Noise in the reverse process improves the approximation capabilities of diffusion models [27.65800389807353]
In Score-based Generative Modeling (SGM), the state of the art in generative modeling, stochastic reverse processes are known to perform better than their deterministic counterparts.
This paper delves into the heart of this phenomenon, comparing neural ordinary differential equations (ODEs) and neural stochastic differential equations (SDEs) as reverse processes.
We analyze the ability of neural SDEs to approximate trajectories of the Fokker-Planck equation, revealing the advantages of stochasticity.
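To make the comparison concrete, here is a toy sketch (not from the paper) that integrates a variance-exploding diffusion backward in time with both a reverse-time SDE step and the matching probability-flow ODE step. The data distribution is a standard Gaussian so the score is known exactly; in practice `score` would be a learned network, and every name here is illustrative.

```python
# Toy comparison of the two reverse processes: reverse-time SDE vs.
# probability-flow ODE, for a variance-exploding forward process
# x_t = x_0 + sigma(t) * z  with  sigma(t) = t  (so g(t)**2 = 2*t).
import numpy as np

def score(x, t):
    # Exact score: x_0 ~ N(0, 1) gives p_t = N(0, 1 + t**2),
    # hence grad log p_t(x) = -x / (1 + t**2). A learned network in practice.
    return -x / (1.0 + t ** 2)

def sample(kind="sde", n=10_000, n_steps=500, seed=0):
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    x = rng.normal(0.0, np.sqrt(2.0), size=n)      # start from p_1 = N(0, 2)
    for i in range(n_steps):                       # integrate t: 1 -> 0
        t = 1.0 - i * dt
        g2 = 2.0 * t                               # g(t)**2 = d(sigma**2)/dt
        if kind == "sde":                          # reverse-time SDE (Anderson)
            x += g2 * score(x, t) * dt
            x += np.sqrt(g2 * dt) * rng.normal(size=n)
        else:                                      # deterministic ODE counterpart
            x += 0.5 * g2 * score(x, t) * dt
    return x

for kind in ("sde", "ode"):
    print(kind, round(float(np.std(sample(kind))), 3))  # both should be near 1.0
```

Both samplers recover approximately unit variance; the SDE variant re-injects noise at every step, and it is the effect of that extra stochasticity on approximation quality that the paper analyzes.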
arXiv Detail & Related papers (2023-12-13T02:39:10Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called Gaussian Mixture Solvers (GMS) for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Generative Fractional Diffusion Models [53.36835573822926]
We introduce the generative fractional diffusion model (GFDM), the first continuous-time score-based generative model that leverages fractional diffusion processes for its underlying dynamics.
Our evaluations on real image datasets demonstrate that GFDM achieves greater pixel-wise diversity and enhanced image quality, as indicated by a lower FID.
arXiv Detail & Related papers (2023-10-26T17:53:24Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Modiff: Action-Conditioned 3D Motion Generation with Denoising Diffusion Probabilistic Models [58.357180353368896]
We propose a conditional paradigm that benefits from the denoising diffusion probabilistic model (DDPM) to tackle the problem of realistic and diverse action-conditioned 3D skeleton-based motion generation.
Ours is a pioneering attempt that uses DDPM to synthesize a variable number of motion sequences conditioned on a categorical action.
arXiv Detail & Related papers (2023-01-10T13:15:42Z)
- Continuous-time stochastic gradient descent for optimizing over the stationary distribution of stochastic differential equations [7.65995376636176]
We develop a new continuous-time stochastic gradient descent method for optimizing over the stationary distribution of stochastic differential equation (SDE) models.
We rigorously prove convergence of the online forward propagation algorithm for linear SDE models and present its numerical results for nonlinear examples.
arXiv Detail & Related papers (2022-02-14T11:45:22Z)
- Variational Inference for Continuous-Time Switching Dynamical Systems [29.984955043675157]
We present a model based on a Markov jump process modulating a subordinated diffusion process.
We develop a new continuous-time variational inference algorithm.
We extensively evaluate our algorithm under the model assumption and for real-world examples.
arXiv Detail & Related papers (2021-09-29T15:19:51Z)
- Training Deep Energy-Based Models with f-Divergence Minimization [113.97274898282343]
Deep energy-based models (EBMs) are very flexible in distribution parametrization but computationally challenging.
We propose a general variational framework termed f-EBM to train EBMs using any desired f-divergence.
Experimental results demonstrate the superiority of f-EBM over contrastive divergence, as well as the benefits of training EBMs using f-divergences other than KL.
arXiv Detail & Related papers (2020-03-06T23:11:13Z)
- Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)