Variational Inference for Continuous-Time Switching Dynamical Systems
- URL: http://arxiv.org/abs/2109.14492v1
- Date: Wed, 29 Sep 2021 15:19:51 GMT
- Title: Variational Inference for Continuous-Time Switching Dynamical Systems
- Authors: Lukas K\"ohs, Bastian Alt, Heinz Koeppl
- Abstract summary: We present a model based on a Markov jump process modulating a subordinated diffusion process.
We develop a new continuous-time variational inference algorithm.
We extensively evaluate our algorithm under the model assumption and for real-world examples.
- Score: 29.984955043675157
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Switching dynamical systems provide a powerful, interpretable modeling
framework for inference in time-series data in, e.g., the natural sciences or
engineering applications. Since many areas, such as biology or discrete-event
systems, are naturally described in continuous time, we present a model based
on a Markov jump process modulating a subordinated diffusion process. We
provide the exact evolution equations for the prior and posterior marginal
densities, the direct solutions of which are however computationally
intractable. Therefore, we develop a new continuous-time variational inference
algorithm, combining a Gaussian process approximation on the diffusion level
with posterior inference for Markov jump processes. By minimizing the path-wise
Kullback-Leibler divergence we obtain (i) Bayesian latent state estimates for
arbitrary points on the real axis and (ii) point estimates of unknown system
parameters, utilizing variational expectation maximization. We extensively
evaluate our algorithm under the model assumption and for real-world examples.
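The following is a minimal, illustrative sketch (not the authors' implementation) of the generative model described above: a two-state Markov jump process whose current mode modulates the drift of a subordinated one-dimensional diffusion, simulated with an Euler-Maruyama discretization. The switching rates, the Ornstein-Uhlenbeck-style drift, and all parameter values are assumptions made for illustration.

```python
import numpy as np

def simulate_switching_diffusion(T=10.0, dt=1e-3, rates=(0.5, 0.5),
                                 theta=(2.0, 2.0), mu=(-1.0, 1.0),
                                 sigma=0.3, seed=0):
    """Forward-simulate a two-mode switching diffusion: a Markov jump
    process z(t) in {0, 1} modulates the drift of an Ornstein-Uhlenbeck-
    style diffusion x(t).  Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    z = np.empty(n, dtype=int)
    x = np.empty(n)
    z[0], x[0] = 0, 0.0
    for k in range(1, n):
        # Markov jump process: switch modes with probability ~ rate * dt
        if rng.random() < rates[z[k - 1]] * dt:
            z[k] = 1 - z[k - 1]
        else:
            z[k] = z[k - 1]
        # Euler-Maruyama step of the mode-dependent diffusion
        drift = theta[z[k]] * (mu[z[k]] - x[k - 1])
        x[k] = x[k - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return np.arange(n) * dt, z, x

t, z, x = simulate_switching_diffusion()
```

The inference problem addressed in the paper is the reverse direction: given noisy observations of x(t), approximate the posterior over both the jump path z(t) and the diffusion path x(t), which this forward-simulation sketch does not attempt.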
Related papers
- Fully Bayesian Differential Gaussian Processes through Stochastic Differential Equations [7.439555720106548]
We propose a fully Bayesian approach that treats the kernel hyperparameters as random variables and constructs coupled stochastic differential equations (SDEs) to learn their posterior distribution and that of inducing points.
Our approach provides a time-varying, comprehensive, and realistic posterior approximation through coupling variables using SDE methods.
Our work opens up exciting research avenues for advancing Bayesian inference and offers a powerful modeling tool for continuous-time Gaussian processes.
arXiv Detail & Related papers (2024-08-12T11:41:07Z)
- Entropic Matching for Expectation Propagation of Markov Jump Processes [38.60042579423602]
We propose a new tractable inference scheme based on an entropic matching framework.
We demonstrate the effectiveness of our method by providing closed-form results for a simple family of approximate distributions.
We derive expressions for point estimation of the underlying parameters using an approximate expectation procedure.
arXiv Detail & Related papers (2023-09-27T12:07:21Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
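For context on the entry above, here is a minimal sketch of one stochastic gradient Langevin dynamics epoch using without-replacement (shuffled, disjoint) minibatches; the step size, temperature, and the generic grad_loss interface are placeholder assumptions, and this is not the specific variant proposed in that paper.

```python
import numpy as np

def sgld_epoch(theta, X, y, grad_loss, step=1e-3, temperature=1.0,
               batch_size=32, rng=None):
    """One epoch of SGLD with without-replacement minibatching: the data
    are shuffled once and traversed in disjoint batches.  Illustrative."""
    rng = rng or np.random.default_rng()
    idx = rng.permutation(len(X))              # without-replacement ordering
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        # rescale the minibatch gradient to estimate the full-data gradient
        g = grad_loss(theta, X[batch], y[batch]) * (len(X) / len(batch))
        noise = np.sqrt(2.0 * step * temperature) * rng.normal(size=theta.shape)
        theta = theta - step * g + noise       # Langevin update
    return theta

# Example gradient for ridge-regularized linear regression (illustrative):
# grad_loss = lambda th, Xb, yb: Xb.T @ (Xb @ th - yb) + 1e-2 * th
```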
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Variational Gaussian Process Diffusion Processes [17.716059928867345]
Diffusion processes are a class of stochastic differential equations (SDEs) providing a rich family of expressive models.
Probabilistic inference and learning under generative models with latent processes endowed with a non-linear diffusion process prior are intractable problems.
We build upon work within variational inference, approximating the posterior process as a linear diffusion process, and point out pathologies in the approach.
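A linear (Gaussian) variational posterior of the kind mentioned above can be sampled with a plain Euler-Maruyama scheme; the scalar SDE dx = (A(t) x + b(t)) dt + sigma dW and the particular coefficients below are illustrative assumptions, not taken from that paper.

```python
import numpy as np

def sample_linear_sde(A, b, sigma, x0=0.0, T=1.0, dt=1e-3, seed=0):
    """Euler-Maruyama sample path of a scalar linear (Gaussian) SDE
    dx = (A(t) x + b(t)) dt + sigma dW.  Illustrative sketch."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        t = (k - 1) * dt
        drift = A(t) * x[k - 1] + b(t)
        x[k] = x[k - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return np.arange(n) * dt, x

# Arbitrary time-varying coefficients for illustration:
t, x = sample_linear_sde(A=lambda t: -2.0, b=lambda t: np.sin(t), sigma=0.2)
```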
arXiv Detail & Related papers (2023-06-03T09:43:59Z)
- Free-Form Variational Inference for Gaussian Process State-Space Models [21.644570034208506]
We propose a new method for inference in Bayesian GPSSMs.
Our method is based on free-form variational inference via Hamiltonian Monte Carlo over inducing variables.
We show that our approach can learn transition dynamics and latent states more accurately than competing methods.
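As generic background for the entry above, a single Hamiltonian Monte Carlo step with leapfrog integration is sketched below; it is not that paper's inducing-variable scheme, and the log_prob/grad_log_prob callables, step size, and trajectory length are placeholder assumptions.

```python
import numpy as np

def hmc_step(theta, log_prob, grad_log_prob, step=0.05, n_leapfrog=20, rng=None):
    """One HMC step: leapfrog integration followed by a Metropolis
    accept/reject correction.  Generic illustrative sketch."""
    rng = rng or np.random.default_rng()
    p = rng.normal(size=theta.shape)                  # resample momentum
    theta_new, p_new = theta.copy(), p.copy()
    p_new += 0.5 * step * grad_log_prob(theta_new)    # half step for momentum
    for _ in range(n_leapfrog - 1):
        theta_new += step * p_new                     # full step for position
        p_new += step * grad_log_prob(theta_new)      # full step for momentum
    theta_new += step * p_new
    p_new += 0.5 * step * grad_log_prob(theta_new)
    # Metropolis correction on the Hamiltonian (potential + kinetic energy)
    h_old = -log_prob(theta) + 0.5 * np.dot(p, p)
    h_new = -log_prob(theta_new) + 0.5 * np.dot(p_new, p_new)
    return theta_new if rng.random() < np.exp(h_old - h_new) else theta
```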
arXiv Detail & Related papers (2023-02-20T11:34:16Z)
- Distributed Bayesian Learning of Dynamic States [65.7870637855531]
The proposed algorithm addresses a distributed Bayesian filtering task for finite-state hidden Markov models.
It can be used for sequential state estimation, as well as for modeling opinion formation over social networks under dynamic environments.
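As background for the filtering task above, a minimal (non-distributed) forward filter for a finite-state hidden Markov model is sketched below; the prior, transition matrix, and per-step observation likelihoods are placeholder inputs, and the distributed aspect of that paper is not reproduced.

```python
import numpy as np

def forward_filter(prior, transition, likelihoods):
    """HMM forward filter: recursively computes p(z_t | y_{1:t}) for a
    finite-state hidden Markov model.  likelihoods[t, i] = p(y_t | z_t = i).
    Non-distributed, illustrative sketch."""
    T, S = likelihoods.shape
    beliefs = np.empty((T, S))
    belief = np.asarray(prior, dtype=float)
    for t in range(T):
        predicted = transition.T @ belief      # predict through the transition model
        belief = predicted * likelihoods[t]    # weight by the observation likelihood
        belief /= belief.sum()                 # normalize to a distribution
        beliefs[t] = belief
    return beliefs
```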
arXiv Detail & Related papers (2022-12-05T19:40:17Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
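Since the reverse process in the entry above is a continuous-time Markov chain, a generic Gillespie-style CTMC sampler is sketched below; the generator matrix Q is a placeholder assumption and the sketch does not reflect that paper's specific parameterization.

```python
import numpy as np

def sample_ctmc(Q, z0=0, T=1.0, seed=0):
    """Gillespie-style simulation of a continuous-time Markov chain with
    generator matrix Q (off-diagonal entries are jump rates, rows sum to 0).
    Returns jump times and the visited states.  Illustrative sketch."""
    rng = np.random.default_rng(seed)
    t, z = 0.0, z0
    times, states = [0.0], [z0]
    while True:
        rate = -Q[z, z]                          # total exit rate of state z
        if rate <= 0:
            break                                # absorbing state
        t += rng.exponential(1.0 / rate)         # exponential holding time
        if t >= T:
            break
        probs = Q[z].copy()
        probs[z] = 0.0
        z = rng.choice(len(Q), p=probs / rate)   # pick the next state
        times.append(t)
        states.append(z)
    return np.array(times), np.array(states)

Q = np.array([[-1.0, 1.0], [0.5, -0.5]])         # placeholder two-state generator
times, states = sample_ctmc(Q, T=5.0)
```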
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Markov Chain Monte Carlo for Continuous-Time Switching Dynamical Systems [26.744964200606784]
We propose a novel inference algorithm utilizing a Markov Chain Monte Carlo approach.
The presented Gibbs sampler makes it possible to efficiently obtain samples from the exact continuous-time posterior processes.
arXiv Detail & Related papers (2022-05-18T09:03:00Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- The Connection between Discrete- and Continuous-Time Descriptions of Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of invariance under coarse-graining.
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time-series analysis of second- or higher-order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z)