Data-driven Modeling and Inference for Bayesian Gaussian Process ODEs
via Double Normalizing Flows
- URL: http://arxiv.org/abs/2309.09222v2
- Date: Tue, 2 Jan 2024 13:09:37 GMT
- Title: Data-driven Modeling and Inference for Bayesian Gaussian Process ODEs
via Double Normalizing Flows
- Authors: Jian Xu, Shian Du, Junmei Yang, Xinghao Ding, John Paisley, Delu Zeng
- Abstract summary: We introduce normalizing flows to reparameterize the ODE vector field, resulting in a data-driven prior distribution.
We also apply normalizing flows to the posterior inference of GP ODEs to resolve the issue of strong mean-field assumptions.
We validate the effectiveness of our approach on simulated dynamical systems and real-world human motion data.
- Score: 28.62579476863723
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, Gaussian processes have been used to model the vector field of
continuous dynamical systems, referred to as GPODEs, which are characterized by
a probabilistic ODE. Bayesian inference for these models has been
extensively studied and applied in tasks such as time series prediction.
However, the use of standard GPs with basic kernels, such as the squared
exponential, has been common in GPODE research, limiting the model's ability to
represent complex scenarios. To address this limitation, we introduce
normalizing flows to reparameterize the ODE vector field, resulting in a
data-driven prior distribution, thereby increasing flexibility and expressive
power. We develop a data-driven variational learning algorithm that utilizes
analytically tractable probability density functions of normalizing flows,
enabling simultaneous learning and inference of unknown continuous dynamics.
Additionally, we apply normalizing flows to the posterior inference of GP
ODEs to resolve the issue of strong mean-field assumptions in posterior
inference. By applying normalizing flows in both these ways, our model improves
accuracy and uncertainty estimates for Bayesian Gaussian Process ODEs. We
validate the effectiveness of our approach on simulated dynamical systems and
real-world human motion data, including time series prediction and missing data
recovery tasks. Experimental results show that our proposed method effectively
captures model uncertainty while improving accuracy.
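To make the abstract's prior-side idea more concrete, the following is a minimal, self-contained sketch, not the authors' implementation: a draw from a squared-exponential GP over the vector field is pushed through a simple elementwise planar flow, and the transformed field drives a forward-Euler rollout of the ODE. The inducing grid, the crude kernel interpolation, the single planar flow, and all names (`PlanarFlow`, `vector_field`) are illustrative assumptions; the paper's full method additionally fits the flow to data and applies a second flow to the variational posterior to move beyond mean-field Gaussians.

```python
import torch

torch.manual_seed(0)

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel k(x, x') = s^2 * exp(-(x - x')^2 / (2 * l^2)).
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return variance * torch.exp(-0.5 * d2 / lengthscale ** 2)

class PlanarFlow(torch.nn.Module):
    # Elementwise planar-style flow T(f) = f + u * tanh(w * f + b);
    # its log|dT/df| is available in closed form, keeping densities tractable.
    def __init__(self):
        super().__init__()
        self.u = torch.nn.Parameter(0.1 * torch.randn(1))
        self.w = torch.nn.Parameter(0.1 * torch.randn(1))
        self.b = torch.nn.Parameter(torch.zeros(1))

    def forward(self, f):
        h = torch.tanh(self.w * f + self.b)
        out = f + self.u * h
        log_det = torch.log(torch.abs(1.0 + self.u * self.w * (1.0 - h ** 2)) + 1e-8)
        return out, log_det

# Draw the GP prior over vector-field values at a grid of inducing inputs.
z = torch.linspace(-2.0, 2.0, 20)
K = rbf_kernel(z, z) + 1e-5 * torch.eye(20)
f_gp = torch.linalg.cholesky(K) @ torch.randn(20)   # f ~ GP(0, k) evaluated at z

# Flow-reparameterized prior: once the flow parameters are fit to data,
# the prior over T(f) becomes data-driven rather than a plain GP.
flow = PlanarFlow()
f_flow, log_det = flow(f_gp)

def vector_field(x):
    # Crude kernel interpolation of the transformed field to evaluate dx/dt at x.
    weights = torch.softmax(-(x - z) ** 2 / 0.1, dim=-1)
    return (weights * f_flow).sum()

# Forward-Euler rollout of dx/dt = T(f(x)) from x(0) = 0.5.
x, dt, trajectory = torch.tensor(0.5), 0.05, []
with torch.no_grad():
    for _ in range(100):
        x = x + dt * vector_field(x)
        trajectory.append(x.item())
print(trajectory[:5])
```

Because the planar transform has a closed-form log-determinant, the density of the transformed field stays analytically tractable, which is the property the abstract relies on for its variational learning algorithm.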
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Deep Latent Force Models: ODE-based Process Convolutions for Bayesian
Deep Learning [0.0]
The deep latent force model (DLFM) is a deep Gaussian process with physics-informed kernels at each layer.
We present empirical evidence of the capability of the DLFM to capture the dynamics present in highly nonlinear real-world time series data.
We find that the DLFM is capable of achieving comparable performance to a range of non-physics-informed probabilistic models.
arXiv Detail & Related papers (2023-11-24T19:55:57Z) - Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - FaDIn: Fast Discretized Inference for Hawkes Processes with General
Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach leads to improved estimation of pattern latency compared to the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z) - Neural Flows: Efficient Alternative to Neural ODEs [8.01886971335823]
We propose an alternative by directly modeling the solution curves - the flow of an ODE - with a neural network.
This immediately eliminates the need for expensive numerical solvers while still maintaining the modeling capability of neural ODEs.
arXiv Detail & Related papers (2021-10-25T15:24:45Z) - Compositional Modeling of Nonlinear Dynamical Systems with ODE-based
Random Features [0.0]
We present a novel, domain-agnostic approach to tackling this problem.
We use compositions of physics-informed random features, derived from ordinary differential equations.
We find that our approach achieves comparable performance to a number of other probabilistic models on benchmark regression tasks.
arXiv Detail & Related papers (2021-06-10T17:55:13Z) - Flow-based Spatio-Temporal Structured Prediction of Motion Dynamics [21.24885597341643]
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and interdimensional correlations.
We propose MotionFlow as a novel approach that autoregressively normalizes the output on the temporal input features.
We apply our method to different tasks, including trajectory prediction, motion prediction, time series forecasting, and binary segmentation.
arXiv Detail & Related papers (2021-04-09T14:30:35Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - SODEN: A Scalable Continuous-Time Survival Model through Ordinary
Differential Equation Networks [14.564168076456822]
We propose a flexible model for survival analysis using neural networks along with scalable optimization algorithms.
We demonstrate the effectiveness of the proposed method in comparison to existing state-of-the-art deep learning survival analysis models.
arXiv Detail & Related papers (2020-08-19T19:11:25Z) - SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for
Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply to both the approximated kernel and the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)