A Probabilistic State Space Model for Joint Inference from Differential
Equations and Data
- URL: http://arxiv.org/abs/2103.10153v1
- Date: Thu, 18 Mar 2021 10:36:09 GMT
- Title: A Probabilistic State Space Model for Joint Inference from Differential
Equations and Data
- Authors: Jonathan Schmidt, Nicholas Krämer, Philipp Hennig
- Abstract summary: Recent work in probabilistic numerics has developed a new class of solvers for ordinary differential equations (ODEs) that phrases the solution process directly in terms of Bayesian filtering.
It then becomes possible to perform approximate Bayesian inference on the latent force as well as the ODE solution in a single, linear complexity pass of an extended Kalman filter.
We demonstrate the expressiveness and performance of the algorithm by training a non-parametric SIRD model on data from the COVID-19 outbreak.
- Score: 23.449725313605835
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mechanistic models with differential equations are a key component of
scientific applications of machine learning. Inference in such models is
usually computationally demanding, because it involves repeatedly solving the
differential equation. The main problem here is that the numerical solver is
hard to combine with standard inference techniques. Recent work in
probabilistic numerics has developed a new class of solvers for ordinary
differential equations (ODEs) that phrase the solution process directly in
terms of Bayesian filtering. We here show that this allows such methods to be
combined very directly, with conceptual and numerical ease, with latent force
models in the ODE itself. It then becomes possible to perform approximate
Bayesian inference on the latent force as well as the ODE solution in a single,
linear complexity pass of an extended Kalman filter / smoother - that is, at
the cost of computing a single ODE solution. We demonstrate the expressiveness
and performance of the algorithm by training a non-parametric SIRD model on
data from the COVID-19 outbreak.
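The mechanism described in the abstract can be sketched compactly. The following is a minimal, illustrative NumPy filter, not the authors' implementation: it assumes a scalar logistic ODE with an additive latent force, a once-integrated Wiener-process prior on the solution, a Wiener-process prior on the force, and extended-Kalman-style updates against both the zero-valued ODE residual and a few synthetic observations. Step size, diffusion scales, observation times, and noise levels are made-up choices for the example.
```python
import numpy as np

# --- Illustrative problem setup (not from the paper) -------------------------
r = 2.0                                     # logistic growth rate
f = lambda y, u: r * y * (1.0 - y) + u      # vector field with latent force u
df_dy = lambda y: r * (1.0 - 2.0 * y)       # derivative of f w.r.t. y

h, T = 0.01, 5.0
ts = np.arange(0.0, T + h, h)

# Prior: state s = [y, y', u].  (y, y') follows a once-integrated Wiener
# process; the latent force u follows a Wiener process.
sigma_y, sigma_u = 1.0, 5.0
A = np.array([[1.0, h, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
Q = np.zeros((3, 3))
Q[:2, :2] = sigma_y**2 * np.array([[h**3 / 3, h**2 / 2],
                                   [h**2 / 2, h]])
Q[2, 2] = sigma_u**2 * h

m = np.array([0.1, f(0.1, 0.0), 0.0])       # initial mean
P = np.diag([0.0, 0.0, 1.0])                # initial covariance

# Sparse, noisy observations of y (synthetic stand-ins for real data).
obs = {1.0: 0.55, 2.0: 0.80, 3.0: 0.90, 4.0: 0.95}
R_obs = 0.01**2

means = []
for t in ts:
    # Predict with the Gauss-Markov prior.
    m, P = A @ m, A @ P @ A.T + Q

    # ODE update: condition on the residual z = y' - f(y, u) being zero,
    # linearizing the nonlinear vector field (extended Kalman filter step).
    z = m[1] - f(m[0], m[2])
    H = np.array([-df_dy(m[0]), 1.0, -1.0])
    S = H @ P @ H + 1e-12
    K = P @ H / S
    m, P = m - K * z, P - np.outer(K, H @ P)

    # Data update: condition on a noisy observation of y, if one exists here.
    if round(float(t), 6) in obs:
        Hd = np.array([1.0, 0.0, 0.0])
        S = Hd @ P @ Hd + R_obs
        K = P @ Hd / S
        m, P = m + K * (obs[round(float(t), 6)] - m[0]), P - np.outer(K, Hd @ P)

    means.append(m.copy())

means = np.array(means)   # columns: filtering means of y, y', and the latent force u
```
A single forward pass of this kind costs the same order of work as one ODE solve; the backward (Rauch-Tung-Striebel) smoothing pass mentioned in the abstract is omitted here for brevity.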
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Parallel-in-Time Probabilistic Numerical ODE Solvers [35.716255949521305]
Probabilistic numerical solvers for ordinary differential equations (ODEs) treat the numerical simulation of dynamical systems as problems of Bayesian state estimation.
We build on the time-parallel formulation of iterated extended Kalman smoothers to formulate a parallel-in-time probabilistic numerical ODE solver.
arXiv Detail & Related papers (2023-10-02T12:32:21Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Discovering ordinary differential equations that govern time-series [65.07437364102931]
We propose a transformer-based sequence-to-sequence model that recovers scalar autonomous ordinary differential equations (ODEs) in symbolic form from time-series data of a single observed solution of the ODE.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing laws of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2022-11-05T07:07:58Z)
- Learning nonparametric ordinary differential equations from noisy data [0.10555513406636088]
Learning nonparametric systems of Ordinary Differential Equations (ODEs) ẋ = f(t, x) from noisy data is an emerging machine learning topic.
We use the theory of Reproducing Kernel Hilbert Spaces (RKHS) to define candidates for f for which the solution of the ODE exists and is unique.
We propose a penalty method that iteratively uses the Representer theorem and Euler approximations to provide a numerical solution (a simplified kernel-regression sketch of this idea appears after the related-papers list).
arXiv Detail & Related papers (2022-06-30T11:59:40Z)
- D-CIPHER: Discovery of Closed-form Partial Differential Equations [80.46395274587098]
We propose D-CIPHER, which is robust to measurement artifacts and can uncover a new and very general class of differential equations.
We further design a novel optimization procedure, CoLLie, to help D-CIPHER search through this class efficiently.
arXiv Detail & Related papers (2022-06-21T17:59:20Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Feature Engineering with Regularity Structures [4.082216579462797]
We investigate the use of models from the theory of regularity structures as features in machine learning tasks.
We provide a flexible definition of a model feature vector associated to a space-time signal, along with two algorithms which illustrate ways in which these features can be combined with linear regression.
We apply these algorithms in several numerical experiments designed to learn solutions to PDEs with a given forcing and boundary data.
arXiv Detail & Related papers (2021-08-12T17:53:47Z)
- Neural Controlled Differential Equations for Irregular Time Series [17.338923885534197]
The solution of an ordinary differential equation is determined by its initial condition, and there is no mechanism for adjusting the trajectory based on subsequent observations.
Here we demonstrate how this may be resolved through the well-understood mathematics of controlled differential equations.
We show that our model achieves state-of-the-art performance against similar (ODE or RNN based) models in empirical studies on a range of datasets.
arXiv Detail & Related papers (2020-05-18T17:52:21Z)
- The data-driven physical-based equations discovery using evolutionary approach [77.34726150561087]
We describe an algorithm for discovering mathematical equations from observational data.
The algorithm combines genetic programming with sparse regression.
It can be used to discover governing analytical equations as well as partial differential equations (PDEs); a toy sparse-regression sketch appears after the related-papers list.
arXiv Detail & Related papers (2020-04-03T17:21:57Z)
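As a rough, much-simplified illustration of the idea in "Learning nonparametric ordinary differential equations from noisy data" above (not the paper's iterative penalty method): estimate derivatives with Euler-style difference quotients and fit the vector field by kernel ridge regression, so that by the representer theorem the estimate is a finite kernel expansion over the data. The RBF kernel, length scale, regularization weight, and synthetic logistic data below are assumptions made for the example.
```python
import numpy as np

# Noisy samples of a trajectory of x' = f(x); f(x) = x * (1 - x) is used only
# to generate synthetic data and is treated as unknown afterwards.
rng = np.random.default_rng(0)
h = 0.05
ts = np.arange(0.0, 4.0, h)
x_true = 1.0 / (1.0 + 9.0 * np.exp(-ts))            # logistic solution, x(0) = 0.1
x_noisy = x_true + 0.002 * rng.standard_normal(ts.shape)

# Euler-style difference quotients as crude regression targets for f(x).
dx = (x_noisy[1:] - x_noisy[:-1]) / h
x_in = x_noisy[:-1]

# Kernel ridge regression: the representer theorem says the regularized
# least-squares minimizer in the RKHS is f_hat(x) = sum_i alpha_i k(x, x_i).
def rbf(a, b, ell=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

lam = 1e-2
alpha = np.linalg.solve(rbf(x_in, x_in) + lam * np.eye(len(x_in)), dx)

def f_hat(x_query):
    return rbf(np.atleast_1d(x_query), x_in) @ alpha

print(f_hat(0.5))   # estimate of f at x = 0.5; the generating system has f(0.5) = 0.25
```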
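Similarly, for "The data-driven physical-based equations discovery using evolutionary approach" above, the sparse-regression half of such a pipeline can be sketched as below. The genetic-programming component that proposes candidate terms is omitted, and the fixed monomial library, the thresholding rule, and the toy system x' = -2x + x^2 are assumptions for illustration (sequentially thresholded least squares rather than the paper's exact algorithm).
```python
import numpy as np

# Generate a trajectory of the "unknown" system x' = -2 x + x^2 by Euler
# integration; this stands in for measured data.
h = 0.001
ts = np.arange(0.0, 2.0, h)
x = np.empty_like(ts)
x[0] = 0.5
for i in range(len(ts) - 1):
    x[i + 1] = x[i] + h * (-2.0 * x[i] + x[i] ** 2)

# Derivative estimates and a fixed library of candidate terms [1, x, x^2, x^3].
dx = np.gradient(x, h)
library = np.stack([np.ones_like(x), x, x ** 2, x ** 3], axis=1)

# Sequentially thresholded least squares: fit, zero out small coefficients, refit.
coef = np.linalg.lstsq(library, dx, rcond=None)[0]
for _ in range(10):
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    if (~small).any():
        coef[~small] = np.linalg.lstsq(library[:, ~small], dx, rcond=None)[0]

print(dict(zip(["1", "x", "x^2", "x^3"], np.round(coef, 3))))
# Should recover coefficients close to -2 for x and +1 for x^2, the rest near zero.
```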
This list is automatically generated from the titles and abstracts of the papers on this site.