Constraining Gaussian Processes to Systems of Linear Ordinary
Differential Equations
- URL: http://arxiv.org/abs/2208.12515v1
- Date: Fri, 26 Aug 2022 09:16:53 GMT
- Title: Constraining Gaussian Processes to Systems of Linear Ordinary
Differential Equations
- Authors: Andreas Besginow, Markus Lange-Hegermann
- Abstract summary: LODE-GPs follow a system of linear homogeneous ODEs with constant coefficients.
We show the effectiveness of LODE-GPs in a number of experiments.
- Score: 5.33024001730262
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data in many applications follows systems of Ordinary Differential Equations
(ODEs). This paper presents a novel algorithmic and symbolic construction for
covariance functions of Gaussian Processes (GPs) with realizations strictly
following a system of linear homogeneous ODEs with constant coefficients, which
we call LODE-GPs. Introducing this strong inductive bias into a GP improves
modelling of such data. Using Smith normal form algorithms, a symbolic
technique, we overcome two current restrictions in the state of the art: (1)
the need for certain uniqueness conditions in the set of solutions, typically
assumed in classical ODE solvers and their probabilistic counterparts, and (2)
the restriction to controllable systems, typically assumed when encoding
differential equations in covariance functions. We show the effectiveness of
LODE-GPs in a number of experiments, for example learning physically
interpretable parameters by maximizing the likelihood.
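As a toy illustration of the idea of a GP whose realizations strictly satisfy a linear ODE (a minimal hand-constructed sketch, not the paper's Smith normal form construction): for f'' + f = 0, every solution is a*sin(t) + b*cos(t), and taking a, b ~ N(0, 1) induces a GP with covariance k(t, t') = cos(t - t'), so each sample path obeys the ODE exactly.

```python
import numpy as np

# For the ODE f'' + f = 0, solutions are a*sin(t) + b*cos(t).
# With a, b ~ N(0, 1) the induced GP covariance is
#   k(t, t') = sin(t)sin(t') + cos(t)cos(t') = cos(t - t'),
# so every realization of this GP satisfies the ODE.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2 * np.pi, 200)
K = np.cos(t[:, None] - t[None, :])  # constrained covariance matrix

# Sample directly from the two-dimensional solution space
# (K has rank 2, so this is equivalent to sampling the GP).
a, b = rng.standard_normal(2)
f = a * np.sin(t) + b * np.cos(t)

# Verify the ODE with a central second difference: f'' + f ~ 0.
h = t[1] - t[0]
f_dd = (f[2:] - 2 * f[1:-1] + f[:-2]) / h**2
residual = np.max(np.abs(f_dd + f[1:-1]))
print(residual)  # small discretization error only
```

The residual is on the order of h^2, confirming that the sample path follows the ODE up to finite-difference error; the paper's contribution is constructing such covariance functions automatically for arbitrary systems of linear ODEs.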
Related papers
- Predicting Ordinary Differential Equations with Transformers [65.07437364102931]
We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2023-07-24T08:46:12Z)
- Discovering ordinary differential equations that govern time-series [65.07437364102931]
We propose a transformer-based sequence-to-sequence model that recovers scalar autonomous ordinary differential equations (ODEs) in symbolic form from time-series data of a single observed solution of the ODE.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing laws of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2022-11-05T07:07:58Z)
- Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects, e.g., identifiability and properties of statistical estimation, are still obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
arXiv Detail & Related papers (2022-10-12T06:46:38Z)
- Learning nonparametric ordinary differential equations from noisy data [0.10555513406636088]
Learning nonparametric systems of Ordinary Differential Equations (ODEs) ẋ = f(t, x) from noisy data is an emerging machine learning topic.
We use the theory of Reproducing Kernel Hilbert Spaces (RKHS) to define candidates for f for which the solution of the ODE exists and is unique.
We propose a penalty method that iteratively uses the Representer theorem and Euler approximations to provide a numerical solution.
arXiv Detail & Related papers (2022-06-30T11:59:40Z)
- Adjoint-aided inference of Gaussian process driven differential equations [0.8257490175399691]
We show how the adjoint of a linear system can be used to efficiently infer forcing functions modelled as GPs.
We demonstrate the approach on systems of both ordinary and partial differential equations.
arXiv Detail & Related papers (2022-02-09T17:35:14Z)
- Feature Engineering with Regularity Structures [4.082216579462797]
We investigate the use of models from the theory of regularity structures as features in machine learning tasks.
We provide a flexible definition of a model feature vector associated to a space-time signal, along with two algorithms which illustrate ways in which these features can be combined with linear regression.
We apply these algorithms in several numerical experiments designed to learn solutions to PDEs with a given forcing and boundary data.
arXiv Detail & Related papers (2021-08-12T17:53:47Z)
- Solving Differential Equations via Continuous-Variable Quantum Computers [0.0]
We explore how a continuous-variable (CV) quantum computer could solve a classic differential equation, making use of its innate capability to represent real numbers in qumodes.
Our simulations and parameter optimization using the PennyLane / Strawberry Fields framework demonstrate good results for both linear and non-linear ODEs.
arXiv Detail & Related papers (2020-12-22T18:06:12Z)
- STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
We propose a new regularization technique: randomly sampling the end time of the ODE during training.
The proposed regularization is simple to implement, has negligible overhead and is effective across a wide variety of tasks.
We show through experiments on normalizing flows, time series models and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
arXiv Detail & Related papers (2020-06-18T17:44:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.