Adjoint-aided inference of Gaussian process driven differential equations
- URL: http://arxiv.org/abs/2202.04589v1
- Date: Wed, 9 Feb 2022 17:35:14 GMT
- Title: Adjoint-aided inference of Gaussian process driven differential equations
- Authors: Paterne Gahungu, Christopher W Lanyon, Mauricio A Alvarez, Engineer
Bainomugisha, Michael Smith, and Richard D. Wilkinson
- Abstract summary: We show how the adjoint of a linear system can be used to efficiently infer forcing functions modelled as GPs.
We demonstrate the approach on systems of both ordinary and partial differential equations.
- Score: 0.8257490175399691
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Linear systems occur throughout engineering and the sciences, most notably as
differential equations. In many cases the forcing function for the system is
unknown, and interest lies in using noisy observations of the system to infer
the forcing, as well as other unknown parameters. In differential equations,
the forcing function is an unknown function of the independent variables
(typically time and space), and can be modelled as a Gaussian process (GP). In
this paper we show how the adjoint of a linear system can be used to
efficiently infer forcing functions modelled as GPs, after using a truncated
basis expansion of the GP kernel. We show how exact conjugate Bayesian
inference for the truncated GP can be achieved, in many cases with
substantially lower computation than would be required using MCMC methods. We
demonstrate the approach on systems of both ordinary and partial differential
equations, and by testing on synthetic data, show that the basis expansion
approach approximates well the true forcing with a modest number of basis
vectors. Finally, we show how to infer point estimates for the non-linear model
parameters, such as the kernel length-scales, using Bayesian optimisation.
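A minimal sketch of the truncated-basis idea described in the abstract, assuming a random Fourier feature basis, a unit-variance Gaussian prior on the weights, and a direct linear observation model. The paper additionally routes observations through the adjoint of the differential operator; that machinery is omitted here, so this is only an illustration of why the inference is conjugate, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: forcing f(t) ~ GP(0, RBF kernel), approximated by a
# truncated random Fourier feature basis, f(t) ≈ phi(t) @ w with w ~ N(0, I).
n_feat, lengthscale, noise = 50, 0.5, 0.1
omega = rng.normal(0.0, 1.0 / lengthscale, n_feat)  # spectral frequencies
b = rng.uniform(0.0, 2 * np.pi, n_feat)             # random phases

def phi(t):
    """RFF basis evaluated at times t; shape (len(t), n_feat)."""
    return np.sqrt(2.0 / n_feat) * np.cos(np.outer(t, omega) + b)

# Observations are linear in the basis weights: y = Phi @ w + eps.
# (In the paper, this linear map would also include the action of the
# adjoint of the differential operator; here it is just the basis.)
t = np.linspace(0, 1, 40)
w_true = rng.normal(size=n_feat)
Phi = phi(t)
y = Phi @ w_true + noise * rng.normal(size=len(t))

# Exact conjugate Gaussian posterior over the weights:
#   w | y ~ N(mu, Sigma), Sigma = (Phi^T Phi / s^2 + I)^-1, mu = Sigma Phi^T y / s^2
Sigma = np.linalg.inv(Phi.T @ Phi / noise**2 + np.eye(n_feat))
mu = Sigma @ Phi.T @ y / noise**2

# Posterior mean of the inferred forcing at new input locations:
t_new = np.linspace(0, 1, 100)
f_mean = phi(t_new) @ mu
```

Because the model is linear-Gaussian in the truncated weights, the posterior is available in closed form at the cost of one solve in the number of basis functions, which is the source of the computational saving over MCMC claimed in the abstract.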
Related papers
- Amortized Variational Inference for Deep Gaussian Processes [0.0]
Deep Gaussian processes (DGPs) are multilayer generalizations of Gaussian processes (GPs).
We introduce amortized variational inference for DGPs, which learns an inference function that maps each observation to variational parameters.
Our method performs similarly or better than previous approaches at less computational cost.
arXiv Detail & Related papers (2024-09-18T20:23:27Z)
- Gaussian Process Priors for Systems of Linear Partial Differential Equations with Constant Coefficients [4.327763441385371]
Partial differential equations (PDEs) are important tools to model physical systems.
We propose a family of Gaussian process (GP) priors, which we call EPGP, such that all realizations are exact solutions of this system.
We demonstrate our approach on three families of systems of PDEs, the heat equation, wave equation, and Maxwell's equations.
arXiv Detail & Related papers (2022-12-29T14:28:32Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach gives an improved estimation of pattern latency compared with the state of the art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Constraining Gaussian Processes to Systems of Linear Ordinary Differential Equations [5.33024001730262]
LODE-GPs follow a system of linear homogeneous ODEs with constant coefficients.
We show the effectiveness of LODE-GPs in a number of experiments.
arXiv Detail & Related papers (2022-08-26T09:16:53Z)
- AutoIP: A United Framework to Integrate Physics into Gaussian Processes [15.108333340471034]
We propose a framework that can integrate all kinds of differential equations into Gaussian processes.
Our method shows improvement upon vanilla GPs in both simulation and several real-world applications.
arXiv Detail & Related papers (2022-02-24T19:02:14Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Learning Nonparametric Volterra Kernels with Gaussian Processes [0.0]
This paper introduces a method for the nonparametric Bayesian learning of nonlinear operators, through the use of the Volterra series with kernels represented using Gaussian processes (GPs).
When the input function to the operator is unobserved and has a GP prior, the NVKM constitutes a powerful method for both single and multiple output regression, and can be viewed as a nonlinear and nonparametric latent force model.
arXiv Detail & Related papers (2021-06-10T08:21:00Z)
- Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
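The Fourier-space kernel parameterization at the heart of this operator can be illustrated with a minimal 1-D sketch: transform the input, apply a learned complex multiplier on the lowest modes, and transform back. The grid size, mode count, and single-channel setup below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 1-D spectral convolution: a learned complex multiplier acting
# on the lowest Fourier modes of the input function.
n_grid, n_modes = 64, 8
weights = rng.normal(size=n_modes) + 1j * rng.normal(size=n_modes)  # learnable

def spectral_conv(u):
    """Apply the truncated Fourier-space kernel to a real signal u."""
    u_hat = np.fft.rfft(u)
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]  # keep only the low modes
    return np.fft.irfft(out_hat, n=n_grid)

x = np.linspace(0, 2 * np.pi, n_grid, endpoint=False)
u = np.sin(3 * x)
v = spectral_conv(u)  # same grid size; output is band-limited by n_modes
```

Because the multiplier acts per frequency rather than per grid point, the layer is independent of the discretization, which is what lets the trained operator be evaluated on grids other than the training resolution.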
arXiv Detail & Related papers (2020-10-18T00:34:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
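As a rough illustration of why feature expansions of this kind work, the sketch below measures how a plain random Fourier feature approximation to an exact RBF kernel improves as the number of features grows. This is not SLEIPNIR's quadrature scheme (which replaces random frequencies with deterministic quadrature nodes to obtain the stated error bounds); all names and sizes here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(x, y, ls=1.0):
    """Exact RBF (squared-exponential) kernel matrix."""
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / ls**2)

def rff_kernel(x, n_feat, ls=1.0):
    """Random Fourier feature approximation K ≈ Z Z^T of the RBF kernel."""
    omega = rng.normal(0.0, 1.0 / ls, n_feat)
    b = rng.uniform(0.0, 2 * np.pi, n_feat)
    Z = np.sqrt(2.0 / n_feat) * np.cos(np.outer(x, omega) + b)
    return Z @ Z.T

x = np.linspace(-2, 2, 30)
K = rbf(x, x)
# Max-norm approximation error for increasing feature counts; with random
# features this typically shrinks like O(1/sqrt(n_feat)).
err = [np.max(np.abs(rff_kernel(x, m) - K)) for m in (10, 100, 1000)]
```

The slow random-feature rate is the motivation for deterministic quadrature constructions such as SLEIPNIR's, whose error bounds decay exponentially rather than as an inverse square root.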
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.