Sequential Estimation of Gaussian Process-based Deep State-Space Models
- URL: http://arxiv.org/abs/2301.12528v2
- Date: Sat, 23 Mar 2024 22:57:18 GMT
- Title: Sequential Estimation of Gaussian Process-based Deep State-Space Models
- Authors: Yuhao Liu, Marzieh Ajirak, Petar Djuric
- Abstract summary: We consider the problem of sequential estimation of the unknowns of state-space and deep state-space models.
We present a method based on particle filtering where the parameters of the random feature-based Gaussian processes are integrated out.
We show that the method can track the latent processes up to a scale and rotation.
- Score: 1.760402297380953
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We consider the problem of sequential estimation of the unknowns of state-space and deep state-space models, including the functions and latent processes of the models. The proposed approach relies on Gaussian and deep Gaussian processes that are implemented via random feature-based Gaussian processes. These models contain two sets of unknowns: highly nonlinear unknowns (the values of the latent processes) and conditionally linear unknowns (the constant parameters of the random feature-based Gaussian processes). We present a method based on particle filtering in which the parameters of the random feature-based Gaussian processes are integrated out when obtaining the predictive density of the states, and therefore do not require particles. We also propose an ensemble version of the method, with each member of the ensemble having its own set of features. With several experiments, we show that the method can track the latent processes up to a scale and rotation.
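To make the scheme concrete, below is a minimal Python/NumPy sketch of the kind of Rao-Blackwellized particle filter the abstract describes, for a scalar latent state with an identity observation model. All dimensions, noise scales, and the RBF random-feature choice are illustrative assumptions, not the authors' implementation: the conditionally linear weights theta of f(x) ≈ thetaᵀ phi(x) are integrated out analytically per particle, so only the states are sampled.

```python
import numpy as np

# Hypothetical sketch (not the authors' code) of a Rao-Blackwellized particle
# filter for x_t = f(x_{t-1}) + u_t, y_t = x_t + v_t, where f is a GP
# approximated with random Fourier features, f(x) ~= theta^T phi(x).
rng = np.random.default_rng(0)
M, N = 50, 200                      # random features / particles (assumed sizes)
sig_x, sig_y = 0.1, 0.2             # assumed process and observation noise scales

# Random Fourier features of an RBF kernel: phi(x) = sqrt(2/M) cos(w*x + b).
w = rng.normal(size=M)
b = rng.uniform(0.0, 2.0 * np.pi, size=M)
feat = lambda x: np.sqrt(2.0 / M) * np.cos(np.outer(x, w) + b)   # (N, M)

x = rng.normal(size=N)              # state particles
mu = np.zeros((N, M))               # per-particle posterior mean of theta
P = np.tile(np.eye(M), (N, 1, 1))   # per-particle posterior covariance of theta

def pf_step(y_t, x, mu, P):
    F = feat(x)
    # Predictive density of the next state with theta marginalized out:
    # x_t | x_{t-1} ~ N(f^T mu, f^T P f + sig_x^2) for each particle.
    m = np.einsum("nm,nm->n", F, mu)
    s2 = np.einsum("nm,nmk,nk->n", F, P, F) + sig_x**2
    x_new = m + np.sqrt(s2) * rng.normal(size=N)
    # Kalman-style update of the theta posterior given the sampled transition.
    K = np.einsum("nmk,nk->nm", P, F) / s2[:, None]
    mu = mu + K * (x_new - m)[:, None]
    FP = np.einsum("nm,nmk->nk", F, P)
    P = P - np.einsum("nm,nk->nmk", K, FP)
    # Weight by the (assumed linear-Gaussian) likelihood of y_t and resample.
    logw = -0.5 * (y_t - x_new) ** 2 / sig_y**2
    wts = np.exp(logw - logw.max())
    wts /= wts.sum()
    idx = rng.choice(N, size=N, p=wts)
    return x_new[idx], mu[idx], P[idx]

for y_t in np.sin(0.3 * np.arange(60)):   # toy observation sequence
    x, mu, P = pf_step(y_t, x, mu, P)
print("filtered state estimate:", x.mean())
```

Because theta is handled in closed form, each particle carries a Gaussian posterior over the feature weights instead of extra samples, which is what keeps the particle filter in the state dimension only.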
Related papers
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Uncovering Regions of Maximum Dissimilarity on Random Process Data [0.0]
This paper proposes a method that identifies regions of a given volume in which the marginal attributes of two processes are least similar.
The proposed methods are devised in full generality for the setting where the data of interest are themselves processes.
We showcase their application with case studies in criminology, finance, and medicine.
arXiv Detail & Related papers (2022-09-12T19:44:49Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- A Kernel-Based Approach for Modelling Gaussian Processes with Functional Information [0.0]
We use a Gaussian process model to unify the typical finite case with the case of uncountable information.
We discuss this construction in statistical models, including numerical considerations and a proof of concept.
arXiv Detail & Related papers (2022-01-26T15:58:08Z)
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
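Concretely, the pathwise view rests on Matheron's update rule: a posterior sample equals a prior sample plus a data-dependent correction, so no covariance matrix over the test points is ever factorized. Here is a minimal NumPy sketch under an assumed squared-exponential kernel and noiseless observations (a generic illustration, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
k = lambda A, B: np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2)  # assumed RBF kernel

X = np.array([-2.0, 0.0, 1.5])          # toy training inputs
y = np.sin(X)                            # toy noiseless observations
Xs = np.linspace(-3.0, 3.0, 200)         # test inputs

# Draw one joint prior sample over [test, train] locations.
Z = np.concatenate([Xs, X])
L = np.linalg.cholesky(k(Z, Z) + 1e-8 * np.eye(len(Z)))
f = L @ rng.normal(size=len(Z))
f_star, f_X = f[:len(Xs)], f[len(Xs):]

# Matheron's rule: (f* | y) = f* + K(Xs, X) K(X, X)^{-1} (y - f_X).
correction = np.linalg.solve(k(X, X) + 1e-8 * np.eye(len(X)), y - f_X)
posterior_sample = f_star + k(Xs, X) @ correction
```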
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
- Doubly Sparse Variational Gaussian Processes [14.209730729425502]
We show that the inducing point framework is still valid for state space models and that it can bring further computational and memory savings.
This work makes it possible to use the state-space formulation inside deep Gaussian process models as illustrated in one of the experiments.
arXiv Detail & Related papers (2020-01-15T15:07:08Z)
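For context on the inducing point framework referenced above, a minimal subset-of-regressors style predictive mean in NumPy is shown below; it trades the O(N^3) exact GP solve for O(N M^2) work through M inducing inputs. This is a generic illustration with an assumed RBF kernel, not the paper's doubly sparse state-space construction:

```python
import numpy as np

rng = np.random.default_rng(1)
k = lambda A, B: np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2)  # assumed RBF kernel

X = rng.uniform(-3.0, 3.0, 500)                 # toy training inputs (N = 500)
y = np.sin(X) + 0.1 * rng.normal(size=500)      # toy noisy observations
Z = np.linspace(-3.0, 3.0, 15)                  # inducing inputs (M = 15 << N)
Xs = np.linspace(-3.0, 3.0, 100)                # test inputs
noise = 0.1 ** 2                                 # assumed observation noise variance

# Subset-of-regressors predictive mean: only M x M systems are solved.
Kzx = k(Z, X)                                    # (M, N)
A = noise * (k(Z, Z) + 1e-8 * np.eye(len(Z))) + Kzx @ Kzx.T
mean_star = k(Xs, Z) @ np.linalg.solve(A, Kzx @ y)
```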