Sparse Algorithms for Markovian Gaussian Processes
- URL: http://arxiv.org/abs/2103.10710v1
- Date: Fri, 19 Mar 2021 09:50:53 GMT
- Title: Sparse Algorithms for Markovian Gaussian Processes
- Authors: William J. Wilkinson, Arno Solin, Vincent Adam
- Abstract summary: Sparse Markovian Gaussian processes combine the use of inducing variables with efficient Kalman filter-like recursions.
We derive a general site-based approach to approximate the non-Gaussian likelihood with local Gaussian terms, called sites.
Our approach results in a suite of novel sparse extensions to algorithms from both the machine learning and signal processing literature, including variational inference, expectation propagation, and the classical nonlinear Kalman smoothers.
The derived methods are suited to large time series and spatio-temporal data, where the model has separate inducing points in both time and space.
- Score: 18.999495374836584
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Approximate Bayesian inference methods that scale to very large datasets are
crucial in leveraging probabilistic models for real-world time series. Sparse
Markovian Gaussian processes combine the use of inducing variables with
efficient Kalman filter-like recursions, resulting in algorithms whose
computational and memory requirements scale linearly in the number of inducing
points, whilst also enabling parallel parameter updates and stochastic
optimisation. Under this paradigm, we derive a general site-based approach to
approximate inference, whereby we approximate the non-Gaussian likelihood with
local Gaussian terms, called sites. Our approach results in a suite of novel
sparse extensions to algorithms from both the machine learning and signal
processing literature, including variational inference, expectation
propagation, and the classical nonlinear Kalman smoothers. The derived methods
are suited to large time series, and we also demonstrate their applicability to
spatio-temporal data, where the model has separate inducing points in both time
and space.
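The linear-in-time scaling described in the abstract comes from replacing the cubic-cost GP posterior computation with Kalman filter-like recursions over the state-space form of the Markovian kernel. A minimal NumPy sketch of such a forward recursion is below; this is a generic linear-Gaussian filter for illustration, not the authors' site-based algorithm, and the model matrices `A`, `Q`, `H`, `R` are placeholders for whatever state-space representation the kernel induces.

```python
import numpy as np

def kalman_filter(y, A, Q, H, R, m0, P0):
    """Forward Kalman recursion for a linear-Gaussian state-space model.

    Cost is linear in the number of time steps, which is the property
    that sparse Markovian GP methods exploit.
    """
    m, P = m0, P0
    log_lik = 0.0
    means, covs = [], []
    for t in range(len(y)):
        # Predict step: propagate the state through the transition model.
        m = A @ m
        P = A @ P @ A.T + Q
        # Update step: condition on the observation y[t].
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        v = y[t] - H @ m                   # innovation (residual)
        m = m + K @ v
        P = P - K @ S @ K.T
        # Accumulate the marginal log-likelihood of y[t].
        log_lik += float(-0.5 * (v @ np.linalg.solve(S, v)
                                 + np.log(np.linalg.det(2 * np.pi * S))))
        means.append(m)
        covs.append(P)
    return np.array(means), np.array(covs), log_lik
```

Each step touches only the current state, so memory and compute grow linearly with the series length; a backward smoothing pass (not shown) would yield the full posterior marginals.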
Related papers
- Iterative Methods for Vecchia-Laplace Approximations for Latent Gaussian Process Models [11.141688859736805]
We introduce and analyze several preconditioners, derive new convergence results, and propose novel methods for accurately approximating predictive variances.
In particular, we obtain a speed-up of an order of magnitude compared to Cholesky-based calculations.
All methods are implemented in a free C++ software library with high-level Python and R packages.
arXiv Detail & Related papers (2023-10-18T14:31:16Z) - Low-rank extended Kalman filtering for online learning of neural
networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
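The low-rank plus diagonal decomposition mentioned above is what keeps the EKF update affordable: products and solves with a covariance of the form W Wᵀ + diag(d) cost O(nr) rather than O(n²) or O(n³). A hedged sketch of the two core operations follows; the Woodbury-based solve is a standard identity, not the paper's specific update rule.

```python
import numpy as np

def lowrank_diag_matvec(W, d, v):
    """Multiply (W W^T + diag(d)) by v in O(n*r) instead of O(n^2)."""
    return W @ (W.T @ v) + d * v

def lowrank_diag_solve(W, d, v):
    """Solve (W W^T + diag(d)) x = v via the Woodbury identity.

    Only an r x r system is factorised, where r is the rank of W.
    """
    Dinv_v = v / d
    Dinv_W = W / d[:, None]
    r = W.shape[1]
    M = np.eye(r) + W.T @ Dinv_W          # small r x r capacitance matrix
    return Dinv_v - Dinv_W @ np.linalg.solve(M, W.T @ Dinv_v)
```

With n parameters and rank r « n, this is the difference between a tractable and an intractable online posterior update.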
arXiv Detail & Related papers (2023-05-31T03:48:49Z) - Numerically Stable Sparse Gaussian Processes via Minimum Separation
using Cover Trees [57.67528738886731]
We study the numerical stability of scalable sparse approximations based on inducing points.
For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.
arXiv Detail & Related papers (2022-10-14T15:20:17Z) - Amortised inference of fractional Brownian motion with linear
computational complexity [0.0]
We introduce a simulation-based, amortised Bayesian inference scheme to infer the parameters of random walks.
Our approach learns the posterior distribution of the walks' parameters with a likelihood-free method.
We adapt this scheme to show that a finite decorrelation time in the environment can furthermore be inferred from individual trajectories.
arXiv Detail & Related papers (2022-03-15T14:43:16Z) - Spatio-Temporal Variational Gaussian Processes [26.60276485130467]
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural variational inference.
We derive a sparse approximation that constructs a state-space model over a reduced set of inducing points.
We show that for separable Markov kernels the full and sparse cases exactly recover the standard variational GP.
arXiv Detail & Related papers (2021-11-02T16:53:31Z) - Variational Inference for Continuous-Time Switching Dynamical Systems [29.984955043675157]
We present a model based on a Markov jump process modulating a subordinated diffusion process.
We develop a new continuous-time variational inference algorithm.
We extensively evaluate our algorithm under the model assumption and for real-world examples.
arXiv Detail & Related papers (2021-09-29T15:19:51Z) - The Connection between Discrete- and Continuous-Time Descriptions of
Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of "invariance under coarse-graining".
This result explains why combining differencing schemes for derivatives reconstruction and local-in-time inference approaches does not work for time series analysis of second or higher order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z) - Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
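The pathwise interpretation referenced above is usually expressed through Matheron's rule: a posterior sample is a prior sample plus a data-dependent correction, which avoids the cubic cost of sampling from the posterior's marginal distribution directly. The sketch below illustrates the idea with an exact (non-approximate) version on a toy RBF kernel; the kernel, noise level, and function names are illustrative assumptions, not the paper's decoupled basis-function scheme.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel on 1-D inputs (illustrative choice)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def pathwise_posterior_sample(x_train, y_train, x_test, noise=0.1, seed=0):
    """Draw one GP posterior sample via Matheron's rule:
        f* | y = f* + K*n (Knn + s^2 I)^{-1} (y - fn - eps),
    where (f*, fn) is a joint prior sample and eps ~ N(0, s^2 I).
    """
    rng = np.random.default_rng(seed)
    x_all = np.concatenate([x_test, x_train])
    K_all = rbf(x_all, x_all) + 1e-9 * np.eye(len(x_all))  # jitter
    # Joint prior sample over test and train inputs.
    f_all = np.linalg.cholesky(K_all) @ rng.standard_normal(len(x_all))
    f_test, f_train = f_all[:len(x_test)], f_all[len(x_test):]
    eps = noise * rng.standard_normal(len(x_train))
    Knn = rbf(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    Ktn = rbf(x_test, x_train)
    # Pathwise update: correct the prior sample toward the data.
    return f_test + Ktn @ np.linalg.solve(Knn, y_train - f_train - eps)
```

The paper's contribution is to make the prior-sample term cheap (e.g. with random features), so the whole draw scales much better than the naive cubic approach while conditioning stays exact.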
arXiv Detail & Related papers (2020-11-08T17:09:37Z) - Fast Variational Learning in State-Space Gaussian Process Models [29.630197272150003]
We build upon an existing method called conjugate-computation variational inference.
We provide an efficient JAX implementation which exploits just-in-time compilation.
Our approach leads to fast and stable variational inference in state-space GP models that can be scaled to time series with millions of data points.
arXiv Detail & Related papers (2020-07-09T12:06:34Z) - Real-Time Regression with Dividing Local Gaussian Processes [62.01822866877782]
Local Gaussian processes are a novel, computationally efficient modeling approach based on Gaussian process regression.
Due to an iterative, data-driven division of the input space, they achieve a sublinear computational complexity in the total number of training points in practice.
A numerical evaluation on real-world data sets shows their advantages over other state-of-the-art methods in terms of accuracy as well as prediction and update speed.
arXiv Detail & Related papers (2020-06-16T18:43:31Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper subproblems for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.
The site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.