MAGMA: Inference and Prediction with Multi-Task Gaussian Processes
- URL: http://arxiv.org/abs/2007.10731v2
- Date: Tue, 24 May 2022 15:13:10 GMT
- Title: MAGMA: Inference and Prediction with Multi-Task Gaussian Processes
- Authors: Arthur Leroy and Pierre Latouche and Benjamin Guedj and Servane Gey
- Abstract summary: A novel multi-task Gaussian process (GP) framework is proposed, by using a common mean process for sharing information across tasks.
Our overall algorithm is called Magma (standing for Multi tAsk Gaussian processes with common MeAn).
- Score: 4.368185344922342
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: A novel multi-task Gaussian process (GP) framework is proposed, by using a
common mean process for sharing information across tasks. In particular, we
investigate the problem of time series forecasting, with the objective to
improve multiple-step-ahead predictions. The common mean process is defined as
a GP for which the hyper-posterior distribution is tractable. Therefore an EM
algorithm is derived for handling both hyper-parameters optimisation and
hyper-posterior computation. Unlike previous approaches in the literature, the
model fully accounts for uncertainty and can handle irregular grids of
observations while maintaining explicit formulations, by modelling the mean
process in a unified GP framework. Predictive analytical equations are
provided, integrating information shared across tasks through a relevant prior
mean. This approach greatly improves predictive performance, even far from
observations, and may significantly reduce the computational complexity
compared to traditional multi-task GP models. Our overall algorithm is called
\textsc{Magma} (standing for Multi tAsk Gaussian processes with common MeAn).
The quality of the mean process estimation, the predictive performance, and
comparisons to alternatives are assessed in various simulated scenarios and on
real datasets.
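To make the role of the common mean concrete, here is a minimal numpy sketch of GP prediction with a non-zero prior mean; it is not the authors' implementation, and the shared mean (which Magma would obtain from the EM-computed hyper-posterior of the common process) is stubbed as a fixed function.

    import numpy as np

    def rbf(x1, x2, var=1.0, ls=1.0):
        # Squared-exponential kernel on 1-D inputs.
        d = x1[:, None] - x2[None, :]
        return var * np.exp(-0.5 * (d / ls) ** 2)

    def gp_predict(x_train, y_train, x_test, mean_fn, noise=1e-2):
        # Standard GP predictive equations with a non-zero prior mean.
        # In Magma, mean_fn would be the hyper-posterior mean of the common
        # process, estimated across tasks by EM; here it is a stand-in.
        K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
        Ks = rbf(x_test, x_train)
        alpha = np.linalg.solve(K, y_train - mean_fn(x_train))
        mu = mean_fn(x_test) + Ks @ alpha                     # shared information enters here
        cov = rbf(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)
        return mu, cov

    # Far from the data, the predictive mean reverts to the shared mean
    # rather than to zero, which is what helps multi-step-ahead forecasts.
    x_tr = np.array([0.0, 0.5, 1.0]); y_tr = np.sin(x_tr) + 2.0
    shared_mean = lambda x: 2.0 * np.ones_like(x)  # stand-in for the EM estimate
    mu, _ = gp_predict(x_tr, y_tr, np.array([5.0]), shared_mean)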
Related papers
- It's All in the Mix: Wasserstein Machine Learning with Mixed Features [5.739657897440173]
We present a practically efficient algorithm to solve mixed-feature problems.
We demonstrate that our approach can significantly outperform existing methods that do not account for the presence of discrete features.
arXiv Detail & Related papers (2023-12-19T15:15:52Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice than their extrinsic counterparts.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A MOGP prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
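The mean-field updates themselves are beyond a short sketch, but the idea of a MOGP prior sharing information across heterogeneous likelihoods can be illustrated with a linear model of coregionalisation, one standard way of building such priors; the kernel, the mixing matrix A, and the three likelihoods below are illustrative assumptions, not the paper's exact construction.

    import numpy as np

    def rbf(x, ls=0.5):
        d = x[:, None] - x[None, :]
        return np.exp(-0.5 * (d / ls) ** 2)

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    L = np.linalg.cholesky(rbf(x) + 1e-8 * np.eye(len(x)))

    Q, D = 2, 3                                # shared latent GPs, heterogeneous tasks
    U = L @ rng.standard_normal((len(x), Q))   # independent latent GP draws
    A = rng.standard_normal((D, Q))            # mixing weights (learned in practice)
    F = U @ A.T                                # column d: latent function of task d

    # Each latent function parameterises a different likelihood:
    rate = np.exp(F[:, 0])             # intensity of a point-process (Cox) task
    prob = 1 / (1 + np.exp(-F[:, 1]))  # probability of a classification task
    y_reg = F[:, 2]                    # mean of a regression task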
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
The resulting Non-Gaussian Gaussian Processes (NGGPs) outperform competing state-of-the-art approaches on a diversified set of benchmarks and applications.
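The paper's mapping is an ODE-based flow; as a much simpler stand-in, the sketch below pushes a GP draw through an elementwise monotone (hence invertible) map and applies the same change-of-variables bookkeeping. The specific map g and the kernel are assumptions for illustration only.

    import numpy as np

    def rbf(x, ls=0.3):
        return np.exp(-0.5 * ((x[:, None] - x[None, :]) / ls) ** 2)

    def g(z):
        # g'(z) = 2 - tanh(z)^2 > 0, so g is strictly increasing and invertible.
        return z + np.tanh(z)

    def log_abs_det_jac(z):
        return np.sum(np.log(2.0 - np.tanh(z) ** 2))

    def gp_logpdf(z, K):
        # Log-density of a zero-mean multivariate Gaussian via Cholesky.
        L = np.linalg.cholesky(K)
        a = np.linalg.solve(L, z)
        return -0.5 * a @ a - np.log(np.diag(L)).sum() - 0.5 * len(z) * np.log(2 * np.pi)

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 30)
    K = rbf(x) + 1e-8 * np.eye(len(x))
    z = np.linalg.cholesky(K) @ rng.standard_normal(len(x))  # GP prior draw
    y = g(z)  # non-Gaussian marginals, dependence inherited from the GP
    log_p_y = gp_logpdf(z, K) - log_abs_det_jac(z)  # change of variables at z = g^{-1}(y)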
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Gaussian Processes to speed up MCMC with automatic exploratory-exploitation effect [1.0742675209112622]
We present a two-stage Metropolis-Hastings algorithm for sampling probabilistic models.
The key feature of the approach is the ability to learn the target distribution from scratch while sampling.
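A minimal sketch of the two-stage (delayed-acceptance) accept rule, assuming a symmetric Gaussian proposal: stage one screens proposals with a cheap surrogate of the log-target, and stage two corrects with the expensive exact evaluation so the chain still targets the true distribution. The paper additionally learns the GP surrogate online while sampling; here any cheap log_surrogate is taken as given.

    import numpy as np

    def two_stage_mh(log_target, log_surrogate, x0, n_steps, step=0.5, rng=None):
        rng = rng or np.random.default_rng()
        x = np.atleast_1d(np.asarray(x0, dtype=float))
        lt, ls = log_target(x), log_surrogate(x)
        chain = [x.copy()]
        for _ in range(n_steps):
            xp = x + step * rng.standard_normal(x.shape)
            lsp = log_surrogate(xp)
            if np.log(rng.random()) < lsp - ls:        # stage 1: surrogate screen
                ltp = log_target(xp)                   # expensive call, paid rarely
                if np.log(rng.random()) < (ltp - lt) - (lsp - ls):  # stage 2: exact correction
                    x, lt, ls = xp, ltp, lsp
            chain.append(x.copy())
        return np.array(chain)

    # Toy usage with a deliberately mismatched surrogate:
    chain = two_stage_mh(lambda t: -0.5 * t @ t, lambda t: -0.55 * t @ t,
                         np.zeros(2), 2000)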
arXiv Detail & Related papers (2021-09-28T17:43:25Z)
- On MCMC for variationally sparse Gaussian processes: A pseudo-marginal approach [0.76146285961466]
Gaussian processes (GPs) are frequently used in machine learning and statistics to construct powerful models.
We propose a pseudo-marginal (PM) scheme that offers exact inference as well as computational gains through doubly stochastic estimators of the likelihood when dealing with large datasets.
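A skeletal pseudo-marginal Metropolis-Hastings loop, with the estimator and proposal left as assumptions: log_lik_hat must return the log of an estimate that is unbiased on the likelihood scale (e.g. an importance-sampling average), and the current state's estimate is stored and reused, which is what keeps the exact posterior invariant despite the noise.

    import numpy as np

    def pseudo_marginal_mh(log_lik_hat, log_prior, theta0, n_steps, step=0.1, rng=None):
        rng = rng or np.random.default_rng()
        theta = np.atleast_1d(np.asarray(theta0, dtype=float))
        ll = log_lik_hat(theta, rng)          # noisy, unbiased likelihood estimate
        chain = [theta.copy()]
        for _ in range(n_steps):
            prop = theta + step * rng.standard_normal(theta.shape)
            ll_prop = log_lik_hat(prop, rng)  # fresh estimate for the proposal
            log_a = (ll_prop + log_prior(prop)) - (ll + log_prior(theta))
            if np.log(rng.random()) < log_a:
                theta, ll = prop, ll_prop     # do NOT re-estimate ll for the held state
            chain.append(theta.copy())
        return np.array(chain)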
arXiv Detail & Related papers (2021-03-04T20:48:29Z)
- Cluster-Specific Predictions with Multi-Task Gaussian Processes [4.368185344922342]
A model involving Gaussian processes (GPs) is introduced to handle multi-task learning, clustering, and prediction.
The model is instantiated as a mixture of multi-task GPs with common mean processes.
The overall algorithm, called MagmaClust, is publicly available as an R package.
arXiv Detail & Related papers (2020-11-16T11:08:59Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how an alternative, pathwise interpretation of conditioning, which updates a prior sample rather than sampling marginals directly, gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
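The decomposition behind this can be written as a Matheron-style update: a joint prior draw over training and test locations is corrected by the data residual. The exact joint draw below still costs O((n+m)^3); the paper's efficiency comes from cheaper (e.g. feature-based) approximations of the prior term, so this sketch only illustrates the update itself.

    import numpy as np

    def rbf(a, b, ls=0.3):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

    def pathwise_posterior_sample(x_train, y_train, x_test, noise=1e-2, rng=None):
        # f_post(.) = f_prior(.) + K(., X) (K(X, X) + s*I)^{-1} (y - f_prior(X) - eps)
        rng = rng or np.random.default_rng()
        n, m = len(x_train), len(x_test)
        x_all = np.concatenate([x_train, x_test])
        K_all = rbf(x_all, x_all) + 1e-8 * np.eye(n + m)
        f = np.linalg.cholesky(K_all) @ rng.standard_normal(n + m)  # joint prior draw
        f_tr, f_te = f[:n], f[n:]
        eps = np.sqrt(noise) * rng.standard_normal(n)
        K = rbf(x_train, x_train) + noise * np.eye(n)
        update = rbf(x_test, x_train) @ np.linalg.solve(K, y_train - f_tr - eps)
        return f_te + update

    # Each call returns one posterior function draw evaluated at x_test:
    draw = pathwise_posterior_sample(np.linspace(0, 1, 20),
                                     np.sin(6 * np.linspace(0, 1, 20)),
                                     np.linspace(0, 1, 100))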
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Real-Time Regression with Dividing Local Gaussian Processes [62.01822866877782]
Local Gaussian processes are a novel, computationally efficient modeling approach based on Gaussian process regression.
Due to an iterative, data-driven division of the input space, they achieve a sublinear computational complexity in the total number of training points in practice.
A numerical evaluation on real-world data sets shows their advantages over other state-of-the-art methods in terms of accuracy as well as prediction and update speed.
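A toy 1-D sketch of the dividing idea, with made-up details (median splits, prediction from the single containing region): each region keeps at most cap points and fits its own exact GP, so insertion and prediction touch only one small model, which is where the sublinear practical complexity comes from. The paper's actual division and aggregation rules differ.

    import numpy as np

    class DividingLocalGPs:
        def __init__(self, cap=50, ls=0.5, noise=1e-2):
            self.cap, self.ls, self.noise = cap, ls, noise
            self.leaves = [([], [])]             # (xs, ys) per region
            self.bounds = [(-np.inf, np.inf)]    # region [lo, hi) per leaf

        def _leaf(self, x):
            return next(i for i, (lo, hi) in enumerate(self.bounds) if lo <= x < hi)

        def add(self, x, y):
            i = self._leaf(x)
            xs, ys = self.leaves[i]
            xs.append(x); ys.append(y)
            if len(xs) > self.cap:               # divide this region at its median
                med = float(np.median(xs))
                lo, hi = self.bounds[i]
                keep = [(a, b) for a, b in zip(xs, ys) if a < med]
                move = [(a, b) for a, b in zip(xs, ys) if a >= med]
                self.bounds[i] = (lo, med)
                self.leaves[i] = ([a for a, _ in keep], [b for _, b in keep])
                self.bounds.insert(i + 1, (med, hi))
                self.leaves.insert(i + 1, ([a for a, _ in move], [b for _, b in move]))

        def predict(self, x):
            xs, ys = self.leaves[self._leaf(x)]
            if not xs:
                return 0.0                       # no local data yet: prior mean
            xs, ys = np.array(xs), np.array(ys)
            k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / self.ls) ** 2)
            K = k(xs, xs) + self.noise * np.eye(len(xs))
            return (k(np.array([x]), xs) @ np.linalg.solve(K, ys))[0]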
arXiv Detail & Related papers (2020-06-16T18:43:31Z)
- Beyond the Mean-Field: Structured Deep Gaussian Processes Improve the Predictive Uncertainties [12.068153197381575]
We propose a novel variational family that allows for retaining covariances between latent processes while achieving fast convergence.
We provide an efficient implementation of our new approach and apply it to several benchmark datasets.
It yields excellent results and strikes a better balance between accuracy and calibrated uncertainty estimates than its state-of-the-art alternatives.
arXiv Detail & Related papers (2020-05-22T11:10:59Z)
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper subproblems for lower bounding.
In total, the proposed method reduces convergence times by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.