GPflux: A Library for Deep Gaussian Processes
- URL: http://arxiv.org/abs/2104.05674v1
- Date: Mon, 12 Apr 2021 17:41:18 GMT
- Title: GPflux: A Library for Deep Gaussian Processes
- Authors: Vincent Dutordoir, Hugh Salimbeni, Eric Hambro, John McLeod, Felix
Leibfried, Artem Artemev, Mark van der Wilk, James Hensman, Marc P.
Deisenroth, ST John
- Abstract summary: GPflux is a Python library for Bayesian deep learning with a strong emphasis on deep Gaussian processes (DGPs).
It is compatible with and built on top of the Keras deep learning eco-system.
GPflux relies on GPflow for most of its GP objects and operations, which makes it an efficient, modular and extendable library.
- Score: 31.207566616050574
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce GPflux, a Python library for Bayesian deep learning with a
strong emphasis on deep Gaussian processes (DGPs). Implementing DGPs is a
challenging endeavour due to the various mathematical subtleties that arise
when dealing with multivariate Gaussian distributions and the complex
bookkeeping of indices. To date, there are no actively maintained, open-sourced
and extendable libraries available that support research activities in this
area. GPflux aims to fill this gap by providing a library with state-of-the-art
DGP algorithms, as well as building blocks for implementing novel Bayesian and
GP-based hierarchical models and inference schemes. GPflux is compatible with
and built on top of the Keras deep learning eco-system. This enables
practitioners to leverage tools from the deep learning community for building
and training customised Bayesian models, and create hierarchical models that
consist of Bayesian and standard neural network layers in a single coherent
framework. GPflux relies on GPflow for most of its GP objects and operations,
which makes it an efficient, modular and extensible library, while having a
lean codebase.
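To make the workflow concrete, the sketch below assembles a two-layer DGP from GP layers and a likelihood layer and trains it through the standard Keras fit interface. It is a minimal sketch based on the publicly documented GPflux API around the time of this paper; exact class names, signatures, and defaults may differ across versions, and the toy data is purely illustrative.

```python
import numpy as np
import tensorflow as tf
import gpflow
import gpflux

# Toy 1D regression data (illustrative only).
X = np.linspace(0, 1, 100).reshape(-1, 1)
Y = np.sin(10 * X) + 0.1 * np.random.randn(*X.shape)
num_data = X.shape[0]
Z = np.linspace(0, 1, 20).reshape(-1, 1)  # inducing input locations

# Each GP layer is a sparse variational GP with its own kernel and inducing points.
layer1 = gpflux.layers.GPLayer(
    gpflow.kernels.SquaredExponential(),
    gpflow.inducing_variables.InducingPoints(Z.copy()),
    num_data=num_data,
    num_latent_gps=1,
)
layer2 = gpflux.layers.GPLayer(
    gpflow.kernels.SquaredExponential(),
    gpflow.inducing_variables.InducingPoints(Z.copy()),
    num_data=num_data,
    num_latent_gps=1,
    mean_function=gpflow.mean_functions.Zero(),
)

# The likelihood layer maps latent function values to observations.
likelihood_layer = gpflux.layers.LikelihoodLayer(gpflow.likelihoods.Gaussian(0.1))

# Compose the layers into a DGP and train it as a regular Keras model.
dgp = gpflux.models.DeepGP([layer1, layer2], likelihood_layer)
model = dgp.as_training_model()
model.compile(tf.optimizers.Adam(learning_rate=0.01))
model.fit({"inputs": X, "targets": Y}, epochs=200, verbose=0)
```

Because the DGP is exposed as a Keras model, standard Keras callbacks, optimizers, and monitoring tools can be reused unchanged, which is the integration point the abstract emphasises.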
Related papers
- GPTreeO: An R package for continual regression with dividing local Gaussian processes [0.0]
We introduce GPTreeO, a flexible R package for scalable Gaussian process (GP) regression.
GPTreeO builds upon the Dividing Local Gaussian Processes (DLGP) algorithm, in which a binary tree of local GP regressors is dynamically constructed.
We conduct a sensitivity analysis to show how GPTreeO's features impact the regression performance in a continual learning setting.
arXiv Detail & Related papers (2024-10-01T19:33:39Z)
- GP+: A Python Library for Kernel-based learning via Gaussian Processes [0.0]
We introduce GP+, an open-source library for kernel-based learning via Gaussian processes (GPs).
GP+ is built on PyTorch and provides a user-friendly and object-oriented tool for probabilistic learning and inference.
arXiv Detail & Related papers (2023-12-12T19:39:40Z)
- Weighted Ensembles for Active Learning with Adaptivity [60.84896785303314]
This paper presents an ensemble of GP models with weights adapted to the labeled data collected incrementally.
Building on this novel EGP model, a suite of acquisition functions emerges based on the uncertainty and disagreement rules.
An adaptively weighted ensemble of EGP-based acquisition functions is also introduced to further robustify performance.
arXiv Detail & Related papers (2022-06-10T11:48:49Z)
- Scaling Gaussian Process Optimization by Evaluating a Few Unique Candidates Multiple Times [119.41129787351092]
We show that sequential black-box optimization based on GPs can be made efficient by sticking to a candidate solution for multiple evaluation steps.
We modify two well-established GP-Opt algorithms, GP-UCB and GP-EI, to adapt rules from batched GP-Opt.
arXiv Detail & Related papers (2022-01-30T20:42:14Z)
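As a loose, generic illustration of the idea summarised above (sticking to a selected candidate for several evaluations before refitting the surrogate), the sketch below runs a plain GP-UCB loop with GPflow's exact GPR model. It is not the paper's algorithm or its batched rules; the objective, the repeat count B, and the UCB coefficient are arbitrary assumptions.

```python
import numpy as np
import gpflow

def objective(x):
    # Hypothetical noisy black-box function (illustration only).
    return np.sin(3 * x) + 0.1 * np.random.randn(*x.shape)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(3, 1))            # initial design
Y = objective(X)
candidates = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
B = 5                                             # evaluations per selected candidate

for _ in range(10):
    # Refit an exact GP on all observations gathered so far.
    model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.SquaredExponential())
    gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

    # Standard UCB rule: pick the candidate maximising mean + beta * std.
    mean, var = model.predict_f(candidates)
    ucb = mean.numpy() + 2.0 * np.sqrt(var.numpy())
    x_next = candidates[[np.argmax(ucb)]]

    # "Stick" to the chosen candidate: evaluate it B times before refitting.
    X_rep = np.repeat(x_next, B, axis=0)
    X = np.vstack([X, X_rep])
    Y = np.vstack([Y, objective(X_rep)])
```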
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
Non-Gaussian Gaussian Processes (NGGPs) outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- GP-Tree: A Gaussian Process Classifier for Few-Shot Incremental Learning [23.83961717568121]
GP-Tree is a novel method for multi-class classification with Gaussian processes and deep kernel learning.
We develop a tree-based hierarchical model in which each internal node fits a GP to the data.
Our method scales well with both the number of classes and data size.
arXiv Detail & Related papers (2021-02-15T22:16:27Z)
- Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
arXiv Detail & Related papers (2020-06-18T14:24:05Z)
- A Framework for Interdomain and Multioutput Gaussian Processes [22.62911488724047]
We present a mathematical and software framework for scalable approximate inference in GPs.
Our framework, implemented in GPflow, provides a unified interface for many existing multioutput models.
arXiv Detail & Related papers (2020-03-02T16:24:59Z)
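As a rough sketch of the kind of unified multioutput interface described above, the snippet below builds a sparse variational GP with several outputs in GPflow, combining independent kernels with shared inducing inputs. Class names follow the GPflow 2.x documentation; the dimensions and data are arbitrary assumptions, and the exact configuration is not taken from the paper.

```python
import numpy as np
import gpflow

D, P, M = 2, 3, 20  # input dim, number of outputs, number of inducing points (arbitrary)

# One kernel per output, combined into a single multioutput kernel.
kernel = gpflow.kernels.SeparateIndependent(
    [gpflow.kernels.SquaredExponential() for _ in range(P)]
)

# All outputs share the same set of inducing inputs.
Z = np.random.rand(M, D)
inducing_variable = gpflow.inducing_variables.SharedIndependentInducingVariables(
    gpflow.inducing_variables.InducingPoints(Z)
)

# Sparse variational GP with P latent processes; the appropriate multioutput
# conditional is dispatched behind this single interface.
model = gpflow.models.SVGP(
    kernel,
    gpflow.likelihoods.Gaussian(),
    inducing_variable=inducing_variable,
    num_latent_gps=P,
)

Xnew = np.random.rand(5, D)
mean, var = model.predict_f(Xnew)  # shape (5, P): one column per output
```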
- MOGPTK: The Multi-Output Gaussian Process Toolkit [71.08576457371433]
We present MOGPTK, a Python package for multi-channel data modelling using Gaussian processes (GP).
The aim of this toolkit is to make multi-output GP (MOGP) models accessible to researchers, data scientists, and practitioners alike.
arXiv Detail & Related papers (2020-02-09T23:34:49Z)