Sparse Gaussian Neural Processes
- URL: http://arxiv.org/abs/2504.01650v2
- Date: Thu, 24 Apr 2025 15:21:20 GMT
- Title: Sparse Gaussian Neural Processes
- Authors: Tommy Rochussen, Vincent Fortuin
- Abstract summary: We introduce a family of models that meta-learn sparse Gaussian process inference. This enables rapid prediction on new tasks with sparse Gaussian processes. It also allows manual elicitation of priors in a neural process for the first time.
- Score: 7.050045034682338
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Despite significant recent advances in probabilistic meta-learning, it is common for practitioners to avoid using deep learning models due to a comparative lack of interpretability. Instead, many practitioners simply use non-meta-models such as Gaussian processes with interpretable priors, and conduct the tedious procedure of training their model from scratch for each task they encounter. While this is justifiable for tasks with a limited number of data points, the cubic computational cost of exact Gaussian process inference renders this prohibitive when each task has many observations. To remedy this, we introduce a family of models that meta-learn sparse Gaussian process inference. Not only does this enable rapid prediction on new tasks with sparse Gaussian processes, but since our models have clear interpretations as members of the neural process family, it also allows manual elicitation of priors in a neural process for the first time. In meta-learning regimes for which the number of observed tasks is small or for which expert domain knowledge is available, this offers a crucial advantage.
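To make the computational claim concrete, the sketch below shows why inducing points help: exact GP regression solves an n × n linear system (O(n³) time), whereas a sparse approximation with m ≪ n inducing inputs only solves m × m systems (O(nm²) overall). This is a generic Titsias-style sparse predictor, not the paper's meta-learned model; the function names, RBF kernel, and jitter value are illustrative assumptions.

```python
import numpy as np

def rbf(A, B, ls=1.0, var=1.0):
    """Squared-exponential kernel matrix between row-wise input sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

def sparse_gp_predict(X, y, Z, Xstar, noise=0.1):
    """Inducing-point GP prediction: with m inducing inputs Z, the dominant
    cost is O(n m^2) rather than the O(n^3) of exact GP inference."""
    m = len(Z)
    Kuu = rbf(Z, Z) + 1e-6 * np.eye(m)            # m x m, with jitter
    Kuf = rbf(Z, X)                               # m x n
    Kus = rbf(Z, Xstar)                           # m x s
    A = Kuu + Kuf @ Kuf.T / noise**2              # the only n-dependent O(n m^2) step
    mean = Kus.T @ np.linalg.solve(A, Kuf @ y) / noise**2
    q = np.linalg.solve(A, Kus)                   # A^{-1} Kus
    var = (rbf(Xstar, Xstar).diagonal()
           - (Kus * np.linalg.solve(Kuu, Kus)).sum(0)  # Nystrom correction
           + (Kus * q).sum(0))                         # uncertainty over inducing values
    return mean, var
```

For instance, with n = 10,000 training points and m = 100 inducing inputs, every linear solve above involves only a 100 × 100 matrix.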
Related papers
- Computation-Aware Gaussian Processes: Model Selection And Linear-Time Inference [55.150117654242706]
We show that model selection for computation-aware GPs trained on 1.8 million data points can be done within a few hours on a single GPU.
As a result of this work, Gaussian processes can be trained on large-scale datasets without significantly compromising their ability to quantify uncertainty.
arXiv Detail & Related papers (2024-11-01T21:11:48Z)
- Linearization Turns Neural Operators into Function-Valued Gaussian Processes [23.85470417458593]
We introduce LUNO, a novel framework for approximate Bayesian uncertainty quantification in trained neural operators. Our approach leverages model linearization to push (Gaussian) weight-space uncertainty forward to the neural operator's predictions. We show that this can be interpreted as a probabilistic version of the concept of currying from functional programming, yielding a function-valued (Gaussian) random process belief.
arXiv Detail & Related papers (2024-06-07T16:43:54Z)
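For context on the linearization idea in the entry above: approximating the model to first order around the posterior weight mean turns a Gaussian belief over weights into a Gaussian belief over outputs. A minimal generic sketch, not LUNO's actual API; linearized_pushforward and the toy model are hypothetical names:

```python
import numpy as np

def linearized_pushforward(f, jac, w_mean, w_cov, x):
    """Push N(w_mean, w_cov) over weights through f via first-order
    linearization: outputs are approximately N(f(x, w_mean), J w_cov J^T)."""
    J = jac(x, w_mean)            # (n_outputs, n_weights) Jacobian at the mean
    return f(x, w_mean), J @ w_cov @ J.T

# Toy model with an analytic Jacobian: f(x, w) = w[0] * sin(x) + w[1]
f = lambda x, w: w[0] * np.sin(x) + w[1]
jac = lambda x, w: np.stack([np.sin(x), np.ones_like(x)], axis=1)
x = np.linspace(0.0, 3.0, 5)
mean, cov = linearized_pushforward(f, jac, np.array([1.0, 0.0]), 0.1 * np.eye(2), x)
```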
- Mixtures of Gaussian process experts based on kernel stick-breaking processes [0.6396288020763143]
We propose a new mixture model of Gaussian process experts based on kernel stick-breaking processes.
Our model maintains the intuitive appeal of existing models while improving their performance.
The model behaviour and improved predictive performance are demonstrated in experiments using six datasets.
arXiv Detail & Related papers (2023-04-26T21:23:01Z)
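For context, kernel stick-breaking makes mixture weights input-dependent: expert k claims a fraction v_k · K(x, c_k) of whatever probability stick remains. Below is a minimal sketch of that weight construction for 1-D inputs; the names sticks and centers are mine, and the leftover stick mass would usually go to a final catch-all expert:

```python
import numpy as np

def kernel_stick_breaking_weights(x, centers, sticks, ls=1.0):
    """Input-dependent mixture weights pi_k(x) = v_k K(x, c_k) *
    prod_{j<k} (1 - v_j K(x, c_j))  (kernel stick-breaking construction)."""
    Kmat = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / ls) ** 2)  # (n, K)
    broken = sticks[None, :] * Kmat               # fraction each expert breaks off
    remain = np.cumprod(1.0 - broken, axis=1)     # stick left after experts 1..k
    remain = np.hstack([np.ones((len(x), 1)), remain[:, :-1]])  # shift to j < k
    return broken * remain                        # rows sum to <= 1

x = np.linspace(-2.0, 2.0, 7)
w = kernel_stick_breaking_weights(x, centers=np.array([-1.0, 0.0, 1.0]),
                                  sticks=np.array([0.7, 0.5, 0.9]))
```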
- MARS: Meta-Learning as Score Matching in the Function Space [79.73213540203389]
We present a novel approach to extracting inductive biases from a set of related datasets.
We use functional Bayesian neural network inference, which views the prior as a process and performs inference in the function space.
Our approach can seamlessly acquire and represent complex prior knowledge by meta-learning the score function of the data-generating process.
arXiv Detail & Related papers (2022-10-24T15:14:26Z)
- Can deep neural networks learn process model structure? An assessment framework and analysis [0.2580765958706854]
We propose an evaluation scheme complemented with new fitness, precision, and generalisation metrics.
We apply this framework to several process models with simple control-flow behaviour.
Our results show that, even for such simplistic models, careful tuning of overfitting countermeasures is required.
arXiv Detail & Related papers (2022-02-24T09:44:13Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Bayesian Meta-Learning Through Variational Gaussian Processes [0.0]
We extend Gaussian-process-based meta-learning to allow for high-quality, arbitrary non-Gaussian uncertainty predictions.
Our method performs significantly better than existing Bayesian meta-learning baselines.
arXiv Detail & Related papers (2021-10-21T10:44:23Z)
- Last Layer Marginal Likelihood for Invariance Learning [12.00078928875924]
We introduce a new lower bound to the marginal likelihood, which allows us to perform inference for a larger class of likelihood functions.
We work towards bringing this approach to neural networks by using an architecture with a Gaussian process in the last layer.
arXiv Detail & Related papers (2021-06-14T15:40:51Z)
- Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings.
arXiv Detail & Related papers (2020-10-29T13:08:07Z)
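The graph construction summarized above has a short spectral form: apply the Matérn filter to the graph Laplacian's eigenvalues. A minimal sketch assuming the graph is small enough for a full eigendecomposition; the normalization and parameter names (nu for smoothness, kappa as a lengthscale-like scale) follow common practice rather than any specific codebase:

```python
import numpy as np

def graph_matern_kernel(L, nu=1.5, kappa=1.0, sigma2=1.0):
    """Matern-type kernel over graph nodes from the graph Laplacian L:
    K = sigma2 * (2 nu / kappa^2 * I + L)^(-nu), up to normalization."""
    lam, U = np.linalg.eigh(L)                     # Laplacian eigenpairs
    spec = (2.0 * nu / kappa**2 + lam) ** (-nu)    # Matern spectral filter
    K = (U * spec) @ U.T
    return sigma2 * K / np.mean(np.diag(K))        # normalize average variance

# Toy usage: Laplacian of a 4-node path graph
A = np.diag(np.ones(3), 1); A = A + A.T
L = np.diag(A.sum(1)) - A
K = graph_matern_kernel(L)
```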
- Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
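The decoupled representation referenced above admits a compact sketch via Matheron's rule: draw approximate prior functions with random Fourier features, then correct them toward the data in the kernel basis. The feature count, RBF kernel, and function names below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def rbf(A, B, ls=1.0):
    """Squared-exponential kernel matrix (unit variance)."""
    return np.exp(-0.5 * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1) / ls**2)

def decoupled_posterior_samples(X, y, Xstar, noise=0.1, n_feat=512, n_samp=8, ls=1.0):
    """Pathwise posterior sampling via Matheron's rule: posterior draw =
    weight-space prior draw (random Fourier features) + kernel-basis update."""
    d = X.shape[1]
    W = np.random.randn(n_feat, d) / ls                 # RFF frequencies for the RBF kernel
    b = np.random.uniform(0.0, 2 * np.pi, n_feat)
    phi = lambda Z: np.sqrt(2.0 / n_feat) * np.cos(Z @ W.T + b)
    w = np.random.randn(n_feat, n_samp)                 # weight-space prior samples
    f_prior_X, f_prior_s = phi(X) @ w, phi(Xstar) @ w   # coupled prior function draws
    eps = noise * np.random.randn(len(X), n_samp)       # simulated observation noise
    resid = y[:, None] - f_prior_X - eps                # data-fit residual per sample
    Kxx = rbf(X, X, ls) + noise**2 * np.eye(len(X))
    return f_prior_s + rbf(Xstar, X, ls) @ np.linalg.solve(Kxx, resid)
```

The one O(n³) solve is shared across all samples, which is what makes this cheaper than repeatedly sampling from the exact posterior Gaussian.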