Practical Conditional Neural Processes Via Tractable Dependent
Predictions
- URL: http://arxiv.org/abs/2203.08775v1
- Date: Wed, 16 Mar 2022 17:37:41 GMT
- Title: Practical Conditional Neural Processes Via Tractable Dependent
Predictions
- Authors: Stratis Markou and James Requeima and Wessel P. Bruinsma and Anna
Vaughan and Richard E. Turner
- Abstract summary: Conditional Neural Processes (CNPs) are meta-learning models which leverage the flexibility of deep learning to produce well-calibrated predictions.
CNPs do not produce correlated predictions, making them inappropriate for many estimation and decision making tasks.
We present a new class of Neural Process models that make correlated predictions and support exact maximum likelihood training.
- Score: 25.15531845287349
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conditional Neural Processes (CNPs; Garnelo et al., 2018a) are meta-learning
models which leverage the flexibility of deep learning to produce
well-calibrated predictions and naturally handle off-the-grid and missing data.
CNPs scale to large datasets and train with ease. Due to these features, CNPs
appear well-suited to tasks from environmental sciences or healthcare.
Unfortunately, CNPs do not produce correlated predictions, making them
fundamentally inappropriate for many estimation and decision making tasks.
Predicting heat waves or floods, for example, requires modelling dependencies
in temperature or precipitation over time and space. Existing approaches which
model output dependencies, such as Neural Processes (NPs; Garnelo et al.,
2018b) or the FullConvGNP (Bruinsma et al., 2021), are either complicated to
train or prohibitively expensive. What is needed is an approach which provides
dependent predictions, but is simple to train and computationally tractable. In
this work, we present a new class of Neural Process models that make correlated
predictions and support exact maximum likelihood training that is simple and
scalable. We extend the proposed models by using invertible output
transformations, to capture non-Gaussian output distributions. Our models can
be used in downstream estimation tasks which require dependent function
samples. By accounting for output dependencies, our models show improved
predictive performance on a range of experiments with synthetic and real data.
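The abstract does not spell out an implementation, but one minimal way to realise "correlated predictions with exact maximum likelihood training" is a CNP whose decoder outputs a mean together with a low-rank-plus-diagonal covariance over the target points, so the joint predictive is a multivariate Gaussian with a closed-form log-likelihood. The PyTorch sketch below follows that idea under stated assumptions: the DeepSet-style encoder, the class name LowRankGaussianNP, and all sizes (r_dim, rank) are illustrative inventions rather than the authors' architecture, and the paper's invertible output transformations for non-Gaussian marginals are omitted for brevity.

```python
# Minimal sketch (not the authors' code) of a CNP-style model whose decoder
# emits a mean plus a low-rank covariance factor over the target points, so
# the predictive is a correlated multivariate Gaussian with an exact
# log-likelihood. All names and sizes here are illustrative assumptions.
import torch
import torch.nn as nn
from torch.distributions import LowRankMultivariateNormal


def mlp(d_in, d_out, d_hid=128):
    return nn.Sequential(
        nn.Linear(d_in, d_hid), nn.ReLU(),
        nn.Linear(d_hid, d_hid), nn.ReLU(),
        nn.Linear(d_hid, d_out),
    )


class LowRankGaussianNP(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, r_dim=128, rank=16):
        super().__init__()
        self.rank = rank
        # DeepSet-style encoder: embed each (x, y) context pair, then mean-pool.
        self.encoder = mlp(x_dim + y_dim, r_dim)
        # Decoder maps (context representation, target input) to a mean,
        # a marginal noise scale, and `rank` covariance-factor entries.
        self.decoder = mlp(r_dim + x_dim, y_dim * (2 + rank))

    def forward(self, xc, yc, xt):
        # xc: (n_ctx, x_dim), yc: (n_ctx, y_dim), xt: (n_tgt, x_dim)
        r = self.encoder(torch.cat([xc, yc], dim=-1)).mean(dim=0)  # (r_dim,)
        r = r.expand(xt.shape[0], -1)                              # (n_tgt, r_dim)
        out = self.decoder(torch.cat([r, xt], dim=-1))             # (n_tgt, 2 + rank), y_dim = 1 assumed
        mean, raw_scale, factor = out.split([1, 1, self.rank], dim=-1)
        noise_var = nn.functional.softplus(raw_scale).squeeze(-1) + 1e-4
        # Joint predictive over all targets: N(mean, F F^T + diag(noise_var)).
        return LowRankMultivariateNormal(mean.squeeze(-1), factor, noise_var)


# Exact maximum-likelihood training step on one toy task.
model = LowRankGaussianNP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
xc, yc = torch.randn(10, 1), torch.randn(10, 1)
xt, yt = torch.randn(20, 1), torch.randn(20, 1)
loss = -model(xc, yc, xt).log_prob(yt.squeeze(-1))  # joint, correlated log-likelihood
opt.zero_grad(); loss.backward(); opt.step()
```

The key design point this sketch illustrates is that, unlike a vanilla CNP with independent per-point Gaussians, the low-rank factor couples the target outputs, yet the log-likelihood remains exact and cheap to evaluate, so training stays a simple maximum-likelihood procedure.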
Related papers
- Computation-Aware Gaussian Processes: Model Selection And Linear-Time Inference [55.150117654242706]
We show that model selection for computation-aware GPs trained on 1.8 million data points can be done within a few hours on a single GPU.
As a result of this work, Gaussian processes can be trained on large-scale datasets without significantly compromising their ability to quantify uncertainty.
arXiv Detail & Related papers (2024-11-01T21:11:48Z) - Convolutional Conditional Neural Processes [6.532867867011488]
This thesis advances neural processes in three ways.
ConvNPs improve data efficiency by building in a symmetry called translation equivariance.
GNPs directly parametrise dependencies in the predictions of a neural process.
AR CNPs train a neural process without any modifications to the model or training procedure and, at test time, roll out the model in an autoregressive fashion.
arXiv Detail & Related papers (2024-08-18T19:53:38Z) - Autoregressive Conditional Neural Processes [20.587835119831595]
Conditional neural processes (CNPs) are attractive meta-learning models.
They produce well-calibrated predictions and are trainable via a simple maximum likelihood procedure.
CNPs are unable to model dependencies in their predictions.
We propose to change how CNPs are deployed at test time, without any modifications to the model or training procedure (a hedged code sketch of this autoregressive roll-out appears after this list).
arXiv Detail & Related papers (2023-03-25T13:34:12Z) - Latent Bottlenecked Attentive Neural Processes [71.18817592128207]
We present Latent Bottlenecked Attentive Neural Processes (LBANPs).
LBANPs have a querying computational complexity independent of the number of context datapoints.
We show LBANPs achieve results competitive with the state-of-the-art on meta-regression, image completion, and contextual multi-armed bandits.
arXiv Detail & Related papers (2022-11-15T19:21:41Z) - Conditional Neural Processes for Molecules [0.0]
Neural processes (NPs) are models for transfer learning with properties reminiscent of Gaussian Processes (GPs).
This paper applies the conditional neural process (CNP) to DOCKSTRING, a dataset of docking scores for benchmarking ML models.
CNPs show competitive performance in few-shot learning tasks relative to supervised learning baselines common in QSAR modelling, as well as to an alternative model for transfer learning based on pre-training and refining neural network regressors.
arXiv Detail & Related papers (2022-10-17T16:10:12Z) - Efficient Gaussian Neural Processes for Regression [7.149677544861951]
Conditional Neural Processes (CNPs) produce well-calibrated predictions, enable fast inference at test time, and are trainable via a simple maximum likelihood procedure.
A limitation of CNPs is their inability to model dependencies in the outputs.
We present an alternative way to model output dependencies which also lends itself to maximum likelihood training.
arXiv Detail & Related papers (2021-08-22T09:31:50Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - The Neural Coding Framework for Learning Generative Models [91.0357317238509]
We propose a novel neural generative model inspired by the theory of predictive processing in the brain.
In a similar way, artificial neurons in our generative model predict what neighboring neurons will do, and adjust their parameters based on how well the predictions matched reality.
arXiv Detail & Related papers (2020-12-07T01:20:38Z) - Bootstrapping Neural Processes [114.97111530885093]
Neural Processes (NPs) implicitly define a broad class of processes with neural networks.
NPs still rely on an assumption that uncertainty in processes is modeled by a single latent variable.
We propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap.
arXiv Detail & Related papers (2020-08-07T02:23:34Z) - Meta-Learning Stationary Stochastic Process Prediction with
Convolutional Neural Processes [32.02612871707347]
We propose ConvNP, which endows Neural Processes (NPs) with translation equivariance and extends convolutional conditional NPs to allow for dependencies in the predictive distribution.
We demonstrate the strong performance and generalization capabilities of ConvNPs on 1D regression, image completion, and various tasks with real-world spatio-temporal data.
arXiv Detail & Related papers (2020-07-02T18:25:27Z) - Parameter Space Factorization for Zero-Shot Learning across Tasks and
Languages [112.65994041398481]
We propose a Bayesian generative model for the space of neural parameters.
We infer the posteriors over such latent variables based on data from seen task-language combinations.
Our model yields comparable or better results than state-of-the-art, zero-shot cross-lingual transfer methods.
arXiv Detail & Related papers (2020-01-30T16:58:56Z)
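The autoregressive deployment mentioned in the Convolutional Conditional Neural Processes and Autoregressive Conditional Neural Processes entries above can be sketched in a few lines. The snippet below is a hedged illustration, not those papers' code: it assumes a `cnp` callable returning per-target Gaussian means and standard deviations (its internals are left unspecified), samples the target points one at a time, and feeds each sample back into the context so that the joint sample carries correlations even though the underlying model only outputs independent marginals.

```python
# Hedged sketch of autoregressive roll-out for a plain CNP (assumptions only):
# query one target at a time, sample from its Gaussian marginal, and append the
# sample to the context before the next query. This induces dependencies in the
# joint sample without changing the model or its training procedure.
import torch


def ar_rollout(cnp, xc, yc, xt):
    """Draw one correlated joint sample over targets xt from a plain CNP."""
    xs, ys = xc.clone(), yc.clone()
    samples = []
    for i in range(xt.shape[0]):
        x_next = xt[i:i + 1]                     # (1, x_dim)
        mean, std = cnp(xs, ys, x_next)          # per-point Gaussian marginal
        y_next = mean + std * torch.randn_like(std)
        samples.append(y_next)
        # Treat the sample as if it were an observed context point.
        xs = torch.cat([xs, x_next], dim=0)
        ys = torch.cat([ys, y_next], dim=0)
    return torch.cat(samples, dim=0)             # (n_tgt, y_dim)


# A stand-in "CNP" that ignores its inputs is enough to exercise the interface.
dummy_cnp = lambda xs, ys, xq: (torch.zeros(xq.shape[0], 1), torch.ones(xq.shape[0], 1))
sample = ar_rollout(dummy_cnp, torch.randn(5, 1), torch.randn(5, 1), torch.randn(8, 1))
```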