Practical Equivariances via Relational Conditional Neural Processes
- URL: http://arxiv.org/abs/2306.10915v2
- Date: Sun, 5 Nov 2023 13:33:00 GMT
- Title: Practical Equivariances via Relational Conditional Neural Processes
- Authors: Daolang Huang, Manuel Haussmann, Ulpu Remes, ST John, Grégoire Clarté, Kevin Sebastian Luck, Samuel Kaski, Luigi Acerbi
- Abstract summary: Conditional Neural Processes (CNPs) are a class of metalearning models popular for combining the runtime efficiency of amortized inference with reliable uncertainty quantification.
We propose Relational Conditional Neural Processes (RCNPs) as an effective approach to incorporate equivariances into any neural process model.
Our proposed method extends the applicability and impact of equivariant neural processes to higher dimensions.
- Score: 20.192899181958264
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conditional Neural Processes (CNPs) are a class of metalearning models
popular for combining the runtime efficiency of amortized inference with
reliable uncertainty quantification. Many relevant machine learning tasks, such
as in spatio-temporal modeling, Bayesian Optimization and continuous control,
inherently contain equivariances -- for example to translation -- which the
model can exploit for maximal performance. However, prior attempts to include
equivariances in CNPs do not scale effectively beyond two input dimensions. In
this work, we propose Relational Conditional Neural Processes (RCNPs), an
effective approach to incorporate equivariances into any neural process model.
Our proposed method extends the applicability and impact of equivariant neural
processes to higher dimensions. We empirically demonstrate the competitive
performance of RCNPs on a large array of tasks naturally containing
equivariances.
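To make the core idea concrete: translation equivariance can be obtained by encoding each context-target pair through the relative difference x_target - x_context rather than through absolute locations. Below is a minimal PyTorch sketch under that assumption; all layer sizes and names are illustrative, not the authors' exact architecture, and the paper's relational encoding is more general.

```python
import torch
import torch.nn as nn

class TranslationEquivariantCNP(nn.Module):
    """Minimal sketch of a relational encoding for a CNP.

    Each (context, target) pair is summarized via the relative
    difference x_target - x_context plus the context value, so the
    prediction cannot depend on absolute input locations.
    """

    def __init__(self, dim_x, dim_y, dim_h=128):
        super().__init__()
        # Relational encoder: consumes (x_t - x_c, y_c) for each pair.
        self.encoder = nn.Sequential(
            nn.Linear(dim_x + dim_y, dim_h), nn.ReLU(),
            nn.Linear(dim_h, dim_h),
        )
        # Decoder maps the aggregated representation to mean / log-variance.
        self.decoder = nn.Sequential(
            nn.Linear(dim_h, dim_h), nn.ReLU(),
            nn.Linear(dim_h, 2 * dim_y),
        )

    def forward(self, xc, yc, xt):
        # xc: (N, dx), yc: (N, dy), xt: (M, dx)
        diff = xt[:, None, :] - xc[None, :, :]              # (M, N, dx)
        pair = torch.cat([diff, yc[None].expand(xt.size(0), -1, -1)], dim=-1)
        rep = self.encoder(pair).mean(dim=1)                # aggregate over context
        mean, log_var = self.decoder(rep).chunk(2, dim=-1)
        return mean, log_var.exp()

# Shifting every input by the same delta leaves predictions unchanged:
model = TranslationEquivariantCNP(dim_x=3, dim_y=1)
xc, yc, xt = torch.randn(10, 3), torch.randn(10, 1), torch.randn(5, 3)
delta = torch.randn(3)
m1, _ = model(xc, yc, xt)
m2, _ = model(xc + delta, yc, xt + delta)
assert torch.allclose(m1, m2, atol=1e-4)
```

The final assertion checks the equivariance property directly: because only differences enter the encoder, a shared shift of context and target inputs cancels out.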
Related papers
- Switchable Decision: Dynamic Neural Generation Networks [98.61113699324429]
We propose a switchable decision mechanism that accelerates inference by dynamically assigning computation resources to each data instance.
Our method reduces inference cost while maintaining the same accuracy.
arXiv Detail & Related papers (2024-05-07T17:44:54Z) - Spectral Convolutional Conditional Neural Processes [4.52069311861025]
Conditional Neural Processes (CNPs) constitute a family of probabilistic models that harness the flexibility of neural networks to parameterize processes.
We propose Spectral Convolutional Conditional Neural Processes (SConvCNPs), a new addition to the NP family that allows for more efficient representation of functions in the frequency domain.
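The paper's exact construction aside, the frequency-domain idea can be sketched as a spectral convolution layer in the spirit of Fourier neural operators: take an FFT of the signal, apply learned complex weights to the lowest modes, and transform back. All names and sizes below are illustrative assumptions, not SConvCNP's actual architecture.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Sketch of a 1D spectral convolution: convolution implemented as
    pointwise multiplication in the Fourier domain, keeping only the
    first `n_modes` frequencies."""

    def __init__(self, channels, n_modes):
        super().__init__()
        self.n_modes = n_modes
        # Learned complex weights for the retained low-frequency modes.
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, n_modes, dtype=torch.cfloat)
        )

    def forward(self, x):
        # x: (batch, channels, length)
        x_hat = torch.fft.rfft(x)                          # (B, C, L//2 + 1)
        out_hat = torch.zeros_like(x_hat)
        # Mix channels mode-by-mode for the retained frequencies.
        out_hat[..., :self.n_modes] = torch.einsum(
            "bim,oim->bom", x_hat[..., :self.n_modes], self.weight
        )
        return torch.fft.irfft(out_hat, n=x.size(-1))      # back to spatial domain

layer = SpectralConv1d(channels=8, n_modes=12)
y = layer(torch.randn(4, 8, 64))                           # (4, 8, 64)
```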
arXiv Detail & Related papers (2024-04-19T21:13:18Z) - kNN Algorithm for Conditional Mean and Variance Estimation with
Automated Uncertainty Quantification and Variable Selection [8.429136647141487]
We introduce a kNN-based regression method that combines the scalability and adaptability of traditional non-parametric kNN models with automated uncertainty quantification and variable selection.
This method focuses on accurately estimating the conditional mean and variance of random response variables.
It is particularly notable in biomedical applications as demonstrated in two case studies.
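The basic estimator is easy to sketch: take the k nearest neighbours of a query point and use their sample mean and sample variance as estimates of the conditional mean and variance. The paper's additions (automated uncertainty quantification and variable selection) are not shown; the helper name and k value below are illustrative.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_mean_variance(X_train, y_train, X_query, k=25):
    """Estimate E[Y | X = x] and Var[Y | X = x] from the k nearest
    neighbours of each query point (basic idea only)."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    _, idx = nn.kneighbors(X_query)           # (n_query, k) neighbour indices
    neighbour_y = y_train[idx]                # responses of the k neighbours
    cond_mean = neighbour_y.mean(axis=1)
    cond_var = neighbour_y.var(axis=1, ddof=1)
    return cond_mean, cond_var

# Toy heteroscedastic example: Var[Y | X = x] grows with |x|.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(2000, 1))
y = np.sin(3 * X[:, 0]) + np.abs(X[:, 0]) * rng.normal(size=2000)
mu, var = knn_mean_variance(X, y, np.array([[0.0], [1.5]]))
```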
arXiv Detail & Related papers (2024-02-02T18:54:18Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on the fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) was proposed by formulating the objective as the logistic loss between real data and artificial noise.
In this paper, we study a direct approach for optimizing the negative log-likelihood of unnormalized models.
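For reference, the classic NCE objective mentioned above can be sketched as a logistic loss that classifies real samples against noise samples, with the log normalizing constant treated as a learnable parameter. This illustrates the baseline, not the paper's compositional optimization method; the function and toy model below are assumptions for illustration.

```python
import torch

def nce_loss(log_unnorm_model, x_data, x_noise, log_noise_density, log_c):
    """Classic NCE with one noise sample per data sample:
    logit of P(real | x) = log p_model(x) - log p_noise(x)."""
    logit_data = log_unnorm_model(x_data) + log_c - log_noise_density(x_data)
    logit_noise = log_unnorm_model(x_noise) + log_c - log_noise_density(x_noise)
    loss_data = torch.nn.functional.softplus(-logit_data).mean()   # -log sigmoid
    loss_noise = torch.nn.functional.softplus(logit_noise).mean()  # -log(1 - sigmoid)
    return loss_data + loss_noise

# Toy example: fit an unnormalized Gaussian exp(-(x - mu)^2 / 2) to data.
mu = torch.zeros(1, requires_grad=True)
log_c = torch.zeros(1, requires_grad=True)   # learned log normalizing constant
noise = torch.distributions.Normal(0.0, 3.0)
opt = torch.optim.Adam([mu, log_c], lr=0.05)
x_data = torch.randn(512) + 1.0
for _ in range(500):
    x_noise = noise.sample((512,))
    loss = nce_loss(lambda x: -(x - mu) ** 2 / 2, x_data, x_noise,
                    noise.log_prob, log_c)
    opt.zero_grad()
    loss.backward()
    opt.step()
```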
arXiv Detail & Related papers (2023-06-13T01:18:16Z) - Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z) - MoEfication: Conditional Computation of Transformer Models for Efficient
Inference [66.56994436947441]
Transformer-based pre-trained language models achieve superior performance on most NLP tasks due to their large parameter capacity, but this also incurs a huge computational cost.
We explore accelerating large-model inference through conditional computation based on the sparse activation phenomenon.
We propose to transform a large model into its mixture-of-experts (MoE) version with equal model size, namely MoEfication.
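A hedged sketch of the general idea: partition a trained FFN's hidden neurons into expert groups and, per token, run only the top-scoring groups chosen by a router. The paper's neuron clustering and router construction are more elaborate; the class below is illustrative only.

```python
import torch
import torch.nn as nn

class MoEfiedFFN(nn.Module):
    """Sketch of MoEfication's idea: split an FFN's hidden layer into
    `n_experts` equal groups and activate only `top_k` groups per token,
    keeping total model size unchanged."""

    def __init__(self, ffn_in, ffn_hidden, n_experts=8, top_k=2):
        super().__init__()
        assert ffn_hidden % n_experts == 0
        self.n_experts, self.top_k = n_experts, top_k
        size = ffn_hidden // n_experts
        # Each expert owns a slice of the original FFN's weight matrices.
        self.up = nn.ModuleList(nn.Linear(ffn_in, size) for _ in range(n_experts))
        self.down = nn.ModuleList(nn.Linear(size, ffn_in) for _ in range(n_experts))
        self.router = nn.Linear(ffn_in, n_experts)  # predicts which groups activate

    def forward(self, x):
        # x: (tokens, ffn_in). Pick the top_k expert groups per token.
        scores = self.router(x)                              # (T, E)
        _, top_idx = scores.topk(self.top_k, dim=-1)
        out = torch.zeros_like(x)
        for e in range(self.n_experts):
            mask = (top_idx == e).any(dim=-1)                # tokens routed to e
            if mask.any():
                h = torch.relu(self.up[e](x[mask]))
                out[mask] += self.down[e](h)
        return out

ffn = MoEfiedFFN(ffn_in=256, ffn_hidden=1024)
y = ffn(torch.randn(32, 256))                                # (32, 256)
```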
arXiv Detail & Related papers (2021-10-05T02:14:38Z) - Efficient Gaussian Neural Processes for Regression [7.149677544861951]
Conditional Neural Processes (CNPs) produce well-calibrated predictions, enable fast inference at test time, and are trainable via a simple maximum likelihood procedure.
A limitation of CNPs is their inability to model dependencies in the outputs.
We present an alternative way to model output dependencies which also lends itself to maximum likelihood training.
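One common way to model output dependencies while keeping maximum-likelihood training, sketched below with illustrative names (the paper's parameterization may differ): predict a mean and a low-rank covariance factor for the target set, then score the targets under the resulting joint Gaussian.

```python
import torch

def correlated_predictive(rep, head, n_targets, jitter=1e-4):
    """Sketch: turn per-target representations into a joint Gaussian over
    all targets, so predictions are correlated across target points.

    rep: (n_targets, dim_rep); head output: (n_targets, 1 + rank).
    """
    out = head(rep)
    mean = out[:, 0]
    factor = out[:, 1:]                                   # (n_targets, rank)
    cov = factor @ factor.T + jitter * torch.eye(n_targets)
    return torch.distributions.MultivariateNormal(mean, covariance_matrix=cov)

# Maximum-likelihood training signal: joint log-density of target outputs.
rep = torch.randn(20, 64)
head = torch.nn.Linear(64, 1 + 16)                        # mean + rank-16 factor
y_target = torch.randn(20)
dist = correlated_predictive(rep, head, n_targets=20)
loss = -dist.log_prob(y_target)                           # scalar; backprop as usual
```

Unlike a CNP's factorized predictive, samples from this distribution are coherent functions across the target set, which is what the joint log-probability rewards.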
arXiv Detail & Related papers (2021-08-22T09:31:50Z) - Collaborative Nonstationary Multivariate Gaussian Process Model [2.362467745272567]
We propose a novel model called the collaborative nonstationary multivariate Gaussian process model (CNMGP).
CNMGP allows us to model data in which outputs do not share a common input set, with a computational complexity independent of the size of the inputs and outputs.
We show that our model generally provides better predictive performance than the state-of-the-art, and also provides estimates of time-varying correlations that differ across outputs.
arXiv Detail & Related papers (2021-06-01T18:25:22Z) - Meta-Learning Stationary Stochastic Process Prediction with
Convolutional Neural Processes [32.02612871707347]
We propose ConvNP, which endows Neural Processes (NPs) with translation equivariance and extends convolutional conditional NPs to allow for dependencies in the predictive distribution.
We demonstrate the strong performance and generalization capabilities of ConvNPs on 1D regression, image completion, and various tasks with real-world spatio-temporal data.
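The translation-equivariance mechanism can be sketched with the SetConv encoding used by convolutional (C)NPs: smear the context set onto a uniform grid with an RBF, so a shift of the inputs becomes a shift of the grid signal, which a CNN then processes equivariantly. Grid size and length scale below are illustrative assumptions.

```python
import torch
import torch.nn as nn

def set_conv_encode(xc, yc, grid, length_scale=0.1):
    """Sketch of a SetConv encoder: map an off-grid context set to
    density and data channels on a uniform grid via an RBF.

    xc: (N,), yc: (N,), grid: (G,) -> returns (2, G).
    """
    w = torch.exp(-0.5 * ((grid[None, :] - xc[:, None]) / length_scale) ** 2)
    density = w.sum(dim=0)                     # context mass near each grid cell
    signal = (w * yc[:, None]).sum(dim=0)
    signal = signal / density.clamp(min=1e-8)  # normalized data channel
    return torch.stack([density, signal])

# A CNN over the grid is translation-equivariant by construction.
grid = torch.linspace(-3, 3, 256)
xc, yc = torch.rand(12) * 4 - 2, torch.randn(12)
enc = set_conv_encode(xc, yc, grid)            # (2, 256)
cnn = nn.Conv1d(2, 8, kernel_size=5, padding=2)
features = cnn(enc[None])                      # (1, 8, 256)
```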
arXiv Detail & Related papers (2020-07-02T18:25:27Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the noise in its success is still unclear.
We show that heavy tails commonly arise in the optimized parameters due to multiplicative noise.
A detailed analysis examines key factors, including step size and data, and finds similar heavy-tailed behavior on state-of-the-art neural network models.
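A toy simulation (not the paper's analysis; all constants are illustrative) shows the mechanism: a stochastic recursion with multiplicative noise, x_{t+1} = a_t * x_t + b_t, develops heavy-tailed stationary behavior even though a_t and b_t are Gaussian, provided E[log|a_t|] < 0 while |a_t| > 1 occurs with positive probability.

```python
import numpy as np

# Simulate many independent chains of x_{t+1} = a_t * x_t + b_t.
rng = np.random.default_rng(0)
n_chains, n_steps = 100_000, 2_000
x = np.zeros(n_chains)
for _ in range(n_steps):
    a = 1.0 + 0.5 * rng.standard_normal(n_chains)   # multiplicative noise
    b = 0.1 * rng.standard_normal(n_chains)         # additive noise
    x = a * x + b

# Heavy-tail diagnostic: for a polynomial tail P(|x| > u) ~ u^(-alpha),
# quantile thresholds grow geometrically across decades of tail probability,
# whereas Gaussian quantiles grow far more slowly.
print(np.quantile(np.abs(x), [0.9, 0.99, 0.999, 0.9999]))
```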
arXiv Detail & Related papers (2020-06-11T09:58:01Z)