Message Passing Neural Processes
- URL: http://arxiv.org/abs/2009.13895v1
- Date: Tue, 29 Sep 2020 09:40:09 GMT
- Title: Message Passing Neural Processes
- Authors: Ben Day, Cătălina Cangea, Arian R. Jamasb, Pietro Liò
- Abstract summary: We introduce Message Passing Neural Processes (MPNPs), which explicitly make use of relational structure within the model.
MPNPs thrive at lower sampling rates on existing benchmarks and on newly proposed CA and Cora-Branched tasks.
We report strong generalisation over density-based CA rulesets and significant gains in challenging arbitrary-labelling and few-shot learning setups.
- Score: 3.0969191504482247
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Processes (NPs) are powerful and flexible models able to incorporate
uncertainty when representing stochastic processes, while maintaining a linear
time complexity. However, NPs produce a latent description by aggregating
independent representations of context points and lack the ability to exploit
relational information present in many datasets. This renders NPs ineffective
in settings where the stochastic process is primarily governed by neighbourhood
rules, such as cellular automata (CA), and limits performance for any task
where relational information remains unused. We address this shortcoming by
introducing Message Passing Neural Processes (MPNPs), the first class of NPs
that explicitly makes use of relational structure within the model. Our
evaluation shows that MPNPs thrive at lower sampling rates, on existing
benchmarks and newly-proposed CA and Cora-Branched tasks. We further report
strong generalisation over density-based CA rule-sets and significant gains in
challenging arbitrary-labelling and few-shot learning setups.
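
The abstract states the core architectural change only in prose: a standard NP encodes each context point independently and averages, whereas an MPNP lets node representations exchange messages over the observed graph before they are aggregated into the latent description. The following is a minimal sketch of that idea in PyTorch, written under our own assumptions; the module names (`MessagePassingEncoder`, `MPNPSketch`), layer sizes, GRU-style update, and mean-pooled Gaussian latent are illustrative choices, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a message-passing Neural Process:
# encode nodes, pass messages over the graph, aggregate the context nodes into
# a latent, and decode predictions for all nodes conditioned on that latent.
import torch
import torch.nn as nn

class MessagePassingEncoder(nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.embed = nn.Linear(in_dim, hid_dim)
        self.msg = nn.Linear(2 * hid_dim, hid_dim)
        self.update = nn.GRUCell(hid_dim, hid_dim)

    def forward(self, x, edge_index, steps=2):
        # x: [num_nodes, in_dim]; edge_index: [2, num_edges] as (src, dst) pairs
        h = torch.relu(self.embed(x))
        src, dst = edge_index
        for _ in range(steps):
            # messages along edges, summed at destination nodes
            m = torch.relu(self.msg(torch.cat([h[src], h[dst]], dim=-1)))
            agg = torch.zeros_like(h).index_add_(0, dst, m)
            h = self.update(agg, h)
        return h

class MPNPSketch(nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.encoder = MessagePassingEncoder(in_dim, hid_dim)
        self.to_latent = nn.Linear(hid_dim, 2 * hid_dim)  # mean and log-variance
        self.decoder = nn.Sequential(
            nn.Linear(2 * hid_dim, hid_dim), nn.ReLU(),
            nn.Linear(hid_dim, out_dim),
        )

    def forward(self, x, edge_index, context_mask):
        h = self.encoder(x, edge_index)
        # aggregate context-node representations into a global latent, as in NPs
        r = h[context_mask].mean(dim=0)
        mu, logvar = self.to_latent(r).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterised sample
        z = z.expand(h.size(0), -1)
        return self.decoder(torch.cat([h, z], dim=-1))            # predictions for every node

# Toy usage: 4 nodes on a path graph, 2 of them observed as context.
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
context_mask = torch.tensor([True, False, True, False])
model = MPNPSketch(in_dim=3, hid_dim=16, out_dim=1)
print(model(x, edge_index, context_mask).shape)  # torch.Size([4, 1])
```

The toy call only checks shapes; training such a model would add a likelihood term on target nodes and a KL term on the latent, as in standard NP objectives.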
Related papers
- In-Context In-Context Learning with Transformer Neural Processes [50.57807892496024]
We develop the in-context in-context learning pseudo-token TNP (ICICL-TNP).
The ICICL-TNP is capable of conditioning on both sets of datapoints and sets of datasets, enabling it to perform in-context in-context learning.
We demonstrate the importance of in-context in-context learning and the effectiveness of the ICICL-TNP in a number of experiments.
arXiv Detail & Related papers (2024-06-19T12:26:36Z)
- Switchable Decision: Dynamic Neural Generation Networks [98.61113699324429]
We propose a switchable decision to accelerate inference by dynamically assigning resources for each data instance.
Our method benefits from less cost during inference while keeping the same accuracy.
arXiv Detail & Related papers (2024-05-07T17:44:54Z)
- Spectral Convolutional Conditional Neural Processes [4.52069311861025]
Conditional Neural Processes (CNPs) constitute a family of probabilistic models that harness the flexibility of neural networks to parameterize processes.
We propose Spectral Convolutional Conditional Neural Processes (SConvCNPs), a new addition to the NPs family that allows for more efficient representation of functions in the frequency domain.
arXiv Detail & Related papers (2024-04-19T21:13:18Z)
- Deep Stochastic Processes via Functional Markov Transition Operators [59.55961312230447]
We introduce a new class of Stochastic Processes (SPs) constructed by stacking sequences of neural parameterised Markov transition operators in function space.
We prove that these Markov transition operators can preserve the exchangeability and consistency of SPs.
arXiv Detail & Related papers (2023-05-24T21:15:23Z)
- Versatile Neural Processes for Learning Implicit Neural Representations [57.090658265140384]
We propose Versatile Neural Processes (VNP), which largely increases the capability of approximating functions.
Specifically, we introduce a bottleneck encoder that produces fewer and informative context tokens, relieving the high computational cost.
We demonstrate the effectiveness of the proposed VNP on a variety of tasks involving 1D, 2D and 3D signals.
arXiv Detail & Related papers (2023-01-21T04:08:46Z)
- Evidential Conditional Neural Processes [7.257751371276488]
Conditional Neural Process (CNP) models offer a promising direction to tackle few-shot problems.
Current CNP models only capture the overall uncertainty for the prediction made on a target data point.
We propose Evidential Conditional Neural Processes (ECNP), which replace the standard Gaussian distribution used by CNP.
arXiv Detail & Related papers (2022-11-30T21:50:55Z)
- Latent Bottlenecked Attentive Neural Processes [71.18817592128207]
We present Latent Bottlenecked Attentive Neural Processes (LBANPs).
LBANPs have a querying computational complexity independent of the number of context datapoints.
We show LBANPs achieve results competitive with the state-of-the-art on meta-regression, image completion, and contextual multi-armed bandits.
arXiv Detail & Related papers (2022-11-15T19:21:41Z)
- Neural Processes with Stochastic Attention: Paying more attention to the context dataset [11.301294319986477]
Neural processes (NPs) aim to complete unseen data points based on a given context dataset.
We propose a stochastic attention mechanism for NPs to capture appropriate context information.
We empirically show that our approach substantially outperforms conventional NPs in various domains.
arXiv Detail & Related papers (2022-04-11T23:57:19Z)
- Bootstrapping Neural Processes [114.97111530885093]
Neural Processes (NPs) implicitly define a broad class of processes with neural networks.
NPs still rely on an assumption that uncertainty in processes is modeled by a single latent variable.
We propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap.
arXiv Detail & Related papers (2020-08-07T02:23:34Z)