Deep Stochastic Processes via Functional Markov Transition Operators
- URL: http://arxiv.org/abs/2305.15574v1
- Date: Wed, 24 May 2023 21:15:23 GMT
- Title: Deep Stochastic Processes via Functional Markov Transition Operators
- Authors: Jin Xu, Emilien Dupont, Kaspar Märtens, Tom Rainforth, Yee Whye Teh
- Abstract summary: We introduce a new class of Stochastic Processes (SPs) constructed by stacking sequences of neural parameterised Markov transition operators in function space.
We prove that these Markov transition operators can preserve the exchangeability and consistency of SPs.
- Score: 59.55961312230447
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Markov Neural Processes (MNPs), a new class of Stochastic
Processes (SPs) which are constructed by stacking sequences of neural
parameterised Markov transition operators in function space. We prove that
these Markov transition operators can preserve the exchangeability and
consistency of SPs. Therefore, the proposed iterative construction adds
substantial flexibility and expressivity to the original framework of Neural
Processes (NPs) without compromising consistency or adding restrictions. Our
experiments demonstrate clear advantages of MNPs over baseline models on a
variety of tasks.
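To make the construction concrete, here is a minimal, hypothetical PyTorch sketch of the core idea: pushing a set of function evaluations through a stack of neurally parameterised stochastic transition operators. All names, dimensions, and architectural details are illustrative assumptions, not the authors' implementation; exchangeability is preserved here by using only permutation-equivariant updates (shared pointwise networks plus a mean-pooled summary).

```python
# Hypothetical sketch (not the authors' code): stack K stochastic transition
# operators over a set of function evaluations {(x_i, y_i)}. Every operation
# treats the index set symmetrically, so permuting the input pairs permutes
# the output in the same way.
import torch
import torch.nn as nn

class TransitionOperator(nn.Module):
    """One neurally parameterised stochastic transition acting on function values."""
    def __init__(self, dim_x=1, dim_y=1, dim_h=64):
        super().__init__()
        self.point = nn.Sequential(nn.Linear(dim_x + dim_y, dim_h), nn.ReLU(),
                                   nn.Linear(dim_h, dim_h))
        self.update = nn.Sequential(nn.Linear(2 * dim_h, dim_h), nn.ReLU(),
                                    nn.Linear(dim_h, 2 * dim_y))  # mean and log-scale

    def forward(self, x, y):
        h = self.point(torch.cat([x, y], dim=-1))       # per-point features
        g = h.mean(dim=-2, keepdim=True).expand_as(h)   # permutation-invariant summary
        mu, log_s = self.update(torch.cat([h, g], dim=-1)).chunk(2, dim=-1)
        return mu + log_s.exp() * torch.randn_like(mu)  # stochastic map y -> y'

class DeepSP(nn.Module):
    """Stack of transition operators, iterated as a Markov chain in function space."""
    def __init__(self, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList([TransitionOperator() for _ in range(n_layers)])

    def forward(self, x, y):
        for layer in self.layers:
            y = layer(x, y)
        return y

x = torch.randn(8, 16, 1)               # a batch of 16-point input sets
y = DeepSP()(x, torch.randn(8, 16, 1))  # push initial function values through the chain
```

Because each layer is permutation equivariant, the stacked chain keeps the marginals of an exchangeable process exchangeable, which is the property the paper's consistency results formalise in function space.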
Related papers
- Functional Stochastic Gradient MCMC for Bayesian Neural Networks [15.766590837199427]
We introduce novel functional MCMC schemes, including stochastic gradient versions, that can incorporate more informative priors.
Our schemes demonstrate improved performance in both predictive accuracy and uncertainty quantification on several tasks; a generic SGLD step is sketched after this entry.
arXiv Detail & Related papers (2024-09-25T05:23:01Z)
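The functional priors are the paper's contribution, but the stochastic-gradient MCMC machinery that such schemes build on is standard. As a point of reference, here is a minimal stochastic gradient Langevin dynamics (SGLD) step; this is a generic baseline sketch, not the paper's functional scheme, and all names are illustrative.

```python
# Generic SGLD step for sampling BNN weights; an illustrative baseline,
# not the functional scheme proposed in the paper above.
import torch

def sgld_step(params, minibatch_log_lik, log_prior, lr, n_data, batch_size):
    """One SGLD update: noisy gradient descent on the estimated negative log-posterior."""
    # Unbiased minibatch estimate of the negative log-posterior.
    loss = -(n_data / batch_size) * minibatch_log_lik(params) - log_prior(params)
    grad, = torch.autograd.grad(loss, params)
    noise = torch.randn_like(params) * (2.0 * lr) ** 0.5  # injected Gaussian noise
    return (params - lr * grad + noise).detach().requires_grad_(True)
```

Iterating this step with a decaying step size yields approximate samples from the weight posterior.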
- PseudoNeg-MAE: Self-Supervised Point Cloud Learning using Conditional Pseudo-Negative Embeddings [55.55445978692678]
PseudoNeg-MAE is a self-supervised learning framework that enhances the global feature representations of point cloud masked autoencoders.
We show that PseudoNeg-MAE achieves state-of-the-art performance on the ModelNet40 and ScanObjectNN datasets.
arXiv Detail & Related papers (2024-09-24T07:57:21Z)
- Spectral Convolutional Conditional Neural Processes [4.52069311861025]
Conditional Neural Processes (CNPs) constitute a family of probabilistic models that harness the flexibility of neural networks to parameterize stochastic processes.
We propose Spectral Convolutional Conditional Neural Processes (SConvCNPs), a new addition to the NP family that allows for a more efficient representation of functions in the frequency domain; a schematic spectral layer is sketched after this entry.
arXiv Detail & Related papers (2024-04-19T21:13:18Z)
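For intuition about frequency-domain representations of functions, here is a minimal FNO-style spectral convolution layer in PyTorch: it multiplies learned complex weights against the low Fourier modes of a gridded functional representation. This is a sketch under our own assumptions, not the SConvCNP architecture.

```python
# Illustrative spectral convolution (an assumption, not the SConvCNP code):
# filter a gridded functional representation by pointwise multiplication of
# learned complex weights with its low Fourier modes.
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, channels=8, n_modes=16):
        super().__init__()
        self.n_modes = n_modes
        self.weight = nn.Parameter(
            torch.randn(channels, n_modes, dtype=torch.cfloat) / channels)

    def forward(self, u):                      # u: (batch, channels, grid)
        u_hat = torch.fft.rfft(u, dim=-1)      # frequency-domain representation
        out = torch.zeros_like(u_hat)
        out[..., :self.n_modes] = u_hat[..., :self.n_modes] * self.weight
        return torch.fft.irfft(out, n=u.shape[-1], dim=-1)

u = torch.randn(4, 8, 64)    # (batch, channels, grid points)
v = SpectralConv1d()(u)      # same shape, filtered in frequency space
```

Truncating to the lowest modes is what makes the frequency-domain parameterisation compact: the number of weights is independent of the grid resolution.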
- Learning minimal representations of stochastic processes with variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a stochastic process.
Our approach enables the autonomous discovery of unknown parameters describing stochastic processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z)
- Neural Functional Transformers [99.98750156515437]
This paper uses the attention mechanism to define a novel set of permutation-equivariant weight-space layers called neural functional Transformers (NFTs).
NFTs respect weight-space permutation symmetries while incorporating the advantages of attention, which has exhibited remarkable success across multiple domains.
We also leverage NFTs to develop Inr2Array, a novel method for computing permutation-invariant representations from the weights of implicit neural representations (INRs).
arXiv Detail & Related papers (2023-05-22T23:38:27Z)
- GFlowNet-EM for learning compositional latent variable models [115.96660869630227]
A key tradeoff in modeling the posteriors over latents is between expressivity and tractable optimization.
We propose the use of GFlowNets, algorithms for sampling from an unnormalized density.
By training GFlowNets to sample from the posterior over latents, we take advantage of their strengths as amortized variational algorithms.
arXiv Detail & Related papers (2023-02-13T18:24:21Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
There is currently a promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes (GPs).
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Transformer Neural Processes: Uncertainty-Aware Meta Learning Via Sequence Modeling [26.377099481072992]
We propose Transformer Neural Processes (TNPs) for uncertainty-aware meta learning.
We learn TNPs via an autoregressive likelihood-based objective and instantiate them with a novel transformer-based architecture.
We show that TNPs achieve state-of-the-art performance on various benchmark problems; a miniature TNP-style model is sketched after this entry.
arXiv Detail & Related papers (2022-07-09T02:28:58Z)
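To illustrate the sequence-modeling view, here is a hypothetical miniature of a TNP-style model: context (x, y) pairs and query inputs are embedded as tokens, processed by a transformer encoder, and read out as Gaussian predictive parameters. Sizes, names, and the omitted autoregressive masking are illustrative assumptions, not the paper's architecture.

```python
# Hypothetical TNP-style miniature (illustrative; the real model uses an
# autoregressive mask over targets, which this sketch omits).
import torch
import torch.nn as nn

class TinyTNP(nn.Module):
    def __init__(self, dim_x=1, dim_y=1, d_model=64):
        super().__init__()
        self.embed_ctx = nn.Linear(dim_x + dim_y, d_model)   # (x, y) context tokens
        self.embed_tgt = nn.Linear(dim_x, d_model)           # x-only target tokens
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, 2 * dim_y)            # predictive mean, log-scale

    def forward(self, xc, yc, xt):
        tokens = torch.cat([self.embed_ctx(torch.cat([xc, yc], -1)),
                            self.embed_tgt(xt)], dim=1)
        h = self.encoder(tokens)[:, xc.shape[1]:]            # keep target positions
        mean, log_scale = self.head(h).chunk(2, -1)
        return mean, log_scale.exp()

xc, yc, xt = torch.randn(2, 10, 1), torch.randn(2, 10, 1), torch.randn(2, 5, 1)
mean, scale = TinyTNP()(xc, yc, xt)   # Gaussian prediction at 5 query inputs
```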
- Neural Diffusion Processes [12.744250155946503]
We propose Neural Diffusion Processes (NDPs), a novel approach that learns to sample from a rich distribution over functions through its finite marginals.
We empirically show that NDPs can capture functional distributions close to the true Bayesian posterior.
NDPs enable a variety of downstream tasks, including regression, implicit hyperparameter marginalisation, non-Gaussian posterior prediction and global optimisation; a generic forward-noising step is sketched after this entry.
arXiv Detail & Related papers (2022-06-08T16:13:04Z)
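For a flavour of the mechanics, the following sketches a standard DDPM-style forward noising step applied to a finite marginal, i.e. the function values at a fixed set of inputs. The noise schedule and closed form are the generic diffusion ones; nothing here is specific to the NDP architecture.

```python
# Illustrative forward-diffusion step on a finite marginal of a function:
# noise the values y0 at fixed inputs toward a standard Gaussian
# (generic DDPM closed form, not the NDP specifics).
import torch

def q_sample(y0, t, betas):
    """Sample y_t ~ q(y_t | y_0) for function values y0 at fixed inputs."""
    alpha_bar = torch.cumprod(1.0 - betas, dim=0)[t]
    noise = torch.randn_like(y0)
    return alpha_bar.sqrt() * y0 + (1.0 - alpha_bar).sqrt() * noise

betas = torch.linspace(1e-4, 0.02, 1000)   # standard linear noise schedule
y0 = torch.randn(16, 1)                    # function values at 16 inputs
yt = q_sample(y0, t=500, betas=betas)      # noised marginal at step t=500
```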
- Message Passing Neural Processes [3.0969191504482247]
We introduce Message Passing Neural Processes (MPNPs), which explicitly make use of relational structure within the model.
MPNPs thrive at lower sampling rates on existing benchmarks and on the newly proposed CA and Cora-Branched tasks.
We report strong generalisation over density-based CA rulesets and significant gains in challenging arbitrary-labelling and few-shot learning setups.
arXiv Detail & Related papers (2020-09-29T09:40:09Z)