The Neural Process Family: Survey, Applications and Perspectives
- URL: http://arxiv.org/abs/2209.00517v3
- Date: Mon, 2 Oct 2023 23:43:33 GMT
- Title: The Neural Process Family: Survey, Applications and Perspectives
- Authors: Saurav Jha, Dong Gong, Xuesong Wang, Richard E. Turner, Lina Yao
- Abstract summary: The Neural Processes Family (NPF) intends to offer the best of both worlds by leveraging neural networks for meta-learning predictive uncertainties.
We shed light on their potential to bring several recent advances in other deep learning domains under one umbrella.
- Score: 40.800706337651526
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The standard approaches to neural network implementation yield powerful
function approximation capabilities but are limited in their ability to learn
meta-representations and to reason about probabilistic uncertainties in their
predictions. Gaussian processes, on the other hand, adopt the Bayesian learning
scheme to estimate such uncertainties but are constrained by their efficiency
and approximation capacity. The Neural Processes Family (NPF) intends to offer
the best of both worlds by leveraging neural networks for meta-learning
predictive uncertainties. Such potential has brought substantial research
activity to the family in recent years. Therefore, a comprehensive survey of
NPF models is needed to organize and relate their motivation, methodology, and
experiments. This paper intends to address this gap while digging deeper into
the formulation, research themes, and applications concerning the family
members. We shed light on their potential to bring several recent advances in
other deep learning domains under one umbrella. We then provide a rigorous
taxonomy of the family and empirically demonstrate their capabilities for
modeling data-generating functions operating on 1-d, 2-d, and 3-d input
domains. We conclude by discussing our perspectives on the promising directions
that can fuel the research advances in the field. Code for our experiments will
be made available at https://github.com/srvCodes/neural-processes-survey.
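The NPF idea summarized in the abstract — encode an observed context set into a permutation-invariant representation, then decode a predictive mean and uncertainty for new inputs — can be illustrated with a minimal Conditional Neural Process-style sketch. This is an untrained toy with random placeholder weights, not the authors' implementation; in practice the encoder and decoder are trained across many sampled functions (tasks) so the predicted variance becomes calibrated.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    """Two-layer MLP with a tanh hidden activation."""
    return np.tanh(x @ w1 + b1) @ w2 + b2

# Encoder: each (x, y) context pair -> representation r_i (dim 8).
enc_w1, enc_b1 = rng.normal(size=(2, 16)), np.zeros(16)
enc_w2, enc_b2 = rng.normal(size=(16, 8)), np.zeros(8)

# Decoder: [r, x_target] -> (mu, log_sigma).
dec_w1, dec_b1 = rng.normal(size=(9, 16)), np.zeros(16)
dec_w2, dec_b2 = rng.normal(size=(16, 2)), np.zeros(2)

def cnp_predict(x_ctx, y_ctx, x_tgt):
    # 1) Encode every context (x, y) pair independently.
    pairs = np.stack([x_ctx, y_ctx], axis=-1)           # (N_ctx, 2)
    r_i = mlp(pairs, enc_w1, enc_b1, enc_w2, enc_b2)    # (N_ctx, 8)
    # 2) Mean-aggregate: the representation is invariant to
    #    the ordering of the context set.
    r = r_i.mean(axis=0)                                # (8,)
    # 3) Decode each target input conditioned on r.
    inp = np.concatenate([np.tile(r, (len(x_tgt), 1)),
                          x_tgt[:, None]], axis=-1)     # (N_tgt, 9)
    out = mlp(inp, dec_w1, dec_b1, dec_w2, dec_b2)      # (N_tgt, 2)
    mu, log_sigma = out[:, 0], out[:, 1]
    return mu, np.exp(log_sigma)                        # sigma > 0 by construction

x_ctx = np.linspace(-1.0, 1.0, 5)
y_ctx = np.sin(np.pi * x_ctx)
mu, sigma = cnp_predict(x_ctx, y_ctx, np.array([-0.5, 0.0, 0.5]))
print(mu.shape, sigma.shape)  # (3,) (3,)
```

The mean aggregation is what distinguishes the family from a plain regressor: predictions condition on an entire context set of arbitrary size, and the per-target standard deviation is the meta-learned predictive uncertainty the survey focuses on.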
Related papers
- Machine Learning for Identifying Potential Participants in Uruguayan Social Programs [0.0]
This research project explores the optimization of the family selection process for participation in Uruguay's Crece Contigo Family Support Program (PAF) through machine learning.
An anonymized database of 15,436 previous referral cases was analyzed, focusing on pregnant women and children under four years of age.
The main objective was to develop a predictive algorithm capable of determining whether a family meets the conditions for acceptance into the program.
arXiv Detail & Related papers (2025-03-31T15:30:36Z)
- Deep Learning Through A Telescoping Lens: A Simple Model Provides Empirical Insights On Grokking, Gradient Boosting & Beyond [61.18736646013446]
In pursuit of a deeper understanding of its surprising behaviors, we investigate the utility of a simple yet accurate model of a trained neural network.
Across three case studies, we illustrate how it can be applied to derive new empirical insights on a diverse range of prominent phenomena.
arXiv Detail & Related papers (2024-10-31T22:54:34Z)
- Deep Learning and genetic algorithms for cosmological Bayesian inference speed-up [0.0]
We present a novel approach to accelerate the Bayesian inference process, focusing specifically on the nested sampling algorithms.
Our proposed method utilizes the power of deep learning, employing feedforward neural networks to approximate the likelihood function dynamically during the Bayesian inference process.
The implementation integrates with nested sampling algorithms and has been thoroughly evaluated using both simple cosmological dark energy models and diverse observational datasets.
arXiv Detail & Related papers (2024-05-06T09:14:58Z)
- Amortised Inference in Bayesian Neural Networks [0.0]
We introduce the Amortised Pseudo-Observation Variational Inference Bayesian Neural Network (APOVI-BNN).
We show that the amortised inference is of similar or better quality than that obtained through traditional variational inference.
We then discuss how the APOVI-BNN may be viewed as a new member of the neural process family.
arXiv Detail & Related papers (2023-09-06T14:02:33Z)
- Uncertainty in Natural Language Processing: Sources, Quantification, and Applications [56.130945359053776]
We provide a comprehensive review of uncertainty-relevant works in the NLP field.
We first categorize the sources of uncertainty in natural language into three types: input, system, and output.
We discuss the challenges of uncertainty estimation in NLP and outline potential future directions.
arXiv Detail & Related papers (2023-06-05T06:46:53Z)
- Reliable extrapolation of deep neural operators informed by physics or sparse observations [2.887258133992338]
Deep neural operators can learn nonlinear mappings between infinite-dimensional function spaces via deep neural networks.
DeepONets provide a new simulation paradigm in science and engineering.
We propose five reliable learning methods that guarantee a safe prediction under extrapolation.
arXiv Detail & Related papers (2022-12-13T03:02:46Z)
- MARS: Meta-Learning as Score Matching in the Function Space [79.73213540203389]
We present a novel approach to extracting inductive biases from a set of related datasets.
We use functional Bayesian neural network inference, which views the prior as a process and performs inference in the function space.
Our approach can seamlessly acquire and represent complex prior knowledge by meta-learning the score function of the data-generating process.
arXiv Detail & Related papers (2022-10-24T15:14:26Z)
- Rethinking Bayesian Learning for Data Analysis: The Art of Prior and Inference in Sparsity-Aware Modeling [20.296566563098057]
Sparse modeling for signal processing and machine learning has been a focus of scientific research for over two decades.
This article reviews some recent advances in incorporating sparsity-promoting priors into three popular data modeling tools.
arXiv Detail & Related papers (2022-05-28T00:43:52Z)
- Approximating Attributed Incentive Salience In Large Scale Scenarios. A Representation Learning Approach Based on Artificial Neural Networks [5.065947993017158]
We propose a methodology based on artificial neural networks (ANNs) for approximating latent states produced by incentive salience attribution.
We designed an ANN for estimating duration and intensity of future interactions between individuals and a series of video games in a large-scale longitudinal dataset.
arXiv Detail & Related papers (2021-08-03T20:03:21Z)
- Fusing the Old with the New: Learning Relative Camera Pose with Geometry-Guided Uncertainty [91.0564497403256]
We present a novel framework that involves probabilistic fusion between the two families of predictions during network training.
Our network features a self-attention graph neural network, which drives the learning by enforcing strong interactions between different correspondences.
We propose motion parameterizations suitable for learning and show that our method achieves state-of-the-art performance on the challenging DeMoN and ScanNet datasets.
arXiv Detail & Related papers (2021-04-16T17:59:06Z)
- Plausible Counterfactuals: Auditing Deep Learning Classifiers with Realistic Adversarial Examples [84.8370546614042]
The black-box nature of deep learning models has posed unanswered questions about what they learn from data.
A Generative Adversarial Network (GAN) and multi-objective optimization are used to furnish a plausible attack on the audited model.
Its utility is showcased within a human face classification task, unveiling the enormous potential of the proposed framework.
arXiv Detail & Related papers (2020-03-25T11:08:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.