Multidimensional Uncertainty-Aware Evidential Neural Networks
- URL: http://arxiv.org/abs/2012.13676v1
- Date: Sat, 26 Dec 2020 04:28:56 GMT
- Title: Multidimensional Uncertainty-Aware Evidential Neural Networks
- Authors: Yibo Hu, Yuzhe Ou, Xujiang Zhao, Jin-Hee Cho, Feng Chen
- Abstract summary: We propose a novel uncertainty-aware evidential NN called WGAN-ENN (WENN) for solving an out-of-distribution (OOD) detection problem.
We took a hybrid approach that combines Wasserstein Generative Adversarial Network (WGAN) with ENNs to jointly train a model with prior knowledge of a certain class.
We demonstrated that the estimation of uncertainty by WENN can significantly help distinguish OOD samples from boundary samples.
- Score: 21.716045815385268
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Traditional deep neural networks (NNs) have significantly contributed to the
state-of-the-art performance in the task of classification under various
application domains. However, NNs have not considered inherent uncertainty in
data associated with the class probabilities where misclassification under
uncertainty may easily introduce high risk in decision making in real-world
contexts (e.g., misclassification of objects in roads leads to serious
accidents). Unlike Bayesian NN that indirectly infer uncertainty through weight
uncertainties, evidential NNs (ENNs) have been recently proposed to explicitly
model the uncertainty of class probabilities and use them for classification
tasks. An ENN offers the formulation of the predictions of NNs as subjective
opinions and learns the function by collecting an amount of evidence that can
form the subjective opinions by a deterministic NN from data. However, the ENN
is trained as a black box without explicitly considering inherent uncertainty
in data with their different root causes, such as vacuity (i.e., uncertainty
due to a lack of evidence) or dissonance (i.e., uncertainty due to conflicting
evidence). By considering the multidimensional uncertainty, we proposed a novel
uncertainty-aware evidential NN called WGAN-ENN (WENN) for solving an
out-of-distribution (OOD) detection problem. We took a hybrid approach that
combines Wasserstein Generative Adversarial Network (WGAN) with ENNs to jointly
train a model with prior knowledge of a certain class, which has high vacuity
for OOD samples. Via extensive empirical experiments based on both synthetic
and real-world datasets, we demonstrated that the estimation of uncertainty by
WENN can significantly help distinguish OOD samples from boundary samples. WENN
outperformed other competitive counterparts in OOD detection.
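The two root causes of uncertainty named in the abstract, vacuity and dissonance, are standard measures from subjective logic over a Dirichlet opinion. As a rough illustration (not the authors' code), assuming the common mapping from collected evidence e to Dirichlet parameters alpha = e + 1, a minimal sketch of both measures:

```python
import numpy as np

def vacuity_dissonance(evidence):
    """Vacuity and dissonance of a Dirichlet opinion (subjective logic).

    evidence: non-negative evidence per class, as an ENN would output.
    Assumes the usual mapping alpha = evidence + 1 (hypothetical sketch,
    not the WENN implementation).
    """
    evidence = np.asarray(evidence, dtype=float)
    K = evidence.size
    alpha = evidence + 1.0           # Dirichlet parameters
    S = alpha.sum()                  # Dirichlet strength
    belief = evidence / S            # belief mass per class
    vacuity = K / S                  # uncertainty from lack of evidence

    # Dissonance: belief spread across classes with mutually balancing
    # (i.e., similar-magnitude, conflicting) belief masses.
    diss = 0.0
    for k in range(K):
        others = np.delete(belief, k)
        denom = others.sum()
        if denom > 0:
            bal = 1.0 - np.abs(others - belief[k]) / (others + belief[k] + 1e-12)
            diss += belief[k] * (others * bal).sum() / denom
    return vacuity, diss
```

With no evidence at all, vacuity is maximal (1.0) and dissonance is zero; with strong but evenly conflicting evidence, vacuity is low and dissonance is high, which is exactly the distinction the abstract draws between OOD samples (high vacuity) and boundary samples (high dissonance).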
Related papers
- Uncertainty in Graph Neural Networks: A Survey [50.63474656037679]
Graph Neural Networks (GNNs) have been extensively used in various real-world applications.
However, the predictive uncertainty of GNNs stemming from diverse sources can lead to unstable and erroneous predictions.
This survey aims to provide a comprehensive overview of the GNNs from the perspective of uncertainty.
arXiv Detail & Related papers (2024-03-11T21:54:52Z) - Uncertainty in Natural Language Processing: Sources, Quantification, and
Applications [56.130945359053776]
We provide a comprehensive review of uncertainty-relevant works in the NLP field.
We first categorize the sources of uncertainty in natural language into three types, including input, system, and output.
We discuss the challenges of uncertainty estimation in NLP and outline potential future directions.
arXiv Detail & Related papers (2023-06-05T06:46:53Z) - Single-model uncertainty quantification in neural network potentials
does not consistently outperform model ensembles [0.7499722271664145]
Neural networks (NNs) often assign high confidence to their predictions, even for points far out-of-distribution.
Uncertainty quantification (UQ) is a challenge when they are employed to model interatomic potentials in materials systems.
Differentiable UQ techniques can find new informative data and drive active learning loops for robust potentials.
arXiv Detail & Related papers (2023-05-02T19:41:17Z) - Uncertainty Propagation in Node Classification [9.03984964980373]
We focus on measuring uncertainty of graph neural networks (GNNs) for the task of node classification.
We propose a Bayesian uncertainty propagation (BUP) method, which embeds GNNs in a Bayesian modeling framework.
We present an uncertainty-oriented loss for node classification that allows the GNNs to clearly integrate predictive uncertainty into the learning procedure.
arXiv Detail & Related papers (2023-04-03T12:18:23Z) - Uncertainty Estimation by Fisher Information-based Evidential Deep
Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce Fisher Information Matrix (FIM) to measure the informativeness of evidence carried by each sample, according to which we can dynamically reweight the objective loss terms to make the network more focused on the representation learning of uncertain classes.
arXiv Detail & Related papers (2023-03-03T16:12:59Z) - Evaluating Point-Prediction Uncertainties in Neural Networks for Drug
Discovery [0.26385121748044166]
Neural Network (NN) models provide potential to speed up the drug discovery process and reduce its failure rates.
The success of NN models requires uncertainty quantification (UQ), as drug discovery explores chemical space beyond the training data distribution.
In this paper, we examine UQ methods that estimate different sources of predictive uncertainty for NN models aiming at drug discovery.
arXiv Detail & Related papers (2022-10-31T03:45:11Z) - The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
arXiv Detail & Related papers (2022-05-20T10:10:32Z) - Uncertainty-Aware Reliable Text Classification [21.517852608625127]
Deep neural networks have significantly contributed to the success in predictive accuracy for classification tasks.
They tend to make over-confident predictions in real-world settings, where domain shift and out-of-distribution examples exist.
We propose an inexpensive framework that adopts both auxiliary outliers and pseudo off-manifold samples to train the model with prior knowledge of a certain class.
arXiv Detail & Related papers (2021-07-15T04:39:55Z) - The Hidden Uncertainty in a Neural Network's Activations [105.4223982696279]
The distribution of a neural network's latent representations has been successfully used to detect out-of-distribution (OOD) data.
This work investigates whether this distribution correlates with a model's epistemic uncertainty, thus indicating its ability to generalise to novel inputs.
arXiv Detail & Related papers (2020-12-05T17:30:35Z) - Attribute-Guided Adversarial Training for Robustness to Natural
Perturbations [64.35805267250682]
We propose an adversarial training approach which learns to generate new samples so as to maximize exposure of the classifier to the attributes-space.
Our approach enables deep neural networks to be robust against a wide range of naturally occurring perturbations.
arXiv Detail & Related papers (2020-12-03T10:17:30Z) - Uncertainty Aware Semi-Supervised Learning on Graph Data [18.695343563823798]
We propose a multi-source uncertainty framework using a graph neural network (GNN) for node classification predictions.
By collecting evidence from the labels of training nodes, the Graph-based Kernel Dirichlet distribution Estimation (GKDE) method is designed for accurately predicting node-level Dirichlet distributions.
We found that dissonance-based detection yielded the best results on misclassification detection while vacuity-based detection was the best for OOD detection.
arXiv Detail & Related papers (2020-10-24T04:56:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.