Single Shot MC Dropout Approximation
- URL: http://arxiv.org/abs/2007.03293v1
- Date: Tue, 7 Jul 2020 09:17:17 GMT
- Title: Single Shot MC Dropout Approximation
- Authors: Kai Brach, Beate Sick, Oliver Dürr
- Abstract summary: We present a single shot MC dropout approximation that preserves the advantages of BDNNs without being slower than a DNN.
Our approach is to analytically approximate, for each layer in a fully connected network, the expected value and the variance of the MC dropout signal.
We demonstrate that our single shot MC dropout approximation resembles the point estimate and the uncertainty estimate of the predictive distribution.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks (DNNs) are known for their high prediction performance,
especially in perceptual tasks such as object recognition or autonomous
driving. Still, DNNs are prone to yield unreliable predictions when
encountering completely new situations without indicating their uncertainty.
Bayesian variants of DNNs (BDNNs), such as MC dropout BDNNs, do provide
uncertainty measures. However, BDNNs are slow during test time because they
rely on a sampling approach. Here we present a single shot MC dropout
approximation that preserves the advantages of BDNNs without being slower than
a DNN. Our approach is to analytically approximate for each layer in a fully
connected network the expected value and the variance of the MC dropout signal.
We evaluate our approach on different benchmark datasets and a simulated toy
example. We demonstrate that our single shot MC dropout approximation resembles
the point estimate and the uncertainty estimate of the predictive distribution
that is achieved with an MC approach, while being fast enough for real-time
deployments of BDNNs.
Related papers
- Uncertainty in Graph Neural Networks: A Survey [50.63474656037679]
Graph Neural Networks (GNNs) have been extensively used in various real-world applications.
However, the predictive uncertainty of GNNs stemming from diverse sources can lead to unstable and erroneous predictions.
This survey aims to provide a comprehensive overview of GNNs from the perspective of uncertainty.
arXiv Detail & Related papers (2024-03-11T21:54:52Z) - Single-shot Bayesian approximation for neural networks [0.0]
Deep neural networks (NNs) are known for their high prediction performance.
NNs are prone to yield unreliable predictions when encountering completely new situations without indicating their uncertainty.
We present a single-shot MC dropout approximation that preserves the advantages of BNNs while being as fast as NNs.
arXiv Detail & Related papers (2023-08-24T13:40:36Z) - An Anomaly Detection Method for Satellites Using Monte Carlo Dropout [7.848121055546167]
We present a tractable approximation for BNNs based on the Monte Carlo (MC) dropout method to capture the uncertainty in satellite telemetry time series.
Our proposed time series AD approach outperforms the existing methods from both prediction accuracy and AD perspectives.
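
For contrast with the single-shot approximation above, the standard MC dropout recipe that such methods build on keeps dropout active at test time and averages many stochastic passes. A minimal sketch with illustrative shapes and names, not the paper's code:

```python
import numpy as np

def mc_dropout_predict(x, layers, p=0.2, T=100, seed=0):
    """Plain MC dropout: T stochastic forward passes with dropout kept on."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(T):
        h = x
        for i, (W, b) in enumerate(layers):
            keep = rng.random(h.shape) >= p        # Bernoulli keep mask
            h = W @ (h * keep / (1.0 - p)) + b     # inverted dropout, then dense
            if i < len(layers) - 1:
                h = np.maximum(h, 0.0)             # ReLU on hidden layers
        preds.append(h)
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)   # point estimate, uncertainty
```

An anomaly score can then be derived, e.g., from how far an observation falls outside the predicted mean plus or minus a few predicted standard deviations.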
arXiv Detail & Related papers (2022-11-27T21:12:26Z) - Comparative Analysis of Interval Reachability for Robust Implicit and
Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z) - NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural
Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
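
The Nadaraya-Watson estimate at the core of this approach is simple to state: weight training labels by a kernel on the distance to the query. A hedged sketch, assuming the kernel is applied in some embedding space and using an illustrative Gaussian kernel:

```python
import numpy as np

def nw_label_distribution(z, Z_train, y_train, n_classes, bandwidth=1.0):
    """Nadaraya-Watson estimate of p(y | z): kernel-weighted label frequencies."""
    d2 = ((Z_train - z) ** 2).sum(axis=1)        # squared distances to the query
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))     # Gaussian kernel weights
    probs = np.bincount(y_train, weights=w, minlength=n_classes)
    return probs / probs.sum()                   # its entropy is an uncertainty score
```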
arXiv Detail & Related papers (2022-02-07T12:30:45Z) - Neighborhood Spatial Aggregation MC Dropout for Efficient
Uncertainty-aware Semantic Segmentation in Point Clouds [8.98036662506975]
Uncertainty-aware semantic segmentation of point clouds includes the predictive uncertainty estimation and the uncertainty-guided model optimization.
The widely used MC dropout estimates the predictive distribution by computing the standard deviation over samples from multiple forward passes.
A framework embedded with NSA-MC dropout, a variant of MC dropout, is proposed to establish distributions in just one forward pass.
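
The one-pass trick can be sketched roughly as follows: instead of re-sampling dropout masks, the predictions of spatially neighboring points are aggregated as surrogate samples. This is an illustrative reading of the abstract, not the paper's architecture; the neighbor indices are assumed precomputed:

```python
import numpy as np

def nsa_uncertainty(point_logits, neighbor_idx):
    """One forward pass: mean/std over each point's spatial neighborhood,
    treating neighbors' predictions as surrogate dropout samples."""
    samples = point_logits[neighbor_idx]          # (N, k, C) from (N, C) and (N, k)
    return samples.mean(axis=1), samples.std(axis=1)
```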
arXiv Detail & Related papers (2021-12-05T02:22:32Z) - S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural
Networks via Guided Distribution Calibration [74.5509794733707]
We present a novel guided learning paradigm that uses real-valued networks to distill binary networks on the final prediction distribution.
Our proposed method can boost the simple contrastive learning baseline by an absolute gain of 5.5~15% on BNNs.
Our method achieves substantial improvement over the simple contrastive learning baseline, and is even comparable to many mainstream supervised BNN methods.
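
Distillation on the final prediction distribution is typically implemented as a KL term between temperature-softened teacher and student outputs; a generic sketch of that standard loss, not S2-BNN's exact objective:

```python
import numpy as np

def softmax(logits, t=1.0):
    e = np.exp((logits - logits.max(axis=-1, keepdims=True)) / t)
    return e / e.sum(axis=-1, keepdims=True)

def distill_kl(student_logits, teacher_logits, t=2.0):
    """KL(teacher || student) on temperature-softened prediction distributions."""
    p, q = softmax(teacher_logits, t), softmax(student_logits, t)
    return (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1).mean()
```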
arXiv Detail & Related papers (2021-02-17T18:59:28Z) - Probabilistic Neighbourhood Component Analysis: Sample Efficient
Uncertainty Estimation in Deep Learning [25.8227937350516]
We show that the uncertainty estimation capability of state-of-the-art BNNs and Deep Ensemble models degrades significantly when the amount of training data is small.
We propose a probabilistic generalization of the popular sample-efficient non-parametric kNN approach.
Our approach enables deep kNN to accurately quantify underlying uncertainties in its prediction.
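
A simplified stand-in for the idea, not the paper's PNCA objective: weight the labels of the k nearest neighbors in an embedding space by their kernelized distances, so the resulting class distribution carries an uncertainty signal:

```python
import numpy as np

def soft_knn_predict(z, Z_train, y_train, n_classes, k=20, temperature=1.0):
    """Distance-weighted deep kNN: a class distribution whose entropy
    can serve as an uncertainty score."""
    d = np.linalg.norm(Z_train - z, axis=1)      # distances in embedding space
    nn = np.argsort(d)[:k]                       # k nearest training points
    w = np.exp(-d[nn] / temperature)
    probs = np.bincount(y_train[nn], weights=w, minlength=n_classes)
    return probs / probs.sum()
```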
arXiv Detail & Related papers (2020-07-18T21:36:31Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z) - Frequentist Uncertainty in Recurrent Neural Networks via Blockwise
Influence Functions [121.10450359856242]
Recurrent neural networks (RNNs) are instrumental in modelling sequential and time-series data.
Existing approaches for uncertainty quantification in RNNs are based predominantly on Bayesian methods.
We develop a frequentist alternative that: (a) does not interfere with model training or compromise its accuracy, (b) applies to any RNN architecture, and (c) provides theoretical coverage guarantees on the estimated uncertainty intervals.
arXiv Detail & Related papers (2020-06-20T22:45:32Z) - Interval Neural Networks: Uncertainty Scores [11.74565957328407]
We propose a fast, non-Bayesian method for producing uncertainty scores in the output of pre-trained deep neural networks (DNNs).
This interval neural network (INN) has interval valued parameters and propagates its input using interval arithmetic.
In numerical experiments on an image reconstruction task, we demonstrate the practical utility of INNs as a proxy for the prediction error.
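
Interval propagation through a dense layer has a closed form: split the weights by sign so each output bound uses the matching input bound. A minimal sketch for input intervals only; the paper's INNs additionally put intervals on the parameters themselves:

```python
import numpy as np

def interval_dense(lo, hi, W, b):
    """Exact bounds of y = W x + b over the box lo <= x <= hi."""
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def interval_relu(lo, hi):
    """ReLU is monotone, so it maps the bounds elementwise."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)
```

The output width hi - lo is then usable directly as an uncertainty score.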
arXiv Detail & Related papers (2020-03-25T18:03:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.