Density Regression and Uncertainty Quantification with Bayesian Deep
Noise Neural Networks
- URL: http://arxiv.org/abs/2206.05643v1
- Date: Sun, 12 Jun 2022 02:47:29 GMT
- Title: Density Regression and Uncertainty Quantification with Bayesian Deep
Noise Neural Networks
- Authors: Daiwei Zhang, Tianci Liu, Jian Kang
- Abstract summary: Deep neural network (DNN) models have achieved state-of-the-art predictive accuracy in a wide range of supervised learning applications. However, accurately quantifying the uncertainty in DNN predictions remains a challenging task.
We propose the Bayesian Deep Noise Neural Network (B-DeepNoise), which generalizes standard Bayesian DNNs by extending the random noise variable to all hidden layers.
We evaluate B-DeepNoise against existing methods on benchmark regression datasets, demonstrating its superior performance in terms of prediction accuracy, uncertainty quantification accuracy, and uncertainty quantification efficiency.
- Score: 4.376565880192482
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep neural network (DNN) models have achieved state-of-the-art predictive
accuracy in a wide range of supervised learning applications. However,
accurately quantifying the uncertainty in DNN predictions remains a challenging
task. For continuous outcome variables, an even more difficult problem is to
estimate the predictive density function, which not only provides a natural
quantification of the predictive uncertainty, but also fully captures the
random variation in the outcome. In this work, we propose the Bayesian Deep
Noise Neural Network (B-DeepNoise), which generalizes standard Bayesian DNNs by
extending the random noise variable from the output layer to all hidden layers.
The latent random noise equips B-DeepNoise with the flexibility to approximate
highly complex predictive distributions and accurately quantify predictive
uncertainty. For posterior computation, the unique structure of B-DeepNoise
leads to a closed-form Gibbs sampling algorithm that iteratively simulates from
the posterior full conditional distributions of the model parameters,
circumventing computationally intensive Metropolis-Hastings methods. A
theoretical analysis of B-DeepNoise establishes a recursive representation of
the predictive distribution and decomposes the predictive variance with respect
to the latent parameters. We evaluate B-DeepNoise against existing methods on
benchmark regression datasets, demonstrating its superior performance in terms
of prediction accuracy, uncertainty quantification accuracy, and uncertainty
quantification efficiency. To illustrate our method's usefulness in scientific
studies, we apply B-DeepNoise to predict general intelligence from neuroimaging
features in the Adolescent Brain Cognitive Development (ABCD) project.
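The core idea of B-DeepNoise, injecting random noise at every hidden layer rather than only at the output, can be illustrated with a small sketch. The network sizes, activation, noise scales, and fixed weights below are illustrative stand-ins (not the paper's exact model or its Gibbs-sampled posterior draws); the Monte Carlo loop shows how layer-wise noise induces a flexible predictive distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def deep_noise_forward(x, weights, biases, noise_scales, rng):
    """One stochastic forward pass: Gaussian noise is added at every
    layer, not just at the output (illustrative sketch)."""
    h = x
    last = len(weights) - 1
    for i, (W, b, s) in enumerate(zip(weights, biases, noise_scales)):
        a = h @ W + b
        h = a if i == last else np.tanh(a)   # linear output layer
        h = h + rng.normal(0.0, s, size=b.shape)
    return h

# Toy 1-3-3-1 network with fixed weights (stand-ins for posterior draws).
sizes = [1, 3, 3, 1]
weights = [rng.normal(size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
noise_scales = [0.1, 0.1, 0.3]

# Monte Carlo approximation of the predictive density at a single input:
x = np.array([0.5])
draws = np.array([deep_noise_forward(x, weights, biases, noise_scales, rng)[0]
                  for _ in range(2000)])
print(draws.mean(), draws.std())  # predictive mean and spread
```

Because the noise enters at every layer and passes through subsequent nonlinearities, the resulting predictive distribution need not be Gaussian, which is what gives the model its flexibility.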
Related papers
- Tractable Function-Space Variational Inference in Bayesian Neural
Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Improved uncertainty quantification for neural networks with Bayesian last layer [0.0]
Uncertainty quantification is an important task in machine learning.
We present a reformulation of the log-marginal likelihood of a NN with BLL which allows for efficient training using backpropagation.
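A Bayesian last layer keeps the feature extractor deterministic and treats only the final linear layer probabilistically; with a Gaussian prior and Gaussian noise, the weight posterior and predictive variance are available in closed form. The sketch below uses random stand-in features and assumed hyperparameters, not the paper's specific reformulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for features produced by a trained network's last hidden layer.
Phi = rng.normal(size=(50, 8))          # n=50 samples, d=8 features
w_true = rng.normal(size=8)
y = Phi @ w_true + 0.1 * rng.normal(size=50)

alpha, sigma2 = 1.0, 0.01               # prior precision, noise variance (assumed)

# Closed-form Gaussian posterior of the last-layer weights: N(mu, Sigma).
Sigma = np.linalg.inv(Phi.T @ Phi / sigma2 + alpha * np.eye(8))
mu = Sigma @ Phi.T @ y / sigma2

# Predictive mean and variance at a new feature vector phi_star.
phi_star = rng.normal(size=8)
pred_mean = phi_star @ mu
pred_var = sigma2 + phi_star @ Sigma @ phi_star
print(pred_mean, pred_var)
```

The predictive variance decomposes into irreducible noise (`sigma2`) plus a term that grows when `phi_star` lies far from the training features, which is the uncertainty signal such models provide.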
arXiv Detail & Related papers (2023-02-21T20:23:56Z)
- A Benchmark on Uncertainty Quantification for Deep Learning Prognostics [0.0]
We assess some of the latest developments in uncertainty quantification for deep learning prognostics.
This includes state-of-the-art variational inference algorithms for Bayesian neural networks (BNNs) as well as popular alternatives such as Monte Carlo Dropout (MCD), deep ensembles (DE), and heteroscedastic neural networks (HNNs).
The performance of the methods is evaluated on a subset of the large NASA NCMAPSS dataset for aircraft engines.
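Of the baselines above, Monte Carlo Dropout is the simplest to sketch: dropout is left active at test time, and the spread across repeated stochastic passes serves as the uncertainty estimate. The two-layer network and weights below are illustrative, not a fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-64-1 regression network with fixed random weights (illustrative).
W1, b1 = rng.normal(size=(1, 64)), np.zeros(64)
W2, b2 = rng.normal(size=(64, 1)), np.zeros(1)

def forward_with_dropout(x, p=0.5):
    """One stochastic pass with dropout left ON at test time."""
    h = np.maximum(x @ W1 + b1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p          # drop each unit with prob p
    h = h * mask / (1.0 - p)                # inverted-dropout scaling
    return h @ W2 + b2

x = np.array([[0.3]])
samples = np.array([forward_with_dropout(x)[0, 0] for _ in range(500)])
# Predictive mean and (epistemic) uncertainty from the MC samples:
print(samples.mean(), samples.std())
```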
arXiv Detail & Related papers (2023-02-09T16:12:47Z)
- Simulator-Based Inference with Waldo: Confidence Regions by Leveraging Prediction Algorithms and Posterior Estimators for Inverse Problems [4.212344009251363]
WALDO is a novel method to construct confidence regions with finite-sample conditional validity.
We apply our method to a recent high-energy physics problem, where prediction with deep neural networks has previously led to estimates with prediction bias.
arXiv Detail & Related papers (2022-05-31T10:43:18Z)
- The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification method.
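In deep evidential regression, the network outputs the parameters (gamma, nu, alpha, beta) of a Normal-Inverse-Gamma distribution, from which aleatoric and epistemic variance follow in closed form. The parameter values below are made up for illustration, not fitted outputs.

```python
# Closed-form uncertainty from Normal-Inverse-Gamma evidential outputs
# (gamma, nu, alpha, beta) -- illustrative values, not fitted ones.
def evidential_uncertainty(gamma, nu, alpha, beta):
    prediction = gamma                        # E[mu]
    aleatoric = beta / (alpha - 1.0)          # E[sigma^2], data noise
    epistemic = beta / (nu * (alpha - 1.0))   # Var[mu], model uncertainty
    return prediction, aleatoric, epistemic

pred, alea, epi = evidential_uncertainty(gamma=0.0, nu=2.0, alpha=3.0, beta=1.0)
print(pred, alea, epi)  # 0.0 0.5 0.25
```

The critique summarized above concerns how reliably these quantities track true uncertainty when the parameters are learned by a network, not the algebra itself.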
arXiv Detail & Related papers (2022-05-20T10:10:32Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We present a principled way to measure the uncertainty of a classifier's predictions, based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
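The Nadaraya-Watson estimate underlying this approach can be sketched directly: each training label votes with a kernel weight based on its distance to the query point, yielding an estimate of the conditional label distribution. The toy data and bandwidth below are illustrative.

```python
import numpy as np

def nadaraya_watson_probs(x_query, X, Y, h=0.5):
    """Kernel-weighted estimate of p(y | x): each training label votes
    with a Gaussian kernel weight based on distance to the query."""
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * h * h))
    probs = np.array([w[Y == c].sum() for c in np.unique(Y)])
    return probs / probs.sum()

X = np.array([[0.0], [0.1], [1.0], [1.1]])
Y = np.array([0, 0, 1, 1])
p = nadaraya_watson_probs(np.array([0.05]), X, Y)
print(p)  # heavily favours class 0 near x = 0.05
```

A nearly uniform `p` far from all training points is the kind of signal such nonparametric estimates use to flag uncertain predictions.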
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We study two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and discuss their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Calibration and Uncertainty Quantification of Bayesian Convolutional Neural Networks for Geophysical Applications [0.0]
Subsurface models should provide calibrated probabilities and the associated uncertainties in their predictions.
It has been shown that popular Deep Learning-based models are often miscalibrated, and due to their deterministic nature, provide no means to interpret the uncertainty of their predictions.
We compare three different approaches obtaining probabilistic models based on convolutional neural networks in a Bayesian formalism.
arXiv Detail & Related papers (2021-05-25T17:54:23Z)
- Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
arXiv Detail & Related papers (2021-03-15T16:16:18Z)
- Probabilistic Neighbourhood Component Analysis: Sample Efficient Uncertainty Estimation in Deep Learning [25.8227937350516]
We show that uncertainty estimation capability of state-of-the-art BNNs and Deep Ensemble models degrades significantly when the amount of training data is small.
We propose a probabilistic generalization of the popular sample-efficient non-parametric kNN approach.
Our approach enables deep kNN to accurately quantify underlying uncertainties in its prediction.
arXiv Detail & Related papers (2020-07-18T21:36:31Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.