Variational Inference for Bayesian Neural Networks under Model and
Parameter Uncertainty
- URL: http://arxiv.org/abs/2305.00934v1
- Date: Mon, 1 May 2023 16:38:17 GMT
- Title: Variational Inference for Bayesian Neural Networks under Model and
Parameter Uncertainty
- Authors: Aliaksandr Hubin and Geir Storvik
- Abstract summary: We apply the concept of model uncertainty as a framework for structural learning in BNNs.
We suggest an adaptation of a scalable variational inference approach with reparametrization of marginal inclusion probabilities.
- Score: 12.211659310564425
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian neural networks (BNNs) have recently regained a significant amount
of attention in the deep learning community due to the development of scalable
approximate Bayesian inference techniques. There are several advantages of
using a Bayesian approach: Parameter and prediction uncertainties become easily
available, facilitating rigorous statistical analysis. Furthermore, prior
knowledge can be incorporated. However, so far, there have been no scalable
techniques capable of combining both structural and parameter uncertainty. In
this paper, we apply the concept of model uncertainty as a framework for
structural learning in BNNs and hence make inference in the joint space of
structures/models and parameters. Moreover, we suggest an adaptation of a
scalable variational inference approach with reparametrization of marginal
inclusion probabilities to incorporate the model space constraints.
Experimental results on a range of benchmark datasets show that we obtain
accuracy comparable to the competing models, while the resulting networks
are much sparser than ordinary BNNs.
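The core idea of the abstract, variational inference over the joint space of structures and parameters with reparametrized marginal inclusion probabilities, can be sketched in generic form: give each weight a Gaussian variational posterior and a binary inclusion gate whose Bernoulli probability is relaxed with a concrete (Gumbel-sigmoid) distribution so gradients can flow through it. This is a minimal, hypothetical numpy illustration of the general technique, not the authors' implementation; all names, sizes, and the 0.5 sparsification threshold are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Variational parameters for one small weight matrix (illustrative sizes):
# Gaussian mean/log-std for the weights, plus one logit per weight whose
# sigmoid is the marginal inclusion probability alpha.
mu = rng.normal(size=(4, 3))
log_sigma = np.full((4, 3), -2.0)
incl_logit = rng.normal(size=(4, 3))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_effective_weights(temperature=0.5):
    """One reparametrized draw of (gate * weight).

    Weights use the usual Gaussian location-scale trick; the binary
    inclusion indicators are relaxed with a concrete (Gumbel-sigmoid)
    distribution so that gradients can flow through the gates.
    """
    eps = rng.normal(size=mu.shape)
    w = mu + np.exp(log_sigma) * eps                 # Gaussian reparametrization
    u = rng.uniform(1e-6, 1 - 1e-6, size=mu.shape)
    logistic = np.log(u) - np.log1p(-u)              # Logistic(0, 1) noise
    gate = sigmoid((incl_logit + logistic) / temperature)
    return gate * w

effective = sample_effective_weights()

# After training, one could sparsify the network by keeping only
# weights whose inclusion probability exceeds, say, 0.5.
alpha = sigmoid(incl_logit)
mask = alpha > 0.5
print("inclusion probabilities:\n", np.round(alpha, 2))
print("fraction of weights kept:", mask.mean())
```

In a full training loop the variational parameters `mu`, `log_sigma`, and `incl_logit` would be optimized against an ELBO that includes priors over both weights and structure; the sketch only shows the sampling step that makes the inclusion probabilities differentiable.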
Related papers
- Variational Bayesian Bow tie Neural Networks with Shrinkage [0.276240219662896]
We build a relaxed version of the standard feed-forward rectified neural network.
We employ Polya-Gamma data augmentation tricks to render a conditionally linear and Gaussian model.
We derive a variational inference algorithm that avoids distributional assumptions and independence across layers.
arXiv Detail & Related papers (2024-11-17T17:36:30Z)
- Empowering Bayesian Neural Networks with Functional Priors through Anchored Ensembling for Mechanics Surrogate Modeling Applications [0.0]
We present a novel BNN training scheme based on anchored ensembling that can integrate a priori information available in the function space.
The anchoring scheme makes use of low-rank correlations between NN parameters, learnt from pre-training to realizations of the functional prior.
We also perform a study demonstrating that correlations between NN weights, which are often neglected in existing BNN implementations, are critical to appropriately transfer knowledge between the function-space and parameter-space priors.
arXiv Detail & Related papers (2024-09-08T22:27:50Z)
- A Framework for Variational Inference of Lightweight Bayesian Neural Networks with Heteroscedastic Uncertainties [0.31457219084519006]
We show that both the heteroscedastic aleatoric and epistemic variance can be embedded into the variances of learned BNN parameters.
We introduce a relatively simple framework for sampling-free variational inference suitable for lightweight BNNs.
arXiv Detail & Related papers (2024-02-22T13:24:43Z)
- The Risk of Federated Learning to Skew Fine-Tuning Features and Underperform Out-of-Distribution Robustness [50.52507648690234]
Federated learning has the risk of skewing fine-tuning features and compromising the robustness of the model.
We introduce three robustness indicators and conduct experiments across diverse robust datasets.
Our approach markedly enhances the robustness across diverse scenarios, encompassing various parameter-efficient fine-tuning methods.
arXiv Detail & Related papers (2024-01-25T09:18:51Z)
- Uncertainty Quantification in Multivariable Regression for Material Property Prediction with Bayesian Neural Networks [37.69303106863453]
We introduce an approach for uncertainty quantification (UQ) within physics-informed BNNs.
We present case studies for predicting the creep rupture life of steel alloys.
The most promising framework for creep life prediction is BNNs based on Markov Chain Monte Carlo approximation of the posterior distribution of network parameters.
arXiv Detail & Related papers (2023-11-04T19:40:16Z)
- A Variational Bayesian Approach to Learning Latent Variables for Acoustic Knowledge Transfer [55.20627066525205]
We propose a variational Bayesian (VB) approach to learning distributions of latent variables in deep neural network (DNN) models.
Our proposed VB approach can obtain good improvements on target devices, and consistently outperforms 13 state-of-the-art knowledge transfer algorithms.
arXiv Detail & Related papers (2021-10-16T15:54:01Z)
- Post-mortem on a deep learning contest: a Simpson's paradox and the complementary roles of scale metrics versus shape metrics [61.49826776409194]
We analyze a corpus of models made publicly-available for a contest to predict the generalization accuracy of neural network (NN) models.
We identify what amounts to a Simpson's paradox, where "scale" metrics perform well overall but poorly on subpartitions of the data.
We present two novel shape metrics, one data-independent, and the other data-dependent, which can predict trends in the test accuracy of a series of NNs.
arXiv Detail & Related papers (2021-06-01T19:19:49Z)
- Calibration and Uncertainty Quantification of Bayesian Convolutional Neural Networks for Geophysical Applications [0.0]
Subsurface models should provide calibrated probabilities and the associated uncertainties in their predictions.
It has been shown that popular Deep Learning-based models are often miscalibrated, and due to their deterministic nature, provide no means to interpret the uncertainty of their predictions.
We compare three different approaches obtaining probabilistic models based on convolutional neural networks in a Bayesian formalism.
arXiv Detail & Related papers (2021-05-25T17:54:23Z)
- Explaining and Improving Model Behavior with k Nearest Neighbor Representations [107.24850861390196]
We propose using k nearest neighbor representations to identify training examples responsible for a model's predictions.
We show that kNN representations are effective at uncovering learned spurious associations.
Our results indicate that the kNN approach makes the finetuned model more robust to adversarial inputs.
arXiv Detail & Related papers (2020-10-18T16:55:25Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Efficient Ensemble Model Generation for Uncertainty Estimation with Bayesian Approximation in Segmentation [74.06904875527556]
We propose a generic and efficient segmentation framework to construct ensemble segmentation models.
In the proposed method, ensemble models can be efficiently generated by using the layer selection method.
We also devise a new pixel-wise uncertainty loss, which improves the predictive performance.
arXiv Detail & Related papers (2020-05-21T16:08:38Z)
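The ensemble-based pixel-wise uncertainty described in the last entry can be illustrated generically: average the per-pixel class probabilities over ensemble members and use the entropy of the averaged distribution as an uncertainty map. The following numpy sketch is a hypothetical illustration of that common recipe, not code from any of the papers above; all sizes and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend we have K ensemble members, each producing per-pixel class
# logits for a tiny 2-class, 4x4 "segmentation" (random stand-ins here).
K, C, H, W = 5, 2, 4, 4
logits = rng.normal(size=(K, C, H, W))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax over classes

# Ensemble predictive distribution: average member probabilities per pixel.
mean_probs = probs.mean(axis=0)

# Pixel-wise uncertainty: entropy of the averaged distribution.
entropy = -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=0)

prediction = mean_probs.argmax(axis=0)  # per-pixel class decision
print("most uncertain pixel entropy:", float(entropy.max()))
```

For a binary segmentation the entropy is bounded by log 2, so the map is easy to normalize and threshold; pixels where the members disagree get entropy near the maximum.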
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.