Uncertain Bayesian Networks: Learning from Incomplete Data
- URL: http://arxiv.org/abs/2208.04221v1
- Date: Mon, 8 Aug 2022 15:46:44 GMT
- Title: Uncertain Bayesian Networks: Learning from Incomplete Data
- Authors: Conrad D. Hougen, Lance M. Kaplan, Federico Cerutti, Alfred O. Hero III
- Abstract summary: When historical data are limited, the conditional probabilities associated with the nodes of Bayesian networks are uncertain.
Second-order estimation methods provide a framework for both estimating the probabilities and quantifying the uncertainty.
We evaluate various methods to learn the posterior of the parameters through the desired and empirically derived strength of confidence bounds for various queries.
- Score: 30.09565247029203
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: When the historical data are limited, the conditional probabilities
associated with the nodes of Bayesian networks are uncertain and can be
empirically estimated. Second-order estimation methods provide a framework for
both estimating the probabilities and quantifying the uncertainty in these
estimates. We refer to these cases as uncertain or second-order Bayesian
networks. When such data are complete, i.e., all variable values are observed
for each instantiation, the conditional probabilities are known to be
Dirichlet-distributed. This paper improves the current state-of-the-art
approaches for handling uncertain Bayesian networks by enabling them to learn
distributions for their parameters, i.e., conditional probabilities, with
incomplete data. We extensively evaluate various methods to learn the posterior
of the parameters through the desired and empirically derived strength of
confidence bounds for various queries.
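As the abstract notes, with complete data the conditional probabilities are Dirichlet-distributed. A minimal sketch of this second-order view for a single conditional probability table column, assuming a uniform Dirichlet prior and hypothetical counts (illustrative only, not the paper's implementation):

```python
import numpy as np

# Hypothetical complete-data counts for one node with 3 states,
# conditioned on a single parent configuration.
counts = np.array([12.0, 5.0, 3.0])
prior = np.ones_like(counts)  # assumed uniform Dirichlet prior

# With complete data, the posterior over the conditional probability
# vector is Dirichlet(alpha) with alpha = prior + counts.
alpha = prior + counts
alpha0 = alpha.sum()

# First-order point estimates and second-order (epistemic) uncertainty:
mean = alpha / alpha0
var = alpha * (alpha0 - alpha) / (alpha0**2 * (alpha0 + 1))

print("posterior mean:", mean)
print("posterior std: ", np.sqrt(var))
```

The incomplete-data case treated in the paper is harder precisely because unobserved variable values break this simple conjugate count-update.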
Related papers
- Ensemble Neural Networks for Remaining Useful Life (RUL) Prediction [0.39287497907611874]
A core part of maintenance planning is a monitoring system that provides a good prognosis on health and degradation.
Here, we propose ensemble neural networks for probabilistic RUL prediction, which consider and decouple both types of uncertainty.
This method is tested on NASA's turbofan jet engine CMAPSS dataset.
arXiv Detail & Related papers (2023-09-21T19:38:44Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Reliability-Aware Prediction via Uncertainty Learning for Person Image Retrieval [51.83967175585896]
UAL aims at providing reliability-aware predictions by considering data uncertainty and model uncertainty simultaneously.
Data uncertainty captures the "noise" inherent in the sample, while model uncertainty depicts the model's confidence in the sample's prediction.
arXiv Detail & Related papers (2022-10-24T17:53:20Z)
- Uncertainty Modeling for Out-of-Distribution Generalization [56.957731893992495]
We argue that the feature statistics can be properly manipulated to improve the generalization ability of deep learning models.
Common methods often consider the feature statistics as deterministic values measured from the learned features.
We improve the network generalization ability by modeling the uncertainty of domain shifts with synthesized feature statistics during training.
arXiv Detail & Related papers (2022-02-08T16:09:12Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We present a principled way to measure the uncertainty of predictions for a classifier, based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Deep Probability Estimation [14.659180336823354]
We investigate probability estimation from high-dimensional data using deep neural networks.
We evaluate existing methods on the synthetic data as well as on three real-world probability estimation tasks.
arXiv Detail & Related papers (2021-11-21T03:55:50Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Handling Epistemic and Aleatory Uncertainties in Probabilistic Circuits [18.740781076082044]
We propose an approach that overcomes the independence assumption underlying most approaches to a large class of probabilistic reasoning problems.
We provide an algorithm for Bayesian learning from sparse, albeit complete, observations.
Each leaf of such circuits is labelled with a beta-distributed random variable that provides us with an elegant framework for representing uncertain probabilities.
arXiv Detail & Related papers (2021-02-22T10:03:15Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts [33.45069308137142]
Posterior Network (PostNet) predicts an individual closed-form posterior distribution over predicted probabilities for any input sample.
PostNet achieves state-of-the-art results in OOD detection and in uncertainty calibration under dataset shifts.
arXiv Detail & Related papers (2020-06-16T15:16:32Z)
- Uncertainty Estimation for End-To-End Learned Dense Stereo Matching via Probabilistic Deep Learning [0.0]
A novel probabilistic neural network is presented for the task of joint depth and uncertainty estimation from epipolar rectified stereo image pairs.
The network learns a probability distribution from which parameters are sampled for every prediction.
The quality of the estimated depth and uncertainty information is assessed in an extensive evaluation on three different datasets.
arXiv Detail & Related papers (2020-02-10T11:27:52Z)
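One recurring idea in the entries above is estimating a full conditional label distribution rather than a point prediction; the NUQ entry, for example, builds on the Nadaraya-Watson estimator. A minimal sketch of that estimator, where the Gaussian kernel, bandwidth, and toy data are assumptions for illustration, not any paper's actual implementation:

```python
import numpy as np

def nw_label_distribution(x, X_train, y_train, n_classes, bandwidth=0.5):
    """Nadaraya-Watson estimate of p(y | x) for 1-D features:
    a kernel-weighted average of one-hot training labels."""
    w = np.exp(-0.5 * ((X_train - x) / bandwidth) ** 2)  # Gaussian kernel weights
    onehot = np.eye(n_classes)[y_train]                  # labels as one-hot rows
    return w @ onehot / w.sum()                          # weighted label frequencies

# Toy data: class 0 near x=0, class 1 near x=1.
X_train = np.array([0.0, 0.1, 1.0, 1.1])
y_train = np.array([0, 0, 1, 1])
p = nw_label_distribution(0.05, X_train, y_train, n_classes=2)
print("estimated p(y | x=0.05):", p)
```

The spread of this estimated distribution (e.g., its entropy) can then serve as an uncertainty signal for the query point.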
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.