Deep Probability Estimation
- URL: http://arxiv.org/abs/2111.10734v1
- Date: Sun, 21 Nov 2021 03:55:50 GMT
- Title: Deep Probability Estimation
- Authors: Sheng Liu, Aakash Kaku, Weicheng Zhu, Matan Leibovich, Sreyas Mohan,
Boyang Yu, Laure Zanna, Narges Razavian, Carlos Fernandez-Granda
- Abstract summary: We investigate probability estimation from high-dimensional data using deep neural networks.
We evaluate existing methods on the synthetic data as well as on three real-world probability estimation tasks.
- Score: 14.659180336823354
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reliable probability estimation is of crucial importance in many real-world
applications where there is inherent uncertainty, such as weather forecasting,
medical prognosis, or collision avoidance in autonomous vehicles.
Probability-estimation models are trained on observed outcomes (e.g. whether it
has rained or not, or whether a patient has died or not), because the
ground-truth probabilities of the events of interest are typically unknown. The
problem is therefore analogous to binary classification, with the important
difference that the objective is to estimate probabilities rather than
predicting the specific outcome. The goal of this work is to investigate
probability estimation from high-dimensional data using deep neural networks.
There exist several methods to improve the probabilities generated by these
models but they mostly focus on classification problems where the probabilities
are related to model uncertainty. In the case of problems with inherent
uncertainty, it is challenging to evaluate performance without access to
ground-truth probabilities. To address this, we build a synthetic dataset to
study and compare different computable metrics. We evaluate existing methods on
the synthetic data as well as on three real-world probability estimation tasks,
all of which involve inherent uncertainty: precipitation forecasting from radar
images, predicting cancer patient survival from histopathology images, and
predicting car crashes from dashcam videos. Finally, we also propose a new
method for probability estimation using neural networks, which modifies the
training process to promote output probabilities that are consistent with
empirical probabilities computed from the data. The method outperforms existing
approaches on most metrics on the simulated as well as real-world data.
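The abstract only describes the proposed method at a high level, so the following is a minimal illustrative sketch rather than the authors' actual algorithm: one way to push output probabilities toward "empirical probabilities computed from the data" is to bin samples by predicted probability and penalize the gap between each bin's mean prediction and its observed outcome frequency, alongside the usual cross-entropy. The function names, the binning scheme, and the weight `lam` below are assumptions made for illustration.
```python
import torch
import torch.nn.functional as F

def empirical_consistency_loss(probs, labels, n_bins=10):
    """Penalize the gap between the mean predicted probability and the
    empirical outcome frequency inside each predicted-probability bin."""
    binned = probs.clamp(max=1.0 - 1e-6)
    edges = torch.linspace(0.0, 1.0, n_bins + 1, device=probs.device)
    loss = probs.new_zeros(())
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (binned >= lo) & (binned < hi)
        if in_bin.any():
            gap = probs[in_bin].mean() - labels[in_bin].float().mean()
            loss = loss + in_bin.float().mean() * gap ** 2
    return loss

def training_step(model, x, y, lam=1.0):
    """Cross-entropy on observed binary outcomes plus the consistency term."""
    logits = model(x).squeeze(-1)
    probs = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, y.float())
    return ce + lam * empirical_consistency_loss(probs, y)
```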
Related papers
- Deep Ensembles Meets Quantile Regression: Uncertainty-aware Imputation for Time Series [45.76310830281876]
We propose Quantile Sub-Ensembles, a novel method to estimate uncertainty with an ensemble of quantile-regression-based task networks.
Our method not only produces accurate imputations that are robust to high missing rates, but is also computationally efficient due to the fast training of its non-generative model.
arXiv Detail & Related papers (2023-12-03T05:52:30Z)
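The entry above names an ensemble of quantile-regression-based networks; the core ingredient of any quantile-regression member is the standard pinball loss, sketched below as a generic illustration rather than the paper's exact sub-ensemble construction. `members`, `net`, `x`, and `y` in the commented usage are hypothetical placeholders.
```python
import torch

def pinball_loss(pred, target, q):
    """Standard quantile (pinball) loss for quantile level q in (0, 1)."""
    err = target - pred
    return torch.maximum(q * err, (q - 1.0) * err).mean()

# Illustrative use: train one member per quantile level so that the spread
# across members reflects the predictive uncertainty of the imputation.
# quantiles = [0.1, 0.5, 0.9]
# losses = [pinball_loss(net(x), y, q) for net, q in zip(members, quantiles)]
```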
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Toward Robust Uncertainty Estimation with Random Activation Functions [3.0586855806896045]
We propose a novel approach for uncertainty quantification via ensembles, called Random Activation Functions (RAFs) Ensemble.
RAFs Ensemble outperforms state-of-the-art ensemble uncertainty quantification methods on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-02-28T13:17:56Z)
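As a rough sketch of the idea behind an ensemble whose members use randomly drawn activation functions (the layer sizes, activation pool, and ensemble size below are illustrative assumptions, not the paper's configuration):
```python
import random
import torch.nn as nn

ACTIVATIONS = [nn.ReLU, nn.Tanh, nn.ELU, nn.SiLU]  # candidate activation pool

def make_member(in_dim, hidden_dim, out_dim):
    """One ensemble member whose activation function is drawn at random,
    so members disagree more and the ensemble spread is more informative."""
    act = random.choice(ACTIVATIONS)
    return nn.Sequential(
        nn.Linear(in_dim, hidden_dim), act(),
        nn.Linear(hidden_dim, hidden_dim), act(),
        nn.Linear(hidden_dim, out_dim),
    )

ensemble = [make_member(16, 64, 1) for _ in range(5)]
# Each member is trained independently; predictions are averaged and their
# variance across members is used as the uncertainty estimate.
```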
- Uncertainty estimation of pedestrian future trajectory using Bayesian approximation [137.00426219455116]
In dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to use Bayesian approximation to quantify the uncertainty during forecasting that deterministic approaches fail to capture.
The effect of dropout weights and long-term prediction on future-state uncertainty is studied.
arXiv Detail & Related papers (2022-05-04T04:23:38Z)
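The entry above refers to quantifying forecast uncertainty with a dropout-based Bayesian approximation; a common realization of that idea, sketched here as an assumption rather than the authors' exact setup, is Monte Carlo dropout at inference time:
```python
import torch
import torch.nn as nn

def mc_dropout_predict(model, x, n_samples=30):
    """Monte Carlo dropout: keep dropout stochastic at test time, run several
    forward passes, and use their spread as an uncertainty estimate."""
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()  # re-enable only the dropout layers
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)
```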
- Uncertainty Modeling for Out-of-Distribution Generalization [56.957731893992495]
We argue that the feature statistics can be properly manipulated to improve the generalization ability of deep learning models.
Common methods often consider the feature statistics as deterministic values measured from the learned features.
We improve the network generalization ability by modeling the uncertainty of domain shifts with synthesized feature statistics during training.
arXiv Detail & Related papers (2022-02-08T16:09:12Z)
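The entry above describes synthesizing feature statistics during training to model domain-shift uncertainty; a rough sketch along those lines follows, where the tensor shapes, the Gaussian perturbation, and `eps` are assumptions rather than the paper's exact formulation.
```python
import torch

def perturb_feature_statistics(feat, eps=1e-6):
    """Treat per-channel feature statistics as random variables instead of
    fixed values: sample perturbed mean/std whose scale is the batch-level
    variability of those statistics, then re-standardize the features.
    `feat` has shape (batch, channels, height, width)."""
    mu = feat.mean(dim=(2, 3), keepdim=True)
    sigma = (feat.var(dim=(2, 3), keepdim=True) + eps).sqrt()
    mu_scale = (mu.var(dim=0, keepdim=True) + eps).sqrt()        # uncertainty of the mean
    sigma_scale = (sigma.var(dim=0, keepdim=True) + eps).sqrt()  # uncertainty of the std
    new_mu = mu + torch.randn_like(mu) * mu_scale
    new_sigma = sigma + torch.randn_like(sigma) * sigma_scale
    return (feat - mu) / sigma * new_sigma + new_mu
```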
- Probabilistic Deep Learning to Quantify Uncertainty in Air Quality Forecasting [5.007231239800297]
This work applies state-of-the-art techniques of uncertainty quantification in a real-world setting of air quality forecasts.
We describe the training of probabilistic models and evaluate their predictive uncertainties based on empirical performance, reliability of confidence estimates, and practical applicability.
Our experiments demonstrate that the proposed models perform better than previous works in quantifying uncertainty in data-driven air quality forecasts.
arXiv Detail & Related papers (2021-12-05T17:01:18Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We study two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Learning Probabilistic Ordinal Embeddings for Uncertainty-Aware Regression [91.3373131262391]
Uncertainty is the only certainty there is.
Traditionally, the direct regression formulation is considered and the uncertainty is modeled by modifying the output space to a certain family of probabilistic distributions.
How to model the uncertainty within the present-day technologies for regression remains an open issue.
arXiv Detail & Related papers (2021-03-25T06:56:09Z)
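The entry above notes that regression uncertainty is often modeled by mapping the output space to a family of probability distributions; a common instance of this idea, shown here as a generic sketch rather than the paper's probabilistic ordinal-embedding construction, is a Gaussian output head trained with its negative log-likelihood:
```python
import torch.nn as nn

class GaussianRegressionHead(nn.Module):
    """Regression head that outputs a distribution (mean and log-variance)
    instead of a point estimate."""
    def __init__(self, in_dim):
        super().__init__()
        self.mean = nn.Linear(in_dim, 1)
        self.log_var = nn.Linear(in_dim, 1)

    def forward(self, h):
        return self.mean(h), self.log_var(h)

def gaussian_nll(mean, log_var, target):
    # Negative log-likelihood of target under N(mean, exp(log_var)), up to a constant.
    return 0.5 * (log_var + (target - mean) ** 2 / log_var.exp()).mean()
```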
- Handling Epistemic and Aleatory Uncertainties in Probabilistic Circuits [18.740781076082044]
We propose an approach that overcomes the independence assumption behind most approaches to a large class of probabilistic reasoning.
We provide an algorithm for Bayesian learning from sparse, albeit complete, observations.
Each leaf of such circuits is labelled with a beta-distributed random variable that provides us with an elegant framework for representing uncertain probabilities.
arXiv Detail & Related papers (2021-02-22T10:03:15Z)
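To make the "beta-distributed random variable" in the entry above concrete, here is a generic sketch of how a single uncertain probability can be represented and updated from observed counts; it is not the paper's probabilistic-circuit machinery, only the leaf-level idea.
```python
from dataclasses import dataclass

@dataclass
class BetaProbability:
    """An uncertain probability represented as a Beta(alpha, beta) random
    variable and updated by Bayesian counting of observed outcomes."""
    alpha: float = 1.0  # uniform Beta(1, 1) prior pseudo-counts
    beta: float = 1.0

    def update(self, positives: int, negatives: int) -> None:
        self.alpha += positives
        self.beta += negatives

    @property
    def mean(self) -> float:
        return self.alpha / (self.alpha + self.beta)

    @property
    def variance(self) -> float:
        """Shrinks as evidence accumulates, reflecting reduced epistemic uncertainty."""
        n = self.alpha + self.beta
        return (self.alpha * self.beta) / (n * n * (n + 1.0))
```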
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
However, they are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
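One generic way to realize "raising the entropy of predictions towards that of the prior distribution of the labels", sketched here under stated assumptions rather than as the paper's method, is a KL penalty toward the label prior applied to inputs flagged as overconfident; how such inputs are flagged is left out of this sketch.
```python
import torch
import torch.nn.functional as F

def entropy_to_prior_penalty(logits, prior):
    """KL(prior || p_model), which pulls predictions toward the label prior and
    hence raises their entropy whenever the prior is close to uniform.
    `prior` is a strictly positive class-frequency vector of shape (num_classes,)."""
    log_p = F.log_softmax(logits, dim=-1)
    kl = (prior * (prior.clamp_min(1e-12).log() - log_p)).sum(dim=-1)
    return kl.mean()

# Hypothetical usage: apply the penalty only to inputs deemed to lie in regions
# where the model is unjustifiably confident.
# loss = F.cross_entropy(model(x), y) + lam * entropy_to_prior_penalty(model(x_flagged), prior)
```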
- Uncertainty-Gated Stochastic Sequential Model for EHR Mortality Prediction [6.170898159041278]
We present a novel variational recurrent network that estimates the distribution of missing variables, updates hidden states, and predicts the probability of in-hospital mortality.
It is noteworthy that our model can conduct these procedures in a single stream and learn all network parameters jointly in an end-to-end manner.
arXiv Detail & Related papers (2020-03-02T04:41:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.