Hyper-Diffusion: Estimating Epistemic and Aleatoric Uncertainty with a
Single Model
- URL: http://arxiv.org/abs/2402.03478v1
- Date: Mon, 5 Feb 2024 19:39:52 GMT
- Title: Hyper-Diffusion: Estimating Epistemic and Aleatoric Uncertainty with a
Single Model
- Authors: Matthew A. Chan, Maria J. Molina, Christopher A. Metzler
- Abstract summary: We introduce a new approach to ensembling, hyper-diffusion, which allows one to accurately estimate uncertainty with a single model.
We validate our approach on two distinct tasks: x-ray computed tomography (CT) reconstruction and weather temperature forecasting.
- Score: 6.5990719141691825
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Estimating and disentangling epistemic uncertainty (uncertainty that can be
reduced with more training data) and aleatoric uncertainty (uncertainty that is
inherent to the task at hand) is critically important when applying machine
learning (ML) to high-stakes applications such as medical imaging and weather
forecasting. Conditional diffusion models' breakthrough ability to accurately
and efficiently sample from the posterior distribution of a dataset now makes
uncertainty estimation conceptually straightforward: One need only train and
sample from a large ensemble of diffusion models. Unfortunately, training such
an ensemble becomes computationally intractable as the complexity of the model
architecture grows.
In this work we introduce a new approach to ensembling, hyper-diffusion,
which allows one to accurately estimate epistemic and aleatoric uncertainty
with a single model. Unlike existing Monte Carlo dropout based single-model
ensembling methods, hyper-diffusion offers the same prediction accuracy as
multi-model ensembles. We validate our approach on two distinct tasks: x-ray
computed tomography (CT) reconstruction and weather temperature forecasting.
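The ensemble recipe described above admits a standard decomposition of predictive uncertainty: epistemic uncertainty is the disagreement between ensemble members, while aleatoric uncertainty is the noise each member itself predicts. The following is a minimal, hedged sketch of that decomposition (not the paper's actual implementation); the toy "ensemble members" here are simply synthetic sample arrays standing in for posterior samples drawn from separately trained diffusion models.

```python
import numpy as np

def decompose_uncertainty(member_samples):
    """Split predictive uncertainty from ensemble posterior samples.

    member_samples: array of shape (n_members, n_samples, ...), where
    member_samples[m] holds samples drawn from ensemble member m
    conditioned on the same input.
    """
    member_means = member_samples.mean(axis=1)  # (n_members, ...)
    member_vars = member_samples.var(axis=1)    # (n_members, ...)

    # Aleatoric: average within-member variance (irreducible noise).
    aleatoric = member_vars.mean(axis=0)

    # Epistemic: variance of the member means (model disagreement,
    # reducible with more training data).
    epistemic = member_means.var(axis=0)

    return epistemic, aleatoric

# Toy demo: 4 hypothetical "models", 1000 posterior samples each.
rng = np.random.default_rng(0)
true_means = rng.normal(0.0, 0.5, size=4)  # disagreement between models
samples = true_means[:, None] + rng.normal(0.0, 1.0, size=(4, 1000))
epi, ale = decompose_uncertainty(samples)
```

Under this decomposition the aleatoric estimate recovers the per-model sampling noise (variance 1.0 in the toy setup), while the epistemic estimate tracks the spread of the model means.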
Related papers
- Shedding Light on Large Generative Networks: Estimating Epistemic Uncertainty in Diffusion Models [15.352556466952477]
Generative diffusion models are notable for their large parameter count (exceeding 100 million) and operation within high-dimensional image spaces.
We introduce an innovative framework, Diffusion Ensembles for Capturing Uncertainty (DECU), designed for estimating epistemic uncertainty for diffusion models.
arXiv Detail & Related papers (2024-06-05T14:03:21Z)
- Neural parameter calibration and uncertainty quantification for epidemic forecasting [0.0]
We apply a novel and powerful computational method to the problem of learning probability densities on contagion parameters.
Using a neural network, we calibrate an ODE model to data of the spread of COVID-19 in Berlin in 2020.
We show convergence of our method to the true posterior on a simplified SIR model of epidemics, and also demonstrate our method's learning capabilities on a reduced dataset.
arXiv Detail & Related papers (2023-12-05T21:34:59Z)
- Deep Ensembles Meets Quantile Regression: Uncertainty-aware Imputation for Time Series [49.992908221544624]
Time series data often contain numerous missing values; filling them in is the task of time series imputation.
Previous deep learning methods have been shown to be effective for time series imputation.
We propose a non-generative time series imputation method that produces accurate imputations together with inherent uncertainty estimates.
arXiv Detail & Related papers (2023-12-03T05:52:30Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important when forecasting nonstationary processes or processes with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proven that this structured model can efficiently interpolate the resulting tessellation and approximate the multiple hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z)
- Measuring and Modeling Uncertainty Degree for Monocular Depth Estimation [50.920911532133154]
The intrinsic ill-posedness and ordinal-sensitive nature of monocular depth estimation (MDE) models pose major challenges to the estimation of uncertainty degree.
We propose to model the uncertainty of MDE models from the perspective of the inherent probability distributions.
By simply introducing additional training regularization terms, our model, with a surprisingly simple formulation and without requiring extra modules or multiple inferences, can provide uncertainty estimations with state-of-the-art reliability.
arXiv Detail & Related papers (2023-07-19T12:11:15Z)
- Reconstructing Graph Diffusion History from a Single Snapshot [87.20550495678907]
We propose a novel barycenter formulation for reconstructing Diffusion history from A single SnapsHot (DASH).
We prove that estimation error in the diffusion parameters is unavoidable due to the NP-hardness of diffusion parameter estimation.
We also develop an effective solver named DIffusion hiTting Times with Optimal proposal (DITTO).
arXiv Detail & Related papers (2023-06-01T09:39:32Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- Learning Multivariate CDFs and Copulas using Tensor Factorization [39.24470798045442]
Learning the multivariate distribution of data is a core challenge in statistics and machine learning.
In this work, we aim to learn multivariate cumulative distribution functions (CDFs), as they can handle mixed random variables.
We show that any grid sampled version of a joint CDF of mixed random variables admits a universal representation as a naive Bayes model.
We demonstrate the superior performance of the proposed model in several synthetic and real datasets and applications including regression, sampling and data imputation.
arXiv Detail & Related papers (2022-10-13T16:18:46Z)
- Uncertainty Quantification for Traffic Forecasting: A Unified Approach [21.556559649467328]
Uncertainty is an essential consideration for time series forecasting tasks.
In this work, we focus on quantifying the uncertainty of traffic forecasting.
We develop Deep Spatio-Temporal Uncertainty Quantification (DeepSTUQ), which can estimate both aleatoric and epistemic uncertainty.
arXiv Detail & Related papers (2022-08-11T15:21:53Z)
- Improving Trustworthiness of AI Disease Severity Rating in Medical Imaging with Ordinal Conformal Prediction Sets [0.7734726150561088]
A lack of statistically rigorous uncertainty quantification is a significant factor undermining trust in AI results.
Recent developments in distribution-free uncertainty quantification present practical solutions for these issues.
We demonstrate a technique for forming ordinal prediction sets that are guaranteed to contain the correct stenosis severity.
arXiv Detail & Related papers (2022-07-05T18:01:20Z)
- How Much is Enough? A Study on Diffusion Times in Score-based Generative Models [76.76860707897413]
Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution.
We show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process.
arXiv Detail & Related papers (2022-06-10T15:09:46Z)
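Several of the listed papers quantify uncertainty via distribution-free methods. As a hedged sketch (not taken from any of the papers above), the ordinal prediction sets mentioned in the stenosis-severity entry can be illustrated with the standard split-conformal recipe; the function names, the nonconformity score, and the contiguity rule here are all illustrative assumptions.

```python
import numpy as np

def conformal_threshold(cal_probs, cal_labels, alpha=0.1):
    """Calibrate a score threshold on held-out data.

    Nonconformity score: one minus the probability the classifier
    assigns to the true class.
    """
    scores = 1.0 - cal_probs[np.arange(len(cal_labels)), cal_labels]
    n = len(scores)
    # Finite-sample-corrected quantile (standard split-conformal step).
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, level)

def ordinal_prediction_set(probs, q):
    """Build a prediction set of severity grades for one test example.

    Keep every class whose nonconformity is below the threshold, then
    close the set into a contiguous range of grades (assumes at least
    one class passes the threshold).
    """
    keep = np.where(1.0 - probs <= q)[0]
    return np.arange(keep.min(), keep.max() + 1)

# Toy demo: 10 calibration examples, 5 severity grades, true class 2
# always predicted with probability 0.9.
cal_probs = np.full((10, 5), 0.025)
cal_probs[:, 2] = 0.9
cal_labels = np.full(10, 2)
q = conformal_threshold(cal_probs, cal_labels, alpha=0.1)

# Prediction set for one test example at an illustrative threshold.
pred_set = ordinal_prediction_set(np.array([0.05, 0.1, 0.6, 0.2, 0.05]), 0.85)
```

The contiguity step is what makes the set ordinal: a severity rating of "grade 2 or 3" is clinically meaningful, whereas a disconnected set like {0, 4} would not be.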
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences arising from its use.