Uncertainty Estimation by Density Aware Evidential Deep Learning
- URL: http://arxiv.org/abs/2409.08754v1
- Date: Fri, 13 Sep 2024 12:04:45 GMT
- Title: Uncertainty Estimation by Density Aware Evidential Deep Learning
- Authors: Taeseong Yoon, Heeyoung Kim
- Abstract summary: Evidential deep learning (EDL) has shown remarkable success in uncertainty estimation.
We propose a novel method called Density Aware Evidential Deep Learning (DAEDL).
DAEDL integrates the feature space density of the testing example with the output of EDL during the prediction stage.
It demonstrates state-of-the-art performance across diverse downstream tasks related to uncertainty estimation and classification.
- Score: 7.328039160501826
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Evidential deep learning (EDL) has shown remarkable success in uncertainty estimation. However, there is still room for improvement, particularly in out-of-distribution (OOD) detection and classification tasks. The limited OOD detection performance of EDL arises from its inability to reflect the distance between the testing example and training data when quantifying uncertainty, while its limited classification performance stems from its parameterization of the concentration parameters. To address these limitations, we propose a novel method called Density Aware Evidential Deep Learning (DAEDL). DAEDL integrates the feature space density of the testing example with the output of EDL during the prediction stage, while using a novel parameterization that resolves the issues in the conventional parameterization. We prove that DAEDL enjoys a number of favorable theoretical properties. DAEDL demonstrates state-of-the-art performance across diverse downstream tasks related to uncertainty estimation and classification.
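As a rough illustration of the prediction-stage idea, the sketch below scales network evidence by a normalized feature-space density before forming the Dirichlet concentration. The GMM density model, the exponential evidence activation, and the scaling rule are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train_features = rng.normal(size=(1000, 16))   # stand-in penultimate-layer features
test_features = rng.normal(size=(8, 16))
logits = rng.normal(size=(8, 5))               # stand-in network outputs (K = 5)

# Feature-space density model; the paper may use a different estimator.
gmm = GaussianMixture(n_components=10, random_state=0).fit(train_features)

# Normalized density g(x) in (0, 1]: high on-manifold, low off-manifold.
log_dens = gmm.score_samples(test_features)
g = np.exp(log_dens - log_dens.max())

# Density-aware evidence: scaling by g(x) pushes examples far from the
# training data toward the uniform Dirichlet prior (alpha = 1).
evidence = np.exp(logits)                        # exp activation: an assumption
alpha = g[:, None] * evidence + 1.0

prob = alpha / alpha.sum(axis=1, keepdims=True)  # predictive class probabilities
uncertainty = alpha.shape[1] / alpha.sum(axis=1) # K / S: total uncertainty
```

Under this scaling, examples with g(x) near zero collapse toward alpha = 1, so the total uncertainty K/S approaches 1 regardless of how confident the raw logits are.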
Related papers
- Calibrating LLMs with Information-Theoretic Evidential Deep Learning [22.263501423275784]
Fine-tuned large language models (LLMs) often exhibit overconfidence, particularly when trained on small datasets.
Evidential Deep Learning (EDL), an uncertainty-aware approach, enables uncertainty estimation in a single forward pass.
We propose regularizing EDL by incorporating an information bottleneck (IB).
arXiv Detail & Related papers (2025-02-10T11:00:24Z)
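A minimal sketch of how an IB-style penalty could be attached to an EDL loss follows. It pairs the standard EDL expected-squared-error data term with a generic variational-IB KL penalty; all names and the beta weight are illustrative assumptions, not the paper's formulation.

```python
import torch

# Schematic only: combines a standard EDL data term with a generic
# variational-IB penalty on a stochastic bottleneck q(z|x) ~ N(mu, exp(logvar)).
def ib_edl_loss(alpha, y_onehot, mu, logvar, beta=1e-3):
    S = alpha.sum(-1, keepdim=True)
    p_bar = alpha / S                                        # predictive mean
    err = ((y_onehot - p_bar) ** 2).sum(-1)                  # squared-error term
    var = (alpha * (S - alpha) / (S**2 * (S + 1))).sum(-1)   # Dirichlet variance term
    kl_ib = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(-1)  # KL(q(z|x) || N(0, I))
    return (err + var + beta * kl_ib).mean()

B, K, d = 4, 5, 8
alpha = torch.rand(B, K) + 1.0
y = torch.nn.functional.one_hot(torch.randint(K, (B,)), K).float()
loss = ib_edl_loss(alpha, y, torch.zeros(B, d), torch.zeros(B, d))
```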
- Evidential Deep Learning for Uncertainty Quantification and Out-of-Distribution Detection in Jet Identification using Deep Neural Networks [0.6558603851407393]
We use evidential deep learning (EDL) for deep neural network models designed to identify jets in proton-proton collisions at the Large Hadron Collider.
EDL treats learning as an evidence acquisition process designed to provide confidence about test data.
We show how EDL quantifies uncertainty and detects out-of-distribution data, which may lead to improved EDL methods for DL models applied to classification tasks.
arXiv Detail & Related papers (2025-01-10T02:14:29Z)
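To make the evidence-acquisition view concrete, here is a minimal EDL prediction head. The softplus evidence activation and the OOD threshold are common choices assumed for illustration, not details taken from the jet-tagging study.

```python
import numpy as np

def edl_head(logits):
    """Minimal EDL prediction head (sketch)."""
    evidence = np.log1p(np.exp(logits))      # softplus: non-negative evidence
    alpha = evidence + 1.0                   # Dirichlet concentration
    S = alpha.sum(axis=-1, keepdims=True)    # total evidence + K
    prob = alpha / S                         # predictive class probabilities
    u = alpha.shape[-1] / S[..., 0]          # uncertainty mass u = K / S
    return prob, u

logits = np.array([[9.0, 0.5, -1.0],   # strong evidence: in-distribution
                   [0.1, 0.0, 0.2]])   # weak evidence: likely OOD
prob, u = edl_head(logits)
is_ood = u > 0.5                        # threshold chosen for illustration
```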
- Revisiting Essential and Nonessential Settings of Evidential Deep Learning [70.82728812001807]
Evidential Deep Learning (EDL) is an emerging method for uncertainty estimation.
We propose Re-EDL, a simplified yet more effective variant of EDL.
arXiv Detail & Related papers (2024-10-01T04:27:07Z)
- A Comprehensive Survey on Evidential Deep Learning and Its Applications [64.83473301188138]
Evidential Deep Learning (EDL) provides reliable uncertainty estimation with minimal additional computation in a single forward pass.
We first delve into the theoretical foundation of EDL, the subjective logic theory, and discuss its distinctions from other uncertainty estimation frameworks.
We elaborate on its extensive applications across various machine learning paradigms and downstream tasks.
arXiv Detail & Related papers (2024-09-07T05:55:06Z)
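As background for the subjective-logic foundation the survey discusses, the standard evidence-to-opinion correspondence used throughout the EDL literature can be stated compactly: for K classes with per-class evidence e_k,

```latex
\alpha_k = e_k + 1, \qquad S = \sum_{k=1}^{K} \alpha_k, \qquad
b_k = \frac{e_k}{S}, \qquad u = \frac{K}{S}, \qquad \sum_{k=1}^{K} b_k + u = 1 .
```

The belief masses b_k and the uncertainty mass u always sum to one, so low total evidence is automatically converted into high uncertainty.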
- Uncertainty Quantification for Forward and Inverse Problems of PDEs via Latent Global Evolution [110.99891169486366]
We propose a method that integrates efficient and precise uncertainty quantification into a deep learning-based surrogate model.
Our method endows deep learning-based surrogate models with robust and efficient uncertainty quantification capabilities for both forward and inverse problems.
Our method excels at propagating uncertainty over extended auto-regressive rollouts, making it suitable for scenarios involving long-term predictions.
arXiv Detail & Related papers (2024-02-13T11:22:59Z)
- Are Uncertainty Quantification Capabilities of Evidential Deep Learning a Mirage? [35.15844215216846]
EDL methods are trained to learn a meta distribution over the predictive distribution by minimizing a specific objective function.
Recent studies identify limitations of existing methods and conclude that their learned uncertainties are unreliable.
We provide a sharper understanding of the behavior of a wide class of EDL methods by unifying various objective functions.
We conclude that even when EDL methods are empirically effective on downstream tasks, this occurs despite their poor uncertainty quantification capabilities.
arXiv Detail & Related papers (2024-02-09T03:23:39Z)
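For concreteness, one widely used member of the family of objectives being unified is the expected mean-squared-error loss with a KL regularizer (a representative instance from the EDL literature, not the paper's unified formulation):

```latex
\mathcal{L}_i = \sum_{k=1}^{K}\Big[(y_{ik}-\hat p_{ik})^2
  + \frac{\hat p_{ik}(1-\hat p_{ik})}{S_i+1}\Big]
  + \lambda\,\mathrm{KL}\big[\mathrm{Dir}(\tilde{\boldsymbol\alpha}_i)\,\big\|\,\mathrm{Dir}(\mathbf 1)\big],
\qquad \hat p_{ik}=\frac{\alpha_{ik}}{S_i}, \quad S_i=\sum_k \alpha_{ik},
```

where $\tilde{\boldsymbol\alpha}_i = \mathbf y_i + (1-\mathbf y_i)\odot\boldsymbol\alpha_i$ removes the true-class evidence so that only misleading evidence is regularized toward the uniform Dirichlet.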
- Uncertainty Estimation by Fisher Information-based Evidential Deep Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce the Fisher Information Matrix (FIM) to measure the informativeness of the evidence carried by each sample; the loss terms of the objective are then dynamically reweighted so that the network focuses on representation learning for uncertain classes.
arXiv Detail & Related papers (2023-03-03T16:12:59Z)
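The FIM of a Dirichlet distribution has a simple closed form, which is presumably what makes a sample-wise informativeness measure tractable here; how the paper turns this into loss weights is not reproduced. With $\alpha_0 = \sum_k \alpha_k$ and trigamma function $\psi^{(1)}$:

```latex
\mathcal{I}(\boldsymbol{\alpha})
  = \mathrm{diag}\!\left(\psi^{(1)}(\alpha_1), \ldots, \psi^{(1)}(\alpha_K)\right)
  - \psi^{(1)}(\alpha_0)\, \mathbf{1}\mathbf{1}^{\top}.
```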
- Model-Based Uncertainty in Value Functions [89.31922008981735]
We focus on characterizing the variance over values induced by a distribution over MDPs.
Previous work upper bounds the posterior variance over values by solving a so-called uncertainty Bellman equation.
We propose a new uncertainty Bellman equation whose solution converges to the true posterior variance over values.
arXiv Detail & Related papers (2023-02-24T09:18:27Z)
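Schematically, an uncertainty Bellman equation propagates local uncertainty through the transition dynamics with a squared discount; this is the general shape from earlier upper-bound formulations, and the paper's exact operator and convergence argument differ:

```latex
U(s, a) = w(s, a)
  + \gamma^2 \sum_{s', a'} \bar{P}(s' \mid s, a)\, \pi(a' \mid s')\, U(s', a'),
```

where $w(s,a)$ is the local one-step uncertainty induced by the posterior over MDPs and $\bar{P}$ is the mean transition model; the $\gamma^2$ factor reflects that variance, not value, is being propagated.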
- Localized Debiased Machine Learning: Efficient Inference on Quantile Treatment Effects and Beyond [69.83813153444115]
We consider an efficient estimating equation for the (local) quantile treatment effect ((L)QTE) in causal inference.
Debiased machine learning (DML) is a data-splitting approach to estimating high-dimensional nuisances.
We propose localized debiased machine learning (LDML), which avoids this burdensome nuisance-estimation step.
arXiv Detail & Related papers (2019-12-30T14:42:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.