A Simple and Effective Method for Uncertainty Quantification and OOD Detection
- URL: http://arxiv.org/abs/2508.00754v1
- Date: Fri, 01 Aug 2025 16:31:23 GMT
- Title: A Simple and Effective Method for Uncertainty Quantification and OOD Detection
- Authors: Yaxin Ma, Benjamin Colburn, Jose C. Principe
- Abstract summary: We propose an effective method based on feature space density to quantify uncertainty for distributional shifts. Specifically, we leverage the information potential field derived from kernel density estimation to approximate the feature space density of the training set.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Bayesian neural networks and deep ensemble methods have been proposed for uncertainty quantification; however, they are computationally intensive and require large storage. Using a single deterministic model avoids both costs. We propose an effective method based on feature space density to quantify uncertainty under distributional shifts and for out-of-distribution (OOD) detection. Specifically, we leverage the information potential field derived from kernel density estimation to approximate the feature space density of the training set. By comparing this density with the feature space representation of test samples, we can effectively determine whether a distributional shift has occurred. Experiments were conducted on 2D synthetic datasets (Two Moons and Three Spirals) as well as an OOD detection task (CIFAR-10 vs. SVHN). The results demonstrate that our method outperforms baseline models.
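A minimal sketch of the kind of scoring the abstract describes, using a Gaussian kernel density (information potential) over training-set features; the feature extractor, the bandwidth `sigma`, and the rejection threshold `tau` are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def information_potential(train_feats, test_feats, sigma=1.0):
    """KDE-based density estimate (information potential) of the training
    features, evaluated at each test feature vector.

    train_feats: (N, d) array of training-set features (e.g. penultimate layer)
    test_feats:  (M, d) array of features for the samples to score
    Returns an (M,) array; lower values indicate a likely distributional shift.
    """
    # Pairwise squared Euclidean distances between test and train features
    sq_dists = ((test_feats[:, None, :] - train_feats[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel; the KDE normalization constant is omitted because it
    # only rescales the score and cancels out when picking a threshold
    kernel = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return kernel.mean(axis=1)

# Illustrative usage: flag test samples whose estimated density falls below a
# threshold tau calibrated on held-out in-distribution features.
# scores = information_potential(f_train, f_test, sigma=1.0)
# is_ood = scores < tau
```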
Related papers
- Embedding Trajectory for Out-of-Distribution Detection in Mathematical Reasoning [50.84938730450622]
We propose TV score, a trajectory-based method that uses trajectory volatility for OOD detection in mathematical reasoning.
Our method outperforms all traditional algorithms on GLMs under mathematical reasoning scenarios.
Our method can be extended to more applications with high-density features in output spaces, such as multiple-choice questions.
arXiv Detail & Related papers (2024-05-22T22:22:25Z) - High-dimensional and Permutation Invariant Anomaly Detection [0.1450405446885067]
We introduce a permutation-invariant density estimator for particle physics data based on diffusion models.
We demonstrate the efficacy of our methodology by utilizing the learned density as a permutation-invariant anomaly detection score.
arXiv Detail & Related papers (2023-06-06T18:01:03Z) - Machine-Learned Exclusion Limits without Binning [0.0]
We extend the Machine-Learned Likelihoods (MLL) method by including Kernel Density Estimators (KDE) to extract one-dimensional signal and background probability density functions.
We apply the method to two cases of interest at the LHC: a search for exotic Higgs bosons, and a $Z'$ boson decaying into lepton pairs.
arXiv Detail & Related papers (2022-11-09T11:04:50Z) - Out-of-Distribution Detection with Class Ratio Estimation [4.930817402876787]
Density-based out-of-distribution (OOD) detection has recently been shown to be unreliable for the task of detecting OOD images.
We propose to unify density ratio based methods under a novel framework that builds energy-based models and employs differing base distributions.
arXiv Detail & Related papers (2022-06-08T15:20:49Z) - Meta Learning Low Rank Covariance Factors for Energy-Based Deterministic Uncertainty [58.144520501201995]
Bi-Lipschitz regularization of neural network layers preserves relative distances between data instances in the feature spaces of each layer.
With the use of an attentive set encoder, we propose to meta-learn either diagonal or diagonal-plus-low-rank factors to efficiently construct task-specific covariance matrices.
We also propose an inference procedure which utilizes scaled energy to achieve a final predictive distribution.
arXiv Detail & Related papers (2021-10-12T22:04:19Z) - Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space.
arXiv Detail & Related papers (2021-07-05T18:30:26Z) - Meta-Learning for Relative Density-Ratio Estimation [59.75321498170363]
Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities.
We propose a meta-learning method for relative DRE, which estimates the relative density-ratio from a few instances by using knowledge in related datasets.
We empirically demonstrate the effectiveness of the proposed method by using three problems: relative DRE, dataset comparison, and outlier detection.
arXiv Detail & Related papers (2021-07-02T02:13:45Z) - Uncertainty-Aware Deep Calibrated Salient Object Detection [74.58153220370527]
Existing deep neural network based salient object detection (SOD) methods mainly focus on pursuing high network accuracy.
These methods overlook the gap between network accuracy and prediction confidence, known as the confidence uncalibration problem.
We introduce an uncertainty-aware deep SOD network, and propose two strategies to prevent deep SOD networks from being overconfident.
arXiv Detail & Related papers (2020-12-10T23:28:36Z) - Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out-of-distribution data points at test time with a single forward pass.
We scale training of these models with a novel loss function and centroid updating scheme, and match the accuracy of softmax models.
arXiv Detail & Related papers (2020-03-04T12:27:36Z) - Anomaly Detection with Density Estimation [2.0813318162800707]
We propose a new unsupervised anomaly detection technique (ANODE).
By estimating the probability density of the data in a signal region and in sidebands, a likelihood ratio of data vs. background can be constructed (sketched after this list).
ANODE is robust against systematic differences between signal region and sidebands, giving it broader applicability than other methods.
arXiv Detail & Related papers (2020-01-14T19:00:02Z)
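As a rough illustration of the likelihood-ratio construction mentioned in the ANODE entry above (a sketch only; the signal-region/sideband notation is an assumption, not quoted from that abstract):

```latex
% Anomaly score: ratio of the estimated data density to the estimated
% background density in the signal region (SR); the background density in
% the SR is approximated by a density fit on the sidebands (SB) and
% interpolated into the SR.
R(x) \;=\; \frac{\hat{p}_{\text{data}}(x \mid \text{SR})}{\hat{p}_{\text{bg}}(x \mid \text{SR})},
\qquad
\hat{p}_{\text{bg}}(\,\cdot \mid \text{SR}) \;\approx\; \hat{p}_{\text{data}}(\,\cdot \mid \text{SB}) \ \text{interpolated into the SR}.
```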