Density of States Estimation for Out-of-Distribution Detection
- URL: http://arxiv.org/abs/2006.09273v2
- Date: Mon, 22 Jun 2020 18:03:00 GMT
- Title: Density of States Estimation for Out-of-Distribution Detection
- Authors: Warren R. Morningstar, Cusuh Ham, Andrew G. Gallagher, Balaji
Lakshminarayanan, Alexander A. Alemi, Joshua V. Dillon
- Abstract summary: DoSE is the density of states estimator.
We demonstrate DoSE's state-of-the-art performance against other unsupervised OOD detectors.
- Score: 69.90130863160384
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Perhaps surprisingly, recent studies have shown probabilistic model
likelihoods have poor specificity for out-of-distribution (OOD) detection and
often assign higher likelihoods to OOD data than in-distribution data. To
ameliorate this issue we propose DoSE, the density of states estimator. Drawing
on the statistical physics notion of "density of states," the DoSE decision
rule avoids direct comparison of model probabilities, and instead utilizes the
"probability of the model probability," or indeed the frequency of any
reasonable statistic. The frequency is calculated using nonparametric density
estimators (e.g., KDE and one-class SVM) which measure the typicality of
various model statistics given the training data and from which we can flag
test points with low typicality as anomalous. Unlike many other methods, DoSE
requires neither labeled data nor OOD examples. DoSE is modular and can be
trivially applied to any existing, trained model. We demonstrate DoSE's
state-of-the-art performance against other unsupervised OOD detectors on
previously established "hard" benchmarks.
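The decision rule described in the abstract can be illustrated with a toy sketch. Note the assumptions: the paper applies DoSE to trained deep generative models and permits several model statistics with KDE or one-class SVM density estimators, whereas this sketch uses a single fitted Gaussian as the "model," the log-likelihood as the only statistic, and a KDE over that statistic.

```python
# Toy sketch of a DoSE-style decision rule (assumptions: a Gaussian
# stands in for the trained probabilistic model, and the model
# log-likelihood is the single statistic).
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)

# "Trained model": a Gaussian fit to in-distribution training data.
train = rng.normal(loc=0.0, scale=1.0, size=5000)
mu, sigma = train.mean(), train.std()

def model_statistic(x):
    # Here the statistic is the model log-likelihood; DoSE allows any
    # reasonable statistic of the model.
    return norm.logpdf(x, loc=mu, scale=sigma)

# "Density of states": a nonparametric density (KDE) fit to the
# statistic values observed on the training data.
dose = gaussian_kde(model_statistic(train))

def typicality(x):
    # "Probability of the model probability": how typical is this
    # point's statistic relative to the training statistics?
    return dose(model_statistic(np.atleast_1d(x)))

# Low typicality flags anomalies: an OOD point's log-likelihood falls
# far outside the range of statistics seen during training.
in_dist = typicality(0.5)[0]
ood = typicality(8.0)[0]
```

Because the rule only needs statistics computed from an already-trained model, it matches the abstract's claim that DoSE is modular and requires neither labels nor OOD examples.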
Related papers
- PQMass: Probabilistic Assessment of the Quality of Generative Models
using Probability Mass Estimation [8.527898482146103]
We propose a comprehensive sample-based method for assessing the quality of generative models.
The proposed approach enables the estimation of the probability that two sets of samples are drawn from the same distribution.
arXiv Detail & Related papers (2024-02-06T19:39:26Z)
- Out-of-distribution Object Detection through Bayesian Uncertainty
Estimation [10.985423935142832]
We propose a novel, intuitive, and scalable probabilistic object detection method for OOD detection.
Our method is able to distinguish between in-distribution (ID) data and OOD data via weight parameter sampling from proposed Gaussian distributions.
We demonstrate that our Bayesian object detector can achieve satisfactory OOD identification performance by reducing the FPR95 score by up to 8.19% and increasing the AUROC score by up to 13.94% when trained on BDD100k and VOC datasets.
arXiv Detail & Related papers (2023-10-29T19:10:52Z)
- User-defined Event Sampling and Uncertainty Quantification in Diffusion
Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and the samples match data statistics even when sampling from the tails of the distribution.
arXiv Detail & Related papers (2023-06-13T03:42:03Z)
- Learning Robust Statistics for Simulation-based Inference under Model
Misspecification [23.331522354991527]
We propose the first general approach to handle model misspecification that works across different classes of simulation-based inference methods.
We show that our method yields robust inference in misspecified scenarios, whilst still being accurate when the model is well-specified.
arXiv Detail & Related papers (2023-05-25T09:06:26Z)
- Uncertainty Modeling for Out-of-Distribution Generalization [56.957731893992495]
We argue that the feature statistics can be properly manipulated to improve the generalization ability of deep learning models.
Common methods often consider the feature statistics as deterministic values measured from the learned features.
We improve the network generalization ability by modeling the uncertainty of domain shifts with synthesized feature statistics during training.
arXiv Detail & Related papers (2022-02-08T16:09:12Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Feature Shift Detection: Localizing Which Features Have Shifted via
Conditional Distribution Tests [12.468665026043382]
In military sensor networks, users will want to detect when one or more of the sensors has been compromised.
We first define a formalization of this problem as multiple conditional distribution hypothesis tests.
For both efficiency and flexibility, we propose a test statistic based on the density model score function.
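The score-function idea behind that test statistic can be sketched in miniature. The paper's actual method uses conditional distribution hypothesis tests over a learned density model; in this illustration, the density model is an assumed independent Gaussian fit, and the per-feature mean absolute score is a simplified stand-in for the real statistic.

```python
# Toy sketch of a score-based shift statistic: under a density model
# fit to reference data, the score (gradient of the log-density) has
# expectation zero on unshifted data, so a large mean score for
# feature j suggests feature j has shifted. The independent-Gaussian
# model and averaging statistic are simplifying assumptions.
import numpy as np

rng = np.random.default_rng(1)
ref = rng.normal(size=(2000, 3))   # reference data, 3 features
test = rng.normal(size=(500, 3))   # test batch
test[:, 2] += 1.5                  # feature 2 has shifted

mu, sigma = ref.mean(axis=0), ref.std(axis=0)

def score(x):
    # Gradient of the Gaussian log-density w.r.t. x, per feature.
    return -(x - mu) / sigma**2

# Mean absolute score per feature: near zero when nothing has shifted.
stat = np.abs(score(test).mean(axis=0))
shifted = int(stat.argmax())       # feature 2 is flagged
```

Localizing the shift to a specific feature, rather than only detecting that some shift occurred, is the point of the conditional-test formulation in the paper.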
arXiv Detail & Related papers (2021-07-14T18:23:24Z)
- Robust Out-of-Distribution Detection on Deep Probabilistic Generative
Models [0.06372261626436676]
Out-of-distribution (OOD) detection is an important task in machine learning systems.
Deep probabilistic generative models facilitate OOD detection by estimating the likelihood of a data sample.
We propose a new detection metric that operates without outlier exposure.
arXiv Detail & Related papers (2021-06-15T06:36:10Z)
- Learn what you can't learn: Regularized Ensembles for Transductive
Out-of-distribution Detection [76.39067237772286]
We show that current out-of-distribution (OOD) detection algorithms for neural networks produce unsatisfactory results in a variety of OOD detection scenarios.
This paper studies how such "hard" OOD scenarios can benefit from adjusting the detection method after observing a batch of the test data.
We propose a novel method that uses an artificial labeling scheme for the test data and regularization to obtain ensembles of models that produce contradictory predictions only on the OOD samples in a test batch.
arXiv Detail & Related papers (2020-12-10T16:55:13Z)
- Tracking disease outbreaks from sparse data with Bayesian inference [55.82986443159948]
The COVID-19 pandemic provides new motivation for estimating the empirical rate of transmission during an outbreak.
Standard methods struggle to accommodate the partial observability and sparse data common at finer scales.
We propose a Bayesian framework which accommodates partial observability in a principled manner.
arXiv Detail & Related papers (2020-09-12T20:37:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.