Out-of-distribution Detection by Cross-class Vicinity Distribution of
In-distribution Data
- URL: http://arxiv.org/abs/2206.09385v2
- Date: Sun, 27 Aug 2023 07:22:07 GMT
- Title: Out-of-distribution Detection by Cross-class Vicinity Distribution of
In-distribution Data
- Authors: Zhilin Zhao and Longbing Cao and Kun-Yu Lin
- Abstract summary: Deep neural networks for image classification only learn to map in-distribution inputs to their corresponding ground truth labels in training.
This results from the assumption that all samples are independent and identically distributed.
A \textit{Cross-class Vicinity Distribution} is introduced by assuming that an out-of-distribution sample generated by mixing multiple in-distribution samples does not share the same classes as its constituents.
- Score: 36.66825830101456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks for image classification only learn to map
in-distribution inputs to their corresponding ground truth labels in training
without differentiating out-of-distribution samples from in-distribution ones.
This results from the assumption that all samples are independent and
identically distributed (IID) without distributional distinction. Therefore, a
pretrained network learned from in-distribution samples treats
out-of-distribution samples as in-distribution and makes high-confidence
predictions on them in the test phase. To address this issue, we draw
out-of-distribution samples from the vicinity distribution of training
in-distribution samples for learning to reject the prediction on
out-of-distribution inputs. A \textit{Cross-class Vicinity Distribution} is
introduced by assuming that an out-of-distribution sample generated by mixing
multiple in-distribution samples does not share the same classes as its
constituents. We thus improve the discriminability of a pretrained network by
finetuning it with out-of-distribution samples drawn from the cross-class
vicinity distribution, where each out-of-distribution input corresponds to a
complementary label. Experiments on various in-/out-of-distribution datasets
show that the proposed method significantly outperforms the existing methods in
improving the capacity of discriminating between in- and out-of-distribution
samples.
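The abstract's core idea can be illustrated with a minimal sketch: mix two in-distribution samples from different classes to obtain a synthetic out-of-distribution input, and pair it with a complementary target that excludes the constituent classes. The function name, the Beta mixing coefficient, and the uniform complementary target are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cross_class_vicinity_sample(x_a, y_a, x_b, y_b, num_classes, rng=None):
    """Sketch of cross-class vicinity sampling (assumed reading of the paper):
    mix two in-distribution samples from different classes and label the
    result as belonging to NONE of the constituent classes."""
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(1.0, 1.0)           # mixing coefficient (assumed Beta prior)
    x_ood = lam * x_a + (1.0 - lam) * x_b
    # Complementary label: uniform mass over all classes EXCEPT the
    # constituents, encoding "reject the mixed classes".
    target = np.ones(num_classes)
    target[[y_a, y_b]] = 0.0
    target /= target.sum()
    return x_ood, target
```

Finetuning a pretrained classifier on such pairs would then penalize confident predictions on inputs lying between class manifolds.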
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Uncertainty Quantification via Stable Distribution Propagation [60.065272548502]
We propose a new approach for propagating stable probability distributions through neural networks.
Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity.
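Local linearization of a non-linearity lets a Gaussian stay Gaussian as it passes through. A minimal one-dimensional sketch for ReLU (a simplified illustration of the general idea, not the paper's exact method): linearize around the input mean, so the output mean is ReLU of the mean and the variance is scaled by the squared slope.

```python
def relu_local_linearization(mu, var):
    """Propagate N(mu, var) through ReLU via local linearization at the mean
    (illustrative sketch; the paper's construction may differ in detail)."""
    slope = 1.0 if mu > 0 else 0.0   # derivative of ReLU at the mean
    out_mu = max(mu, 0.0)            # ReLU evaluated at the linearization point
    out_var = slope * slope * var    # variance scaled by the squared slope
    return out_mu, out_var
```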
arXiv Detail & Related papers (2024-02-13T09:40:19Z) - Probabilistic Matching of Real and Generated Data Statistics in Generative Adversarial Networks [0.6906005491572401]
We propose a method to ensure that the distributions of certain generated data statistics coincide with the respective distributions of the real data.
We evaluate the method on a synthetic dataset and a real-world dataset and demonstrate improved performance of our approach.
arXiv Detail & Related papers (2023-06-19T14:03:27Z) - Distribution Shift Inversion for Out-of-Distribution Prediction [57.22301285120695]
We propose a portable Distribution Shift Inversion algorithm for Out-of-Distribution (OoD) prediction.
We show that our method provides a general performance gain when plugged into a wide range of commonly used OoD algorithms.
arXiv Detail & Related papers (2023-06-14T08:00:49Z) - Towards In-distribution Compatibility in Out-of-distribution Detection [30.49191281345763]
We propose a new out-of-distribution detection method by adapting both the top-design of deep models and the loss function.
Our method not only achieves state-of-the-art out-of-distribution detection performance but also improves the in-distribution accuracy.
arXiv Detail & Related papers (2022-08-29T09:06:15Z) - Robust Calibration with Multi-domain Temperature Scaling [86.07299013396059]
We develop a systematic calibration model to handle distribution shifts by leveraging data from multiple domains.
Our proposed method -- multi-domain temperature scaling -- exploits robustness across the domains to improve calibration under distribution shift.
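For background, single-domain temperature scaling simply divides the logits by a scalar temperature before the softmax; the multi-domain variant summarized here builds on this, though its details are not in the summary. A minimal sketch:

```python
import numpy as np

def temperature_scaled_softmax(logits, T=1.0):
    """Standard temperature scaling: divide logits by T, then softmax.
    T > 1 softens (flattens) the predicted distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                     # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()
```

In practice T is fit on a held-out calibration set by minimizing negative log-likelihood.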
arXiv Detail & Related papers (2022-06-06T17:32:12Z) - Unrolling Particles: Unsupervised Learning of Sampling Distributions [102.72972137287728]
Particle filtering is used to compute good nonlinear estimates of complex systems.
We show in simulations that the resulting particle filter yields good estimates in a wide range of scenarios.
arXiv Detail & Related papers (2021-10-06T16:58:34Z) - Multi-Class Data Description for Out-of-distribution Detection [25.853322158250435]
Deep-MCDD is effective at detecting out-of-distribution (OOD) samples as well as classifying in-distribution (ID) samples.
By integrating the concept of Gaussian discriminant analysis into deep neural networks, we propose a deep learning objective to learn class-conditional distributions.
arXiv Detail & Related papers (2021-04-02T08:41:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.