OOD Aware Supervised Contrastive Learning
- URL: http://arxiv.org/abs/2310.01942v1
- Date: Tue, 3 Oct 2023 10:38:39 GMT
- Title: OOD Aware Supervised Contrastive Learning
- Authors: Soroush Seifi, Daniel Olmeda Reino, Nikolay Chumerin, Rahaf Aljundi
- Abstract summary: Out-of-Distribution (OOD) detection is a crucial problem for the safe deployment of machine learning models.
We leverage the powerful representations learned with Supervised Contrastive (SupCon) training and propose a holistic approach to learning a classifier robust to OOD data.
Our solution is simple and efficient, and acts as a natural extension of closed-set supervised contrastive representation learning.
- Score: 13.329080722482187
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Out-of-Distribution (OOD) detection, identifying samples that fall
outside of the training distribution, i.e. the in-distribution (ID) data, is a
crucial problem for the safe deployment of machine learning models. Most OOD
works focus on classification models trained with Cross Entropy (CE) and
attempt to fix its inherent issues. In this work we leverage the powerful
representations learned with Supervised Contrastive (SupCon) training and
propose a holistic approach to learning a classifier robust to OOD data. We
extend the SupCon loss with two
additional contrast terms. The first term pushes auxiliary OOD representations
away from ID representations without imposing any constraints on similarities
among auxiliary data. The second term pushes OOD features far from the existing
class prototypes, while pushing ID representations closer to their
corresponding class prototype. When auxiliary OOD data is not available, we
propose feature mixing techniques to efficiently generate pseudo-OOD features.
Our solution is simple and efficient, and acts as a natural extension of
closed-set supervised contrastive representation learning. We compare against
different OOD detection methods on common benchmarks and show
state-of-the-art results.
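The abstract describes the extension only at a high level. The sketch below is a minimal, illustrative reading of its three ingredients: the ID/OOD repulsion term, the prototype term, and feature mixing for pseudo-OOD features. It assumes a PyTorch setup; the function names, temperature, loss weights, and the cross-class mixing scheme are assumptions, not the authors' implementation.

```python
# A minimal sketch, assuming a PyTorch setup; names, temperature tau, loss
# weights, and the mixing scheme are illustrative assumptions only.
import torch
import torch.nn.functional as F


def supcon_loss(z_id, labels, tau=0.1):
    """Standard SupCon loss over L2-normalized in-distribution (ID) features."""
    z = F.normalize(z_id, dim=1)
    sim = z @ z.t() / tau                                    # pairwise similarities
    pos = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    pos.fill_diagonal_(0)                                    # exclude self-pairs
    logits = sim - torch.eye(len(z), device=z.device) * 1e9  # mask self in denominator
    log_prob = sim - torch.logsumexp(logits, dim=1, keepdim=True)
    return -(pos * log_prob).sum(1).div(pos.sum(1).clamp(min=1)).mean()


def ood_aware_terms(z_id, z_ood, labels, prototypes, tau=0.1):
    """The two extra contrast terms described in the abstract:
    (1) push auxiliary OOD features away from ID features, with no constraint
        on similarities among the OOD features themselves;
    (2) push OOD features away from all class prototypes while pulling each ID
        feature toward its own class prototype."""
    z_id, z_ood = F.normalize(z_id, dim=1), F.normalize(z_ood, dim=1)
    protos = F.normalize(prototypes, dim=1)                  # one prototype per class

    # Term 1: ID-vs-OOD repulsion (penalize OOD-to-ID similarity only).
    ood_vs_id = torch.logsumexp(z_ood @ z_id.t() / tau, dim=1).mean()

    # Term 2: prototype attraction for ID features, prototype repulsion for OOD.
    attract = F.cross_entropy(z_id @ protos.t() / tau, labels)
    repel = torch.logsumexp(z_ood @ protos.t() / tau, dim=1).mean()
    return ood_vs_id, attract + repel


def pseudo_ood_by_feature_mixing(z_id, labels, alpha=0.5):
    """Pseudo-OOD features when no auxiliary OOD data is available: convexly mix
    ID features from different classes (one plausible reading of "feature
    mixing"; the paper's exact scheme may differ)."""
    perm = torch.randperm(len(z_id))
    keep = labels != labels[perm]                            # cross-class pairs only
    lam = torch.distributions.Beta(alpha, alpha).sample((int(keep.sum()), 1))
    return lam.to(z_id.device) * z_id[keep] + (1 - lam.to(z_id.device)) * z_id[perm][keep]


# Illustrative total objective (weights lambda1 and lambda2 are assumptions):
# loss = supcon_loss(z_id, labels) + lambda1 * term1 + lambda2 * term2
```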
Related papers
- Can OOD Object Detectors Learn from Foundation Models? [56.03404530594071]
Out-of-distribution (OOD) object detection is a challenging task due to the absence of open-set OOD data.
Inspired by recent advancements in text-to-image generative models, we study the potential of generative models trained on large-scale open-set data to synthesize OOD samples.
We introduce SyncOOD, a simple data curation method that capitalizes on the capabilities of large foundation models.
arXiv Detail & Related papers (2024-09-08T17:28:22Z)
- WeiPer: OOD Detection using Weight Perturbations of Class Projections [11.130659240045544]
We introduce perturbations of the class projections in the final fully connected layer, which creates a richer representation of the input.
We achieve state-of-the-art OOD detection results across multiple benchmarks of the OpenOOD framework.
arXiv Detail & Related papers (2024-05-27T13:38:28Z)
- EAT: Towards Long-Tailed Out-of-Distribution Detection [55.380390767978554]
This paper addresses the challenging task of long-tailed OOD detection.
The main difficulty lies in distinguishing OOD data from samples belonging to the tail classes.
We propose two simple ideas: (1) Expanding the in-distribution class space by introducing multiple abstention classes, and (2) Augmenting the context-limited tail classes by overlaying images onto the context-rich OOD data.
arXiv Detail & Related papers (2023-12-14T13:47:13Z)
- Out-of-distribution Detection Learning with Unreliable Out-of-distribution Sources [73.28967478098107]
Out-of-distribution (OOD) detection discerns OOD data, on which the predictor cannot make valid predictions as it does on in-distribution (ID) data.
It is typically hard to collect real out-of-distribution (OOD) data for training a predictor capable of discerning OOD patterns.
We propose a data generation-based learning method named Auxiliary Task-based OOD Learning (ATOL) that relieves mistaken OOD generation.
arXiv Detail & Related papers (2023-11-06T16:26:52Z)
- From Global to Local: Multi-scale Out-of-distribution Detection [129.37607313927458]
Out-of-distribution (OOD) detection aims to detect "unknown" data whose labels have not been seen during the in-distribution (ID) training process.
Recent progress in representation learning gives rise to distance-based OOD detection.
We propose Multi-scale OOD DEtection (MODE), the first framework to leverage both global visual information and local region details.
arXiv Detail & Related papers (2023-08-20T11:56:25Z)
- General-Purpose Multi-Modal OOD Detection Framework [5.287829685181842]
Out-of-distribution (OOD) detection identifies test samples that differ from the training data, which is critical to ensuring the safety and reliability of machine learning (ML) systems.
We propose a general-purpose weakly-supervised OOD detection framework, called WOOD, that combines a binary classifier and a contrastive learning component.
We evaluate the proposed WOOD model on multiple real-world datasets, and the experimental results demonstrate that the WOOD model outperforms the state-of-the-art methods for multi-modal OOD detection.
arXiv Detail & Related papers (2023-07-24T18:50:49Z)
- How Does Fine-Tuning Impact Out-of-Distribution Detection for Vision-Language Models? [29.75562085178755]
We study how fine-tuning impacts OOD detection for few-shot downstream tasks.
Our results suggest that a proper choice of OOD scores is essential for CLIP-based fine-tuning.
We also show that prompt learning achieves state-of-the-art OOD detection performance, surpassing its zero-shot counterpart.
arXiv Detail & Related papers (2023-06-09T17:16:50Z)
- Out-of-distribution Detection with Implicit Outlier Transformation [72.73711947366377]
Outlier exposure (OE) is powerful in out-of-distribution (OOD) detection.
We propose a novel OE-based approach that makes the model perform well for unseen OOD situations.
arXiv Detail & Related papers (2023-03-09T04:36:38Z)
- Rethinking Out-of-distribution (OOD) Detection: Masked Image Modeling is All You Need [52.88953913542445]
We find, surprisingly, that simply using reconstruction-based methods can significantly boost OOD detection performance.
We take Masked Image Modeling as the pretext task for our OOD detection framework (MOOD).
arXiv Detail & Related papers (2023-02-06T08:24:41Z)
- OODformer: Out-Of-Distribution Detection Transformer [15.17006322500865]
In real-world safety-critical applications, it is important to be aware if a new data point is OOD.
This paper proposes a first-of-its-kind OOD detection architecture named OODformer.
arXiv Detail & Related papers (2021-07-19T15:46:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.