Unlocking the Power of Open Set: A New Perspective for Open-Set Noisy Label Learning
- URL: http://arxiv.org/abs/2305.04203v2
- Date: Fri, 23 Feb 2024 08:55:08 GMT
- Title: Unlocking the Power of Open Set: A New Perspective for Open-Set Noisy Label Learning
- Authors: Wenhai Wan, Xinrui Wang, Ming-Kun Xie, Shao-Yuan Li, Sheng-Jun Huang,
Songcan Chen
- Abstract summary: We propose a novel two-step contrastive learning method to deal with both open-set and closed-set label noise.
Specifically, we incorporate some open-set examples into closed-set classes to enhance performance.
- Score: 58.4201336276109
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning from noisy data has attracted much attention, with most
methods focusing on closed-set label noise. However, a more common scenario in
the real world is the presence of both open-set and closed-set noise. Existing
methods typically identify and handle these two types of label noise
separately by designing a specific strategy for each type. However, in many
real-world scenarios it is challenging to identify open-set examples,
especially when the dataset has been severely corrupted. Unlike previous
works, we explore how models behave when faced with open-set examples, and
find that some open-set examples gradually become integrated into certain
known classes, which benefits the separation among known classes. Motivated by
this phenomenon, we propose CECL (Class Expansion Contrastive Learning), a
novel two-step contrastive learning method that handles both types of label
noise by exploiting the useful information in open-set examples. Specifically,
we incorporate some open-set examples into closed-set classes to enhance
performance, while treating the others as delimiters to improve representation
ability. Extensive experiments on synthetic and real-world datasets with
diverse label noise demonstrate the effectiveness of CECL.
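The abstract only names the two roles that open-set examples can play. The sketch below is a hypothetical reconstruction of that class-expansion idea, not the authors' code: the function names, the 0.9 absorption threshold, and the exact loss shape are all assumptions. Open-set examples whose predictions consistently concentrate on one known class are absorbed into it, while the rest act only as negatives ("delimiters") in a supervised contrastive loss.
```python
# Hypothetical sketch of the class-expansion idea (not the authors' code).
import torch

def expand_classes(probs_history, absorb_thresh=0.9):
    """probs_history: [T, N, C] softmax outputs of N open-set examples over
    T epochs. Returns pseudo-labels in [0, C) for absorbed examples and -1
    for examples kept as delimiters."""
    mean_probs = probs_history.mean(dim=0)       # [N, C], averaged over epochs
    conf, pseudo = mean_probs.max(dim=1)         # per-example peak known class
    pseudo[conf < absorb_thresh] = -1            # low confidence -> delimiter
    return pseudo

def contrastive_loss(feats, labels, temp=0.1):
    """feats: [N, D] L2-normalized embeddings; labels: [N], -1 marks
    delimiters. Delimiters never act as anchors or positives, only as
    negatives in the denominator."""
    sim = feats @ feats.t() / temp                            # [N, N]
    logits = sim - torch.eye(len(feats), device=feats.device) * 1e9  # no self
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & (labels.unsqueeze(1) >= 0)
    pos.fill_diagonal_(False)
    anchor_ok = pos.any(dim=1)                   # anchors need >= 1 positive
    per_anchor = (log_prob * pos).sum(1) / pos.sum(1).clamp(min=1)
    return -per_anchor[anchor_ok].mean()
```
Under these assumptions, absorbed examples enlarge ("expand") their host class with near-class outliers, while delimiter negatives push the class clusters apart, matching the two roles described in the abstract.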
Related papers
- Dirichlet-Based Coarse-to-Fine Example Selection For Open-Set Annotation [37.33424244520009]
We propose a Dirichlet-based Coarse-to-Fine Example Selection (DCFS) strategy.
Our method introduces simplex-based evidential deep learning (EDL) to break the translation invariance of softmax outputs.
Experiments on datasets with various openness ratios demonstrate that DCFS achieves state-of-the-art performance.
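For context, here is a minimal sketch of the simplex-based evidential output that EDL methods of this kind build on. It is the generic recipe, not DCFS's exact head: logits are mapped to non-negative evidence, which parameterizes a Dirichlet whose total strength yields an uncertainty score.
```python
# Generic evidential deep learning head (illustrative, not DCFS's design).
import torch
import torch.nn.functional as F

def dirichlet_outputs(logits):
    """Map raw logits [N, K] to expected probabilities and an uncertainty."""
    evidence = F.softplus(logits)            # non-negative evidence per class
    alpha = evidence + 1.0                   # Dirichlet concentration params
    strength = alpha.sum(dim=1, keepdim=True)
    prob = alpha / strength                  # expected class probabilities
    uncertainty = logits.shape[1] / strength # high when total evidence is low
    return prob, uncertainty.squeeze(1)

logits = torch.randn(4, 10)                  # e.g. a batch of 4, 10 classes
prob, u = dirichlet_outputs(logits)
# Unlike softmax, adding a constant to all logits changes the evidence, so
# the output is not translation invariant; u can flag open-set candidates.
```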
arXiv Detail & Related papers (2024-09-26T07:47:50Z) - Unleashing the Potential of Open-set Noisy Samples Against Label Noise for Medical Image Classification [45.319828759068415]
We propose the Extended Noise-robust Contrastive and Open-set Feature Augmentation framework for medical image classification tasks.
This framework incorporates the Extended Noise-robust Supervised Contrastive Loss, which helps differentiate features among both in-distribution and out-of-distribution classes.
We also develop the Open-set Feature Augmentation module that enriches open-set samples at the feature level and then assigns them dynamic class labels.
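As a rough sketch of what such feature-level enrichment might look like (hypothetical and simplified: the Beta-mixing scheme, the `num_extra_classes` parameter, and the random dynamic assignment are assumptions of this sketch, not the paper's module), open-set features can be interpolated pairwise and each mix given a dynamic label in an extended label space appended after the K in-distribution classes.
```python
# Illustrative open-set feature augmentation (not the paper's exact module).
import torch

def augment_open_set(open_feats, num_id_classes, num_extra_classes, beta=0.5):
    """open_feats: [M, D] features of detected open-set samples. Returns
    mixed features and dynamic labels in [K, K + num_extra_classes)."""
    m = open_feats.size(0)
    perm = torch.randperm(m)
    lam = torch.distributions.Beta(beta, beta).sample((m, 1))
    mixed = lam * open_feats + (1 - lam) * open_feats[perm]  # feature mixup
    # The paper assigns dynamic class labels; here we simply draw a random
    # extra-class index per mixed feature as a stand-in for that assignment.
    labels = num_id_classes + torch.randint(0, num_extra_classes, (m,))
    return mixed, labels
```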
arXiv Detail & Related papers (2024-06-18T05:54:28Z) - Open-Set Facial Expression Recognition [42.62439125553367]
Facial expression recognition (FER) models are typically trained on datasets with a fixed set of seven basic classes.
Recent research points out that real-world expressions go far beyond these basic ones.
We propose the open-set FER task for the first time.
arXiv Detail & Related papers (2024-01-23T05:57:50Z) - Learning with Neighbor Consistency for Noisy Labels [69.83857578836769]
We present a method for learning from noisy labels that leverages similarities between training examples in feature space.
We evaluate our method on datasets with both synthetic (CIFAR-10, CIFAR-100) and realistic (mini-WebVision, Clothing1M, mini-ImageNet-Red) noise.
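The underlying regularizer is easy to sketch. Below is a simplified, batch-local version (the cosine-similarity k-NN and the KL form are assumptions of this sketch): each prediction is pulled toward a similarity-weighted average of its neighbors' predictions, which dampens the influence of individually mislabeled examples.
```python
# Simplified neighbor-consistency regularizer (illustrative, batch-local).
import torch
import torch.nn.functional as F

def neighbor_consistency_loss(feats, logits, k=5, temp=0.1):
    """feats: [N, D] embeddings; logits: [N, C] classifier outputs."""
    feats = F.normalize(feats, dim=1)
    sim = feats @ feats.t()
    sim.fill_diagonal_(-float("inf"))              # exclude self-similarity
    vals, idx = sim.topk(k, dim=1)                 # k nearest neighbors
    w = F.softmax(vals / temp, dim=1)              # similarity weights
    neigh_p = (w.unsqueeze(2) * F.softmax(logits, 1)[idx]).sum(1)  # [N, C]
    log_p = F.log_softmax(logits, dim=1)
    # KL divergence pulling each prediction toward its neighbor consensus
    return F.kl_div(log_p, neigh_p.detach(), reduction="batchmean")
```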
arXiv Detail & Related papers (2022-02-04T15:46:27Z) - Active Learning for Open-set Annotation [38.739845944840454]
We propose a new active learning framework called LfOSA, which boosts the classification performance with an effective sampling strategy to precisely detect examples from known classes for annotation.
The experimental results show that the proposed method can significantly improve the selection quality of known classes, and achieve higher classification accuracy with lower annotation cost than state-of-the-art active learning methods.
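To illustrate this kind of known-class-oriented sampling (a generic sketch, not LfOSA's exact criterion), one can score each unlabeled example by how confidently an auxiliary (K+1)-way detector places it in some known class rather than the extra "unknown" class, and annotate the top-scoring ones.
```python
# Generic known-class-first acquisition (illustrative, not LfOSA's criterion).
import torch

def select_for_annotation(detector_logits, budget):
    """detector_logits: [N, K+1] outputs of an auxiliary detector whose extra
    (K+1)-th class absorbs unknowns. Returns indices of `budget` examples
    most confidently placed in some known class."""
    probs = detector_logits.softmax(dim=1)
    known_conf = probs[:, :-1].max(dim=1).values  # peak known-class probability
    unknown_prob = probs[:, -1]                   # probability of "unknown"
    score = known_conf - unknown_prob             # prefer confident known-class
    return score.topk(budget).indices
```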
arXiv Detail & Related papers (2022-01-18T06:11:51Z) - Open-Set Representation Learning through Combinatorial Embedding [62.05670732352456]
We are interested in identifying novel concepts in a dataset through representation learning based on the examples in both labeled and unlabeled classes.
We propose a learning approach, which naturally clusters examples in unseen classes using the compositional knowledge given by multiple supervised meta-classifiers on heterogeneous label spaces.
The proposed algorithm discovers novel concepts via joint optimization that enhances the discriminativeness of unseen classes while learning representations of known classes that generalize to novel ones.
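A toy illustration of the compositional idea (a hypothetical construction, not the paper's algorithm): several coarse classifiers over different partitions of the known label space jointly produce a code per example, and codes never used by known classes indicate candidate novel concepts.
```python
# Toy combinatorial-code clustering (illustrative, not the paper's algorithm).
import torch

def combinatorial_codes(feats, meta_classifiers):
    """feats: [N, D]; meta_classifiers: heads each trained on a different
    coarse partition of the known labels. The tuple of per-head predictions
    is an example's combinatorial code."""
    preds = [head(feats).argmax(dim=1) for head in meta_classifiers]
    return torch.stack(preds, dim=1)              # [N, num_heads] codes

heads = [torch.nn.Linear(16, 3) for _ in range(4)]  # 4 coarse heads (toy)
codes = combinatorial_codes(torch.randn(8, 16), heads)
# Examples in unseen classes tend to receive code tuples no known class
# produces; grouping identical codes gives a rough clustering of novelty.
```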
arXiv Detail & Related papers (2021-06-29T11:51:57Z) - OpenCoS: Contrastive Semi-supervised Learning for Handling Open-set
Unlabeled Data [65.19205979542305]
Unlabeled data may include out-of-class samples in practice.
OpenCoS is a method for handling this realistic semi-supervised learning scenario.
arXiv Detail & Related papers (2021-06-29T06:10:05Z) - Approximating Instance-Dependent Noise via Instance-Confidence Embedding [87.65718705642819]
Label noise in multiclass classification is a major obstacle to the deployment of learning systems.
We investigate the instance-dependent noise (IDN) model and propose an efficient approximation of IDN to capture the instance-specific label corruption.
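To make the setting concrete, here is a schematic of instance-dependent noise itself (for intuition only, not the paper's approximation): unlike class-conditional noise, the flip probability varies per instance, e.g. ambiguous, low-margin examples are mislabeled more often.
```python
# Schematic instance-dependent label corruption (for intuition only).
import torch

def corrupt_labels(margins, labels, num_classes, max_rate=0.4, gen=None):
    """margins: [N] per-instance confidence margins in [0, 1]; low-margin
    (ambiguous) instances are flipped to a random other class more often."""
    flip_prob = max_rate * (1.0 - margins)        # harder instance -> noisier
    flip = torch.rand(len(labels), generator=gen) < flip_prob
    offset = torch.randint(1, num_classes, (len(labels),), generator=gen)
    noisy = torch.where(flip, (labels + offset) % num_classes, labels)
    return noisy
```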
arXiv Detail & Related papers (2021-03-25T02:33:30Z) - EvidentialMix: Learning with Combined Open-set and Closed-set Noisy
Labels [30.268962418683955]
We study a new variant of the noisy label problem that combines open-set and closed-set noisy labels.
Our results show that our method produces superior classification results and better feature representations than previous state-of-the-art methods.
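One concrete way to act on such a combined-noise diagnosis (a simplified sketch in the spirit of loss-based sample partitioning, not necessarily the paper's exact procedure) is to fit a three-component mixture to per-sample training losses and read the components as clean, closed-set-noisy, and open-set-noisy subsets.
```python
# Three-way sample partition from per-sample losses (simplified illustration).
import numpy as np
from sklearn.mixture import GaussianMixture

def partition_by_loss(losses):
    """losses: [N] numpy array of per-sample training losses. Returns an int
    array in {0, 1, 2}: 0 (lowest-mean component) ~ clean, 1 ~ closed-set
    noise, 2 ~ open-set noise, assuming the groups separate by loss level."""
    gmm = GaussianMixture(n_components=3, random_state=0)
    comp = gmm.fit_predict(losses.reshape(-1, 1))
    order = np.argsort(gmm.means_.ravel())        # sort components by mean
    remap = np.empty(3, dtype=int)
    remap[order] = np.arange(3)
    return remap[comp]
```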
arXiv Detail & Related papers (2020-11-11T11:15:32Z) - Attention-Aware Noisy Label Learning for Image Classification [97.26664962498887]
Deep convolutional neural networks (CNNs) learned on large-scale labeled samples have achieved remarkable progress in computer vision.
The cheapest way to obtain a large body of labeled visual data is to crawl from websites with user-supplied labels, such as Flickr.
This paper proposes the attention-aware noisy label learning approach to improve the discriminative capability of the network trained on datasets with potential label noise.
arXiv Detail & Related papers (2020-09-30T15:45:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.