The Devil is in the Labels: Noisy Label Correction for Robust Scene Graph Generation
- URL: http://arxiv.org/abs/2206.03014v1
- Date: Tue, 7 Jun 2022 05:03:57 GMT
- Title: The Devil is in the Labels: Noisy Label Correction for Robust Scene Graph Generation
- Authors: Lin Li, Long Chen, Yifeng Huang, Zhimeng Zhang, Songyang Zhang, Jun Xiao
- Abstract summary: We propose a novel model-agnostic NoIsy label CorrEction strategy for unbiased SGG models.
NICE can not only detect noisy samples but also reassign higher-quality predicate labels to them.
NICE consists of three components: negative Noisy Sample Detection (Neg-NSD), positive NSD (Pos-NSD), and Noisy Sample Correction (NSC).
- Score: 33.45310571580091
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unbiased SGG has achieved significant progress over recent years. However,
almost all existing SGG models have overlooked the ground-truth annotation
qualities of prevailing SGG datasets, i.e., they always assume: 1) all the
manually annotated positive samples are equally correct; 2) all the
un-annotated negative samples are absolutely background. In this paper, we
argue that both assumptions are inapplicable to SGG: there are numerous "noisy"
ground-truth predicate labels that break these two assumptions, and these noisy
samples actually harm the training of unbiased SGG models. To this end, we
propose a novel model-agnostic NoIsy label CorrEction strategy for SGG: NICE.
NICE can not only detect noisy samples but also reassign higher-quality
predicate labels to them. After NICE training, we obtain a cleaner version of
the SGG dataset for model training. Specifically, NICE consists of three
components: negative Noisy Sample Detection (Neg-NSD), positive NSD (Pos-NSD),
and Noisy Sample Correction (NSC). Firstly, in Neg-NSD, we formulate this task
as an out-of-distribution detection problem, and assign pseudo labels to all
detected noisy negative samples. Then, in Pos-NSD, we use a clustering-based
algorithm to divide all positive samples into multiple sets, and treat the
samples in the noisiest set as noisy positive samples. Lastly, in NSC, we use a
simple but effective weighted KNN to reassign new predicate labels to noisy
positive samples. Extensive experiments on different backbones and tasks attest
to the effectiveness and generalization ability of each component of NICE.
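To make the three-stage pipeline concrete, here is a minimal, illustrative sketch of how Neg-NSD, Pos-NSD, and NSC could compose. It is not the authors' implementation: the relation features, the OOD scoring rule, the density-based clustering criterion, and the KNN weighting below are simplified stand-ins, and all names (`neg_nsd`, `pos_nsd`, `nsc`, `feats`) are hypothetical.

```python
# A minimal, illustrative sketch of the three NICE components (not the
# authors' implementation). The OOD scoring, density clustering, and KNN
# weighting are simplified stand-ins for the paper's actual choices.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def neg_nsd(bg_feats, fg_feats, fg_labels, ood_threshold=0.5):
    """Neg-NSD: treat un-annotated 'background' pairs as candidates and
    flag those that look in-distribution, assigning them a pseudo label
    (here, simply the label of the nearest annotated foreground sample)."""
    nn = NearestNeighbors(n_neighbors=1).fit(fg_feats)
    dist, idx = nn.kneighbors(bg_feats)
    in_dist = dist[:, 0] < ood_threshold  # small distance => not OOD
    return in_dist, fg_labels[idx[:, 0]]

def pos_nsd(feats, n_sets=3):
    """Pos-NSD: split positive samples into sets by local density and
    treat the lowest-density (noisiest) set as noisy positives."""
    k = min(10, len(feats) - 1)
    nn = NearestNeighbors(n_neighbors=k).fit(feats)
    density = 1.0 / (nn.kneighbors(feats)[0].mean(axis=1) + 1e-8)
    sets = KMeans(n_clusters=n_sets, n_init=10).fit_predict(density[:, None])
    noisiest = min(range(n_sets), key=lambda c: density[sets == c].mean())
    return sets == noisiest  # boolean mask over positive samples

def nsc(feats, labels, noisy_mask, k=5):
    """NSC: reassign labels of noisy positives by a distance-weighted
    KNN vote over the remaining (clean) positives."""
    clean_feats, clean_labels = feats[~noisy_mask], labels[~noisy_mask]
    nn = NearestNeighbors(n_neighbors=min(k, len(clean_feats))).fit(clean_feats)
    dist, idx = nn.kneighbors(feats[noisy_mask])
    votes = np.zeros((noisy_mask.sum(), labels.max() + 1))
    for row, (nbrs, d) in enumerate(zip(idx, dist)):
        for j, dj in zip(nbrs, d):
            votes[row, clean_labels[j]] += 1.0 / (dj + 1e-8)
    new_labels = labels.copy()
    new_labels[noisy_mask] = votes.argmax(axis=1)
    return new_labels
```

Under this sketch, Neg-NSD turns in-distribution "background" pairs into extra positives, Pos-NSD isolates the least-dense positives as noisy, and NSC relabels those from their clean neighbors, yielding the cleaner dataset the abstract describes.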
Related papers
- Augment and Criticize: Exploring Informative Samples for Semi-Supervised Monocular 3D Object Detection [64.65563422852568]
We address the challenging monocular 3D object detection problem with a general semi-supervised framework.
We introduce a novel, simple, yet effective 'Augment and Criticize' framework that explores abundant informative samples from unlabeled data.
The two new detectors, dubbed 3DSeMo_DLE and 3DSeMo_FLEX, achieve state-of-the-art results with remarkable improvements of over 3.5% AP_3D/BEV (Easy) on KITTI.
arXiv Detail & Related papers (2023-03-20T16:28:15Z)
- Knockoffs-SPR: Clean Sample Selection in Learning with Noisy Labels [56.81761908354718]
We propose a novel theoretically guaranteed clean sample selection framework for learning with noisy labels.
Knockoffs-SPR can be regarded as a sample selection module for a standard supervised training pipeline.
We further combine it with a semi-supervised algorithm to exploit the support of noisy data as unlabeled data.
arXiv Detail & Related papers (2023-01-02T07:13:28Z)
- Label Semantic Knowledge Distillation for Unbiased Scene Graph Generation [34.20922091969159]
We propose a novel model-agnostic Label Semantic Knowledge Distillation (LS-KD) method for unbiased Scene Graph Generation (SGG).
LS-KD dynamically generates a soft label for each subject-object instance by fusing a predicted Label Semantic Distribution (LSD) with its original one-hot target label.
arXiv Detail & Related papers (2022-08-07T16:19:19Z)
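As a rough illustration of the soft-label fusion LS-KD describes, the sketch below blends a predicted Label Semantic Distribution with the one-hot target. It is not the authors' code: `alpha` is a hypothetical fixed fusion weight, whereas the paper generates the blend dynamically per instance.

```python
# Illustrative LS-KD-style soft-label fusion (not the authors' code):
# blend a predicted Label Semantic Distribution (LSD) with the one-hot
# target label to obtain a per-instance soft label.
import numpy as np

def fuse_soft_label(lsd: np.ndarray, target: int, alpha: float = 0.5) -> np.ndarray:
    one_hot = np.zeros_like(lsd)
    one_hot[target] = 1.0
    soft = alpha * lsd + (1.0 - alpha) * one_hot
    return soft / soft.sum()  # renormalize defensively

# Example: 4 predicate classes, ground-truth class 2.
lsd = np.array([0.1, 0.2, 0.6, 0.1])
print(fuse_soft_label(lsd, target=2))
```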
- Neighborhood Collective Estimation for Noisy Label Identification and Correction [92.20697827784426]
Learning with noisy labels (LNL) aims at designing strategies to improve model performance and generalization by mitigating the effects of model overfitting to noisy labels.
Recent advances employ the predicted label distributions of individual samples to perform noise verification and noisy label correction, which easily gives rise to confirmation bias.
We propose Neighborhood Collective Estimation, in which the predictive reliability of a candidate sample is re-estimated by contrasting it against its feature-space nearest neighbors.
arXiv Detail & Related papers (2022-08-05T14:47:22Z)
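A minimal sketch of the neighborhood-contrast idea above, assuming precomputed features and per-sample predicted class distributions are available; the reliability score (the probability the averaged neighbor predictions assign to a sample's given label) is a simplified stand-in for the paper's estimator.

```python
# Illustrative neighborhood collective estimation (not the authors' code):
# re-estimate each sample's reliability from its feature-space neighbors'
# predictions rather than from its own prediction alone.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def neighborhood_reliability(feats, probs, labels, k=10):
    """feats: (n, d) features; probs: (n, c) predicted distributions;
    labels: (n,) given (possibly noisy) labels. Returns (n,) reliability:
    the probability the neighborhood assigns to each sample's given label."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(feats)
    _, idx = nn.kneighbors(feats)
    neighbor_probs = probs[idx[:, 1:]].mean(axis=1)  # drop self, average neighbors
    return neighbor_probs[np.arange(len(labels)), labels]

# Samples whose given label the neighborhood deems unlikely are flagged noisy.
```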
- NICEST: Noisy Label Correction and Training for Robust Scene Graph Generation [65.78472854070316]
We propose a novel NoIsy label CorrEction and Sample Training strategy for SGG: NICEST.
NICE first detects noisy samples and then reassigns higher-quality soft predicate labels to them.
NICEST can be seamlessly incorporated into any SGG architecture to boost its performance on different predicate categories.
arXiv Detail & Related papers (2022-07-27T06:25:47Z)
- Sample Prior Guided Robust Model Learning to Suppress Noisy Labels [8.119439844514973]
We propose PGDF, a novel framework that learns a deep model to suppress label noise by generating prior knowledge about the samples.
Our framework retains more informative hard-but-clean samples in the cleanly labeled set.
We evaluate our method using synthetic datasets based on CIFAR-10 and CIFAR-100, as well as on the real-world datasets WebVision and Clothing1M.
arXiv Detail & Related papers (2021-12-02T13:09:12Z)
- An Ensemble Noise-Robust K-fold Cross-Validation Selection Method for Noisy Labels [0.9699640804685629]
Large-scale datasets tend to contain mislabeled samples that can be memorized by deep neural networks (DNNs).
We present Ensemble Noise-robust K-fold Cross-Validation Selection (E-NKCVS) to effectively select clean samples from noisy data.
We evaluate our approach on various image and text classification tasks where the labels have been manually corrupted with different noise ratios.
arXiv Detail & Related papers (2021-07-06T02:14:52Z)
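A minimal sketch of ensemble K-fold selection in the spirit of E-NKCVS, using a generic scikit-learn classifier as a stand-in; the retention rule (keep samples whose held-out predictions match their given labels in enough repeated runs) and all parameter values are illustrative.

```python
# Illustrative ensemble K-fold cross-validation selection (not the authors'
# code): a sample is kept as "clean" if held-out models predict its given
# label in enough of the repeated K-fold runs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

def enkcvs_select(X, y, k=5, repeats=3, min_agree=2):
    agree = np.zeros(len(y), dtype=int)
    for r in range(repeats):
        for train_idx, hold_idx in KFold(k, shuffle=True, random_state=r).split(X):
            clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
            agree[hold_idx] += (clf.predict(X[hold_idx]) == y[hold_idx])
    return agree >= min_agree  # boolean mask of presumed-clean samples
```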
- Jo-SRC: A Contrastive Approach for Combating Noisy Labels [58.867237220886885]
We propose a noise-robust approach named Jo-SRC (Joint Sample Selection and Model Regularization based on Consistency).
Specifically, we train the network in a contrastive learning manner. Predictions from two different views of each sample are used to estimate its "likelihood" of being clean or out-of-distribution.
arXiv Detail & Related papers (2021-03-24T07:26:07Z)
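A minimal sketch of the two-view likelihood idea: estimate cleanliness from the confidence in the given label together with the agreement between predictions on two augmented views. The Jensen-Shannon-based consistency below is a simplified stand-in for the paper's estimator, and all names are hypothetical.

```python
# Illustrative two-view cleanliness estimate (not the authors' code):
# given predictions from two augmented views of each sample, low cross-view
# divergence plus high confidence in the given label suggests a clean sample.
import numpy as np

def js_divergence(p, q, eps=1e-12):
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)), axis=-1)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def clean_likelihood(probs_view1, probs_view2, labels):
    """Combine label confidence (averaged over views) with cross-view
    consistency; both lie in [0, 1], and higher means more likely clean."""
    conf = 0.5 * (probs_view1 + probs_view2)[np.arange(len(labels)), labels]
    consistency = 1.0 - js_divergence(probs_view1, probs_view2) / np.log(2)
    return conf * consistency
```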
- SCE: Scalable Network Embedding from Sparsest Cut [20.08464038805681]
Large-scale network embedding learns a latent representation for each node in an unsupervised manner.
A key to the success of such contrastive learning methods is how positive and negative samples are drawn.
In this paper, we propose SCE, an unsupervised network embedding method trained using only negative samples.
arXiv Detail & Related papers (2020-06-30T03:18:15Z)