Co-Learning Meets Stitch-Up for Noisy Multi-label Visual Recognition
- URL: http://arxiv.org/abs/2307.00880v1
- Date: Mon, 3 Jul 2023 09:20:28 GMT
- Title: Co-Learning Meets Stitch-Up for Noisy Multi-label Visual Recognition
- Authors: Chao Liang, Zongxin Yang, Linchao Zhu, Yi Yang
- Abstract summary: This paper focuses on reducing noise based on some inherent properties of multi-label classification and long-tailed learning under noisy cases.
We propose a Stitch-Up augmentation to synthesize a cleaner sample, which directly reduces multi-label noise.
A Heterogeneous Co-Learning framework is further designed to leverage the inconsistency between long-tailed and balanced distributions.
- Score: 70.00984078351927
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In real-world scenarios, collected and annotated data often exhibit the
characteristics of multiple classes and long-tailed distribution. Additionally,
label noise is inevitable in large-scale annotations and hinders the
applications of learning-based models. Although many deep learning based
methods have been proposed for handling long-tailed multi-label recognition or
label noise respectively, learning with noisy labels in long-tailed multi-label
visual data has not been well-studied because of the complexity of long-tailed
distribution entangled with multi-label correlation. To tackle such a critical
yet thorny problem, this paper focuses on reducing noise based on some inherent
properties of multi-label classification and long-tailed learning under noisy
cases. In detail, we propose a Stitch-Up augmentation to synthesize a cleaner
sample, which directly reduces multi-label noise by stitching up multiple noisy
training samples. Equipped with Stitch-Up, a Heterogeneous Co-Learning
framework is further designed to leverage the inconsistency between long-tailed
and balanced distributions, yielding cleaner labels for more robust
representation learning with noisy long-tailed data. To validate our method, we
build two challenging benchmarks, named VOC-MLT-Noise and COCO-MLT-Noise,
respectively. Extensive experiments are conducted to demonstrate the
effectiveness of our proposed method. Compared to a variety of baselines, our
method achieves superior results.
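The abstract does not spell out the exact Stitch-Up operation. The following is a minimal illustrative sketch of one plausible form, in which several noisy samples sharing a target class are concatenated and their multi-label annotations merged by union; the function name `stitch_up`, the side-by-side concatenation, and the union rule are assumptions for illustration, not the paper's exact design:

```python
import numpy as np

def stitch_up(images, label_sets, num_classes):
    """Hypothetical Stitch-Up: concatenate several noisy samples and merge
    (union) their multi-label annotations. If a true label was dropped in
    one noisy annotation, another stitched sample is likely to still carry
    it, so the merged label vector tends to be cleaner."""
    stitched_image = np.concatenate(images, axis=1)  # side-by-side stitch
    merged = np.zeros(num_classes, dtype=np.int64)
    for labels in label_sets:
        for c in labels:
            merged[c] = 1
    return stitched_image, merged

# Two 4x4 single-channel samples; both truly contain classes {1, 3},
# but each noisy annotation is missing one of the labels.
a, b = np.zeros((4, 4)), np.ones((4, 4))
img, y = stitch_up([a, b], [{1}, {1, 3}], num_classes=5)
print(img.shape)   # (4, 8)
print(y.tolist())  # [0, 1, 0, 1, 0]
```

The intuition this sketch captures: missing-label noise that is independent across samples is unlikely to hit the same class in every stitched sample, so the union recovers it.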
Related papers
- Active Label Refinement for Robust Training of Imbalanced Medical Image Classification Tasks in the Presence of High Label Noise [10.232537737211098]
We propose a two-phase approach that combines Learning with Noisy Labels (LNL) and active learning.
We demonstrate that our technique handles class imbalance better than its predecessors by avoiding the misidentification of clean minority-class samples as noisy.
arXiv Detail & Related papers (2024-07-08T14:16:05Z)
- Extracting Clean and Balanced Subset for Noisy Long-tailed Classification [66.47809135771698]
We develop a novel pseudo labeling method using class prototypes from the perspective of distribution matching.
By setting a manually specified probability measure, we can reduce the side effects of noisy and long-tailed data simultaneously.
Our method can extract this class-balanced subset with clean labels, which brings effective performance gains for long-tailed classification with label noise.
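The blurb above describes pseudo-labeling with class prototypes. As a hedged sketch of the general idea (not this paper's specific distribution-matching formulation), each sample can be relabeled by its nearest class prototype in feature space; the function name and cosine-similarity rule below are illustrative assumptions:

```python
import numpy as np

def prototype_pseudo_labels(features, prototypes):
    """Hypothetical prototype-based pseudo-labeling: assign each sample to
    the class whose prototype (e.g. the mean feature of confident samples)
    is closest in cosine similarity, ignoring the possibly noisy given
    labels."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sims = f @ p.T              # cosine similarity to each class prototype
    return sims.argmax(axis=1)  # pseudo label = nearest prototype

prototypes = np.array([[1.0, 0.0], [0.0, 1.0]])
features = np.array([[0.9, 0.2], [0.1, 0.8]])
print(prototype_pseudo_labels(features, prototypes).tolist())  # [0, 1]
```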
arXiv Detail & Related papers (2024-04-10T07:34:37Z)
- Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
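To make the transition-matrix idea above concrete: in a common multi-label formulation, each class c gets a 2x2 matrix T_c with T_c[i, j] = P(noisy label = j | true label = i), and "forward correction" maps the model's clean posterior through T_c before computing the loss. This sketch illustrates that standard construction, not this paper's estimator:

```python
import numpy as np

def forward_correct(clean_pos_prob, T):
    """Map the clean posterior P(y_c = 1 | x) through a known per-class
    transition matrix T to obtain the implied noisy posterior. Training
    against the noisy labels with this corrected posterior is statistically
    consistent when T is correct."""
    p = np.array([1.0 - clean_pos_prob, clean_pos_prob])  # [P(y=0), P(y=1)]
    noisy = p @ T                                         # marginalize over true label
    return noisy[1]

# Example noise model: 20% of true positives are flipped to negative
# (missing-label noise), 5% of true negatives are flipped to positive.
T = np.array([[0.95, 0.05],
              [0.20, 0.80]])
print(forward_correct(0.9, T))  # ≈ 0.725 (0.1*0.05 + 0.9*0.8)
```

Estimating T without anchor points, as the paper proposes, is the hard part; the matrix here is simply assumed known.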
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Combating Noisy Labels in Long-Tailed Image Classification [33.40963778043824]
This paper makes an early effort to tackle the image classification task with both long-tailed distribution and label noise.
Existing noise-robust learning methods cannot work in this scenario as it is challenging to differentiate noisy samples from clean samples of tail classes.
We propose a new learning paradigm based on matching between inferences on weak and strong data augmentations to screen out noisy samples.
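The matching idea above can be sketched as follows: a sample is kept as "clean" only when predictions on its weakly and strongly augmented views agree and the weak-view prediction is confident. The threshold and exact agreement rule are illustrative assumptions, not the paper's precise criterion:

```python
import numpy as np

def screen_noisy(probs_weak, probs_strong, conf_thresh=0.7):
    """Hypothetical agreement-based screening: treat a sample as clean when
    the predicted classes on the weak and strong views match and the
    weak-view prediction is confident."""
    pred_w = probs_weak.argmax(axis=1)
    pred_s = probs_strong.argmax(axis=1)
    confident = probs_weak.max(axis=1) >= conf_thresh
    return (pred_w == pred_s) & confident  # True = keep as clean

probs_weak = np.array([[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]])
probs_strong = np.array([[0.8, 0.2], [0.3, 0.7], [0.4, 0.6]])
print(screen_noisy(probs_weak, probs_strong).tolist())  # [True, False, True]
```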
arXiv Detail & Related papers (2022-09-01T07:31:03Z)
- Label-Noise Learning with Intrinsically Long-Tailed Data [65.41318436799993]
We propose a learning framework for label-noise learning with intrinsically long-tailed data.
Specifically, we propose two-stage bi-dimensional sample selection (TABASCO) to better separate clean samples from noisy samples.
arXiv Detail & Related papers (2022-08-21T07:47:05Z)
- Robust Long-Tailed Learning under Label Noise [50.00837134041317]
This work investigates the label noise problem under long-tailed label distribution.
We propose a robust framework that realizes noise detection for long-tailed learning.
Our framework can naturally leverage semi-supervised learning algorithms to further improve the generalisation.
arXiv Detail & Related papers (2021-08-26T03:45:00Z)
- Learning from Noisy Labels for Entity-Centric Information Extraction [17.50856935207308]
We propose a simple co-regularization framework for entity-centric information extraction.
These models are jointly optimized with task-specific loss, and are regularized to generate similar predictions.
In the end, we can take any of the trained models for inference.
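The co-regularization objective described above pairs each model's task loss with an agreement penalty between the two models' predictions. A minimal sketch, assuming cross-entropy as the task loss and a symmetric KL divergence as the agreement term (the paper may use a different divergence or weighting):

```python
import numpy as np

def co_regularized_loss(probs_a, probs_b, onehot, alpha=1.0):
    """Hypothetical co-regularization objective for two jointly trained
    models: each model's cross-entropy on the (possibly noisy) labels plus
    a symmetric agreement penalty pulling the two predictive distributions
    together."""
    eps = 1e-12
    ce_a = -(onehot * np.log(probs_a + eps)).sum(axis=1).mean()
    ce_b = -(onehot * np.log(probs_b + eps)).sum(axis=1).mean()
    # Symmetric KL divergence as the agreement (co-regularization) term.
    kl_ab = (probs_a * np.log((probs_a + eps) / (probs_b + eps))).sum(axis=1).mean()
    kl_ba = (probs_b * np.log((probs_b + eps) / (probs_a + eps))).sum(axis=1).mean()
    return ce_a + ce_b + alpha * (kl_ab + kl_ba)

probs_a = np.array([[0.7, 0.3]])
probs_b = np.array([[0.6, 0.4]])
onehot = np.array([[1.0, 0.0]])
loss = co_regularized_loss(probs_a, probs_b, onehot)
print(round(float(loss), 4))
```

Because memorizing noisy labels tends to make the two models diverge, the agreement term discourages overfitting to the noise; at inference, either model can be used, as the blurb notes.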
arXiv Detail & Related papers (2021-04-17T22:49:12Z)
- Attention-Aware Noisy Label Learning for Image Classification [97.26664962498887]
Deep convolutional neural networks (CNNs) learned on large-scale labeled samples have achieved remarkable progress in computer vision.
The cheapest way to obtain a large body of labeled visual data is to crawl from websites with user-supplied labels, such as Flickr.
This paper proposes the attention-aware noisy label learning approach to improve the discriminative capability of the network trained on datasets with potential label noise.
arXiv Detail & Related papers (2020-09-30T15:45:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.