Learning advisor networks for noisy image classification
- URL: http://arxiv.org/abs/2211.04177v1
- Date: Tue, 8 Nov 2022 11:44:08 GMT
- Title: Learning advisor networks for noisy image classification
- Authors: Simone Ricci, Tiberio Uricchio, Alberto Del Bimbo
- Abstract summary: We introduce the novel concept of an advisor network to address the problem of noisy labels in image classification.
We train it with a meta-learning strategy so that it can adapt throughout the training of the main model.
We test our method on CIFAR10 and CIFAR100 with synthetic noise, and on Clothing1M, which contains real-world noise, reporting state-of-the-art results.
- Score: 22.77447144331876
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we introduce the novel concept of an advisor network to address the problem of noisy labels in image classification. Deep neural networks (DNNs) are prone to reduced performance and overfitting when trained on data with noisy annotations. Loss-weighting methods aim to mitigate the influence of noisy labels during training, completely removing their contribution. This discarding process prevents DNNs from learning wrong associations between images and labels, but it also reduces the amount of usable data, especially when most of the samples have noisy labels. Our method, by contrast, weights the features extracted by the classifier without altering the loss value of any sample. The advisor helps the classifier focus only on the useful part of the information present in mislabeled examples, allowing it to leverage that data as well. We train the advisor with a meta-learning strategy so that it adapts throughout the training of the main model. We test our method on CIFAR10 and CIFAR100 with synthetic noise, and on Clothing1M, which contains real-world noise, reporting state-of-the-art results.
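The abstract is detailed enough to sketch the mechanism, though not the authors' exact design. Below is a minimal, hypothetical PyTorch sketch of an advisor network that reweights the classifier's features rather than its loss; the class names, the sigmoid gating, and the simplified alternating update are assumptions, not the paper's implementation.

```python
# Hypothetical sketch of advisor-style feature weighting (assumed design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdvisorNet(nn.Module):
    """Maps classifier features to per-dimension weights in (0, 1)."""
    def __init__(self, feat_dim: int, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, feat_dim), nn.Sigmoid(),
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # Reweight the features themselves, not the per-sample loss values.
        return feats * self.mlp(feats)

class AdvisedClassifier(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone              # e.g. a ResNet up to pooling
        self.advisor = AdvisorNet(feat_dim)
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.advisor(self.backbone(x)))

def train_step(model, opt_main, opt_advisor, noisy_batch, clean_batch):
    """One alternating update: main model on noisy data, advisor on a small
    trusted batch. A simplification of the paper's meta-learning strategy."""
    x, y = noisy_batch
    opt_main.zero_grad()
    F.cross_entropy(model(x), y).backward()   # loss values left untouched
    opt_main.step()

    xc, yc = clean_batch                      # small trusted subset (assumed)
    opt_advisor.zero_grad()
    F.cross_entropy(model(xc), yc).backward()
    opt_advisor.step()
```

In a faithful meta-learning setup the advisor would instead be updated through a virtual step of the main model (e.g. with torch.func or the higher library); the alternating scheme above is only the simplest stand-in.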
Related papers
- ERASE: Error-Resilient Representation Learning on Graphs for Label Noise Tolerance [53.73316938815873]
We propose a method called ERASE (Error-Resilient representation learning on graphs for lAbel noiSe tolerancE) to learn representations with error tolerance.
ERASE combines prototype pseudo-labels with propagated denoised labels and updates representations with error resilience.
ERASE outperforms multiple baselines by clear margins across broad noise levels and scales well.
arXiv Detail & Related papers (2023-12-13T17:59:07Z)
- Combating Label Noise With A General Surrogate Model For Sample Selection [84.61367781175984]
We propose to leverage the vision-language surrogate model CLIP to filter noisy samples automatically (a filtering sketch follows this entry).
We validate the effectiveness of our proposed method on both real-world and synthetic noisy datasets.
arXiv Detail & Related papers (2023-10-16T14:43:27Z)
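A minimal sketch of surrogate-model filtering using OpenAI's public CLIP package; the zero-shot agreement rule, the prompts, and the confidence margin are assumptions, not the paper's actual criterion.

```python
# Sketch: flag samples whose given label disagrees with CLIP's zero-shot
# prediction (assumed rule; the paper's exact criterion may differ).
import torch
import clip  # pip install git+https://github.com/openai/CLIP.git
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

class_names = ["t-shirt", "sweater", "jacket"]  # hypothetical label set
text = clip.tokenize([f"a photo of a {c}" for c in class_names]).to(device)

@torch.no_grad()
def looks_clean(image_path: str, given_label: int, margin: float = 0.0) -> bool:
    image = preprocess(Image.open(image_path)).unsqueeze(0).to(device)
    logits_per_image, _ = model(image, text)     # similarity to each prompt
    probs = logits_per_image.softmax(dim=-1).squeeze(0)
    return probs.argmax().item() == given_label and probs.max().item() >= margin
```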
- Blind Knowledge Distillation for Robust Image Classification [19.668440671541546]
Blind Knowledge Distillation is a teacher-student approach for learning with noisy labels.
We use Otsu's algorithm to estimate the tipping point from generalizing to overfitting (a thresholding sketch follows this entry).
We show in our experiments that Blind Knowledge Distillation detects overfitting effectively during training.
arXiv Detail & Related papers (2022-11-21T11:17:07Z)
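Otsu's method picks the threshold that maximizes between-class variance of a roughly bimodal distribution. The sketch below only illustrates that thresholding step, applied, as an assumption, to per-sample training losses; the statistic the paper actually thresholds may differ.

```python
# Sketch: Otsu's threshold to split a bimodal set of per-sample losses.
import numpy as np
from skimage.filters import threshold_otsu

def split_by_otsu(losses: np.ndarray):
    t = threshold_otsu(losses)       # maximizes between-class variance
    return losses <= t, t            # boolean mask of likely-clean samples

# Synthetic two-mode loss distribution for illustration only.
losses = np.concatenate([np.random.gamma(2.0, 0.1, 900),    # clean-ish
                         np.random.gamma(6.0, 0.5, 100)])   # noisy-ish
clean_mask, t = split_by_otsu(losses)
print(f"threshold={t:.3f}, kept={clean_mask.sum()} of {losses.size}")
```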
- Synergistic Network Learning and Label Correction for Noise-robust Image Classification [28.27739181560233]
Deep Neural Networks (DNNs) tend to overfit training label noise, resulting in poorer model performance in practice.
We propose a robust label correction framework combining the ideas of small-loss selection and noise correction (small-loss selection is sketched after this entry).
We demonstrate our method on both synthetic and real-world datasets with different noise types and rates.
arXiv Detail & Related papers (2022-02-27T23:06:31Z)
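Small-loss selection, the first ingredient named in the entry above, is a widely used heuristic: samples with the smallest current loss are treated as likely clean. A minimal sketch, with the keep ratio as an assumed hyperparameter:

```python
# Sketch of small-loss selection: keep the fraction of a batch with the
# smallest losses as likely clean (keep_ratio is an assumed knob).
import torch
import torch.nn.functional as F

def small_loss_select(logits: torch.Tensor, labels: torch.Tensor,
                      keep_ratio: float = 0.7) -> torch.Tensor:
    losses = F.cross_entropy(logits, labels, reduction="none")
    k = max(1, int(keep_ratio * labels.size(0)))
    keep = torch.topk(losses, k, largest=False).indices  # k smallest losses
    return F.cross_entropy(logits[keep], labels[keep])   # loss on kept subset
```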
- Noisy Labels Can Induce Good Representations [53.47668632785373]
We study how architecture affects learning with noisy labels.
We show that training with noisy labels can induce useful hidden representations, even when the model generalizes poorly.
This finding leads to a simple method to improve models trained on noisy labels.
arXiv Detail & Related papers (2020-12-23T18:58:05Z)
- Attention-Aware Noisy Label Learning for Image Classification [97.26664962498887]
Deep convolutional neural networks (CNNs) learned on large-scale labeled samples have achieved remarkable progress in computer vision.
The cheapest way to obtain a large body of labeled visual data is to crawl from websites with user-supplied labels, such as Flickr.
This paper proposes the attention-aware noisy label learning approach to improve the discriminative capability of the network trained on datasets with potential label noise.
arXiv Detail & Related papers (2020-09-30T15:45:36Z)
- Learning from Noisy Labels with Noise Modeling Network [7.523041606515877]
The Noise Modeling Network (NMN) is appended to a convolutional neural network (CNN) and trained jointly with it.
NMN learns the distribution of noise patterns directly from the noisy data (a generic noise-modeling sketch follows this entry).
We show that the integrated NMN/CNN learning system consistently improves the classification performance.
arXiv Detail & Related papers (2020-05-01T20:32:22Z)
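The summary above does not specify the NMN's form; a common generic construction in this family is a learnable label-transition matrix composed with the classifier's softmax output (forward correction). The sketch below shows that generic construction, not the paper's NMN:

```python
# Generic noise-modeling sketch: a learnable transition matrix T maps clean
# class probabilities to noisy-label probabilities (not the paper's NMN).
import torch
import torch.nn as nn

class NoiseAdapter(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        # Initialize near the identity: assume labels are mostly correct.
        self.raw = nn.Parameter(torch.eye(num_classes) * 4.0)

    def forward(self, clean_probs: torch.Tensor) -> torch.Tensor:
        T = self.raw.softmax(dim=1)      # each row is a valid distribution
        return clean_probs @ T           # p(noisy label | x)

def noisy_loss(logits: torch.Tensor, noisy_y: torch.Tensor,
               adapter: NoiseAdapter) -> torch.Tensor:
    # NLL of the corrected probabilities pushes the backbone toward clean
    # posteriors while T absorbs the noise pattern.
    noisy_probs = adapter(logits.softmax(dim=-1)).clamp_min(1e-8)
    return nn.functional.nll_loss(noisy_probs.log(), noisy_y)
```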
- Improving Generalization by Controlling Label-Noise Information in Neural Network Weights [33.85101318266319]
In the presence of noisy or incorrect labels, neural networks have the undesirable tendency to memorize information about the noise.
Standard regularization techniques such as dropout, weight decay or data augmentation sometimes help, but do not prevent this behavior.
We show that, for any training algorithm, low values of the label-noise information stored in the weights correspond to less memorization of noisy labels and better generalization bounds.
arXiv Detail & Related papers (2020-02-19T00:08:30Z)
- DivideMix: Learning with Noisy Labels as Semi-supervised Learning [111.03364864022261]
We propose DivideMix, a framework that divides the training data into a clean labeled set and a noisy unlabeled set using a mixture model on per-sample losses, and then trains on both in a semi-supervised manner.
Experiments on multiple benchmark datasets demonstrate substantial improvements over state-of-the-art methods.
arXiv Detail & Related papers (2020-02-18T06:20:06Z)
- Learning with Out-of-Distribution Data for Audio Classification [60.48251022280506]
We show that detecting and relabelling certain OOD instances, rather than discarding them, can have a positive effect on learning (see the sketch after this entry).
The proposed method is shown to improve the performance of convolutional neural networks by a significant margin.
arXiv Detail & Related papers (2020-02-11T21:08:06Z)
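For the detect-and-relabel idea in the last entry, a minimal sketch: samples whose given label receives low probability are flagged, and those the model predicts confidently are reassigned instead of dropped. The scoring rule and both thresholds are assumptions, not the paper's method.

```python
# Sketch: relabel rather than discard. Samples whose given label gets low
# probability are flagged; confident model predictions replace those labels.
import torch

@torch.no_grad()
def relabel_suspects(model: torch.nn.Module, x: torch.Tensor, y: torch.Tensor,
                     suspect_thresh: float = 0.2, relabel_thresh: float = 0.9):
    probs = model(x).softmax(dim=-1)
    p_given = probs.gather(1, y.unsqueeze(1)).squeeze(1)  # p(given label | x)
    conf, pred = probs.max(dim=-1)
    suspect = p_given < suspect_thresh                    # label looks wrong
    relabel = suspect & (conf >= relabel_thresh)          # model is confident
    return torch.where(relabel, pred, y), relabel
```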