Particle Competition and Cooperation for Semi-Supervised Learning with Label Noise
- URL: http://arxiv.org/abs/2002.05198v1
- Date: Wed, 12 Feb 2020 19:44:59 GMT
- Title: Particle Competition and Cooperation for Semi-Supervised Learning with Label Noise
- Authors: Fabricio Aparecido Breve, Liang Zhao, Marcos Gonçalves Quiles
- Abstract summary: A graph-based semi-supervised learning approach based on Particle competition and cooperation was developed.
This paper presents a new particle competition and cooperation algorithm, specifically designed to increase the robustness to the presence of label noise.
It performs classification of unlabeled nodes and reclassification of the nodes affected by label noise in a single, unified process.
- Score: 6.247917165799351
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Semi-supervised learning methods are usually employed in the classification
of data sets where only a small subset of the data items is labeled. In these
scenarios, label noise is a crucial issue, since the noise may easily spread to
a large portion or even the entire data set, leading to major degradation in
classification accuracy. Therefore, the development of new techniques to reduce
the harmful effects of label noise in semi-supervised learning is vital.
Recently, a graph-based semi-supervised learning approach based on Particle
competition and cooperation was developed. In this model, particles walk in the
graphs constructed from the data sets. Competition takes place among particles
representing different class labels, while the cooperation occurs among
particles with the same label. This paper presents a new particle competition
and cooperation algorithm specifically designed to increase robustness to
label noise. Different
from other methods, the proposed one does not require a separate technique to
deal with label noise. It performs classification of unlabeled nodes and
reclassification of the nodes affected by label noise in a single, unified process.
Computer simulations show the classification accuracy of the proposed method
when applied to some artificial and real-world data sets, in which we introduce
increasing amounts of label noise. The classification accuracy is compared to
those achieved by previous particle competition and cooperation algorithms and
other representative graph-based semi-supervised learning methods using the
same scenarios. Results show the effectiveness of the proposed method.
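The competition and cooperation dynamics described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the graph, the update rule, and all constants here are made up for illustration, the movement rule is simplified to a pure random walk, and only one particle per class is used (the real algorithm uses teams of cooperating particles and a mix of random and greedy movement). Each node keeps a domination level per class; a particle visiting a node weakens rival classes and reinforces its own, with a strength that decays with its distance from its labeled home node.

```python
import random
from collections import deque

# Toy 6-node undirected graph (hypothetical data, not from the paper).
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
labels = {0: 0, 5: 1}            # two labeled seeds: node 0 -> class 0, node 5 -> class 1
n_classes = 2

def bfs_dist(src):
    """Hop distance from src to every node; used to decay particle strength."""
    dist, q = {src: 0}, deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Domination levels: one strength per class at each node, uniform at the start.
dom = {v: [1.0 / n_classes] * n_classes for v in graph}
for v, c in labels.items():      # labeled nodes are fully owned by their class
    dom[v] = [float(k == c) for k in range(n_classes)]

# One particle per labeled node; it remembers its class and distances to home.
particles = [{"cls": c, "pos": v, "dist": bfs_dist(v)} for v, c in labels.items()]

random.seed(0)
for _ in range(2000):
    for p in particles:
        p["pos"] = random.choice(graph[p["pos"]])     # simplified random movement
        v, c = p["pos"], p["cls"]
        if v in labels:
            continue                                  # labeled nodes never change
        delta = 0.1 / (1 + p["dist"][v])              # weaker far from home node
        for k in range(n_classes):
            if k != c:                                # competition: take from rivals
                taken = min(dom[v][k], delta / (n_classes - 1))
                dom[v][k] -= taken
                dom[v][c] += taken                    # ...and reinforce own class

# Each node is classified by its dominant class.
pred = {v: max(range(n_classes), key=lambda k: dom[v][k]) for v in graph}
```

Because labeled nodes are skipped in the update, their domination levels stay fixed; in the paper's noise-tolerant variant, by contrast, even labeled nodes can be re-dominated, which is what allows wrongly labeled nodes to be reclassified in the same process.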
Related papers
- Extracting Clean and Balanced Subset for Noisy Long-tailed Classification [66.47809135771698]
We develop a novel pseudo labeling method using class prototypes from the perspective of distribution matching.
By setting a manually specified probability measure, we can reduce the side effects of noisy and long-tailed data simultaneously.
Our method can extract this class-balanced subset with clean labels, which brings effective performance gains for long-tailed classification with label noise.
arXiv Detail & Related papers (2024-04-10T07:34:37Z)
- Group Benefits Instances Selection for Data Purification [21.977432359384835]
Existing methods for combating label noise are typically designed and tested on synthetic datasets.
We propose a method named GRIP to alleviate the noisy label problem for both synthetic and real-world datasets.
arXiv Detail & Related papers (2024-03-23T03:06:19Z)
- Rethinking Noisy Label Learning in Real-world Annotation Scenarios from the Noise-type Perspective [38.24239397999152]
We propose a novel sample selection-based approach for noisy label learning, called Proto-semi.
Proto-semi divides all samples into the confident and unconfident datasets via warm-up.
By leveraging the confident dataset, prototype vectors are constructed to capture class characteristics.
Empirical evaluations on a real-world annotated dataset substantiate the robustness of Proto-semi in handling the problem of learning from noisy labels.
arXiv Detail & Related papers (2023-07-28T10:57:38Z)
- Robust Product Classification with Instance-Dependent Noise [2.0661025590877777]
Noisy labels in large E-commerce product data (i.e., product items placed into incorrect categories) are a critical issue for the product categorization task.
We study the impact of instance-dependent noise to performance of product title classification.
arXiv Detail & Related papers (2022-09-14T21:45:14Z)
- Learning with Neighbor Consistency for Noisy Labels [69.83857578836769]
We present a method for learning from noisy labels that leverages similarities between training examples in feature space.
We evaluate our method on datasets with both synthetic (CIFAR-10, CIFAR-100) and realistic (mini-WebVision, Clothing1M, mini-ImageNet-Red) noise.
arXiv Detail & Related papers (2022-02-04T15:46:27Z)
- A Second-Order Approach to Learning with Instance-Dependent Label Noise [58.555527517928596]
The presence of label noise often misleads the training of deep neural networks.
We show that the errors in human-annotated labels are more likely to be dependent on the difficulty levels of tasks.
arXiv Detail & Related papers (2020-12-22T06:36:58Z)
- EvidentialMix: Learning with Combined Open-set and Closed-set Noisy Labels [30.268962418683955]
We study a new variant of the noisy label problem that combines the open-set and closed-set noisy labels.
Our results show that our method produces superior classification results and better feature representations than previous state-of-the-art methods.
arXiv Detail & Related papers (2020-11-11T11:15:32Z)
- Class2Simi: A Noise Reduction Perspective on Learning with Noisy Labels [98.13491369929798]
We propose a framework called Class2Simi, which transforms data points with noisy class labels to data pairs with noisy similarity labels.
Class2Simi is computationally efficient: the transformation is performed on the fly within mini-batches, and it only changes the loss on top of the model prediction into a pairwise form.
arXiv Detail & Related papers (2020-06-14T07:55:32Z)
- Towards Noise-resistant Object Detection with Noisy Annotations [119.63458519946691]
Training deep object detectors requires a significant amount of human-annotated images with accurate object labels and bounding box coordinates.
Noisy annotations are much more easily accessible, but they could be detrimental for learning.
We address the challenging problem of training object detectors with noisy annotations, where the noise contains a mixture of label noise and bounding box noise.
arXiv Detail & Related papers (2020-03-03T01:32:16Z)
- Multi-Class Classification from Noisy-Similarity-Labeled Data [98.13491369929798]
We propose a method for learning from only noisy-similarity-labeled data.
We use a noise transition matrix to bridge the class-posterior probability between clean and noisy data.
We build a novel learning system which can assign noise-free class labels for instances.
arXiv Detail & Related papers (2020-02-16T05:10:21Z)
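The noise transition matrix mentioned in the last entry can be sketched with a generic forward-correction example. This is not that paper's system; the matrix T and the posterior values below are made-up numbers. If T[i][j] is the probability that a clean label i is observed as noisy label j, then multiplying the model's clean-class posterior by Tᵀ yields the distribution over noisy labels, which is what the noisy training data actually reflects.

```python
# Hypothetical 3-class noise transition matrix: T[i][j] = P(noisy = j | clean = i).
# Each row sums to 1.
T = [
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.2, 0.6],
]

def forward_correct(clean_posterior):
    """Map clean-class posteriors p(y|x) to noisy-label probabilities Tᵀ p(y|x)."""
    n = len(T)
    return [sum(T[i][j] * clean_posterior[i] for i in range(n)) for j in range(n)]

clean = [0.7, 0.2, 0.1]          # model's clean-label posterior for one instance
noisy = forward_correct(clean)   # distribution over the labels we would observe
```

Training the model so that `noisy` matches the observed (noisy) labels lets the clean posterior `clean` remain consistent with the true labels; because each row of T sums to 1, `noisy` is again a valid probability distribution.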
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.