Learning with Group Noise
- URL: http://arxiv.org/abs/2103.09468v1
- Date: Wed, 17 Mar 2021 06:57:10 GMT
- Title: Learning with Group Noise
- Authors: Qizhou Wang, Jiangchao Yao, Chen Gong, Tongliang Liu, Mingming Gong,
Hongxia Yang, and Bo Han
- Abstract summary: We propose a novel Max-Matching method for learning with group noise.
Performance on a range of real-world datasets across several learning paradigms demonstrates the effectiveness of Max-Matching.
- Score: 106.56780716961732
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning in the presence of noise is a challenging but
practical setting for plenty of real-world applications. Most previous
approaches in this area focus on pairwise relations (causal or correlational
relationships) with noise, such as learning with noisy labels. However, group
noise, in which the coarse-grained relation is accurate while the fine-grained
relations within the group are uncertain, is also universal and has not been
well investigated. The challenge under this setting is to discover the true
pairwise connections concealed by the group relation and its fine-grained
noise. To overcome this issue, we propose a novel Max-Matching method for
learning with group noise. Specifically, it uses a matching mechanism to
evaluate the relation confidence of each object w.r.t. the target, while
accounting for the Non-IID characteristics among objects in the group. Only
the most confident object is used to learn the model, so that most of the
fine-grained noise is dropped. Performance on a range of real-world datasets
across several learning paradigms demonstrates the effectiveness of
Max-Matching.
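As a rough sketch of the selection step described above (not the authors' implementation), the toy function below scores each object in a group against the target and trains only on the max-scoring one; the bilinear weights `W` and the logistic loss are illustrative assumptions, and the paper's Non-IID modeling is omitted:

```python
import numpy as np

def max_matching_loss(group_feats, target_feat, label, W):
    """Score every object in a group against the target, keep only the
    most confident match, and compute the loss on that object alone.
    group_feats: (n, d) array of object features in one group
    target_feat: (d,) feature of the target tied to the group
    label: 1 if the group-level relation is positive, else 0
    W: (d, d) bilinear matching weights (hypothetical parameterization)
    """
    scores = group_feats @ W @ target_feat      # matching confidence, shape (n,)
    best = int(np.argmax(scores))               # index of most confident object
    p = 1.0 / (1.0 + np.exp(-scores[best]))     # sigmoid confidence
    eps = 1e-12                                 # guard against log(0)
    # Logistic loss on the selected object only; the rest of the group
    # (the likely fine-grained noise) contributes nothing to the loss.
    loss = -(label * np.log(p + eps) + (1 - label) * np.log(1.0 - p + eps))
    return loss, best
```

For example, `max_matching_loss(np.random.randn(5, 8), np.random.randn(8), 1, np.eye(8))` scores a five-object group against one target and returns the loss on the single best-matching object.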
Related papers
- Disentangled Noisy Correspondence Learning [56.06801962154915]
Cross-modal retrieval is crucial in understanding latent correspondences across modalities.
DisNCL is a novel information-theoretic framework for feature Disentanglement in Noisy Correspondence Learning.
arXiv Detail & Related papers (2024-08-10T09:49:55Z)
- Robust Learning under Hybrid Noise [24.36707245704713]
We propose a novel unified learning framework called "Feature and Label Recovery" (FLR) to combat the hybrid noise from the perspective of data recovery.
arXiv Detail & Related papers (2024-07-04T16:13:25Z)
- Noisy Pair Corrector for Dense Retrieval [59.312376423104055]
We propose a novel approach called Noisy Pair Corrector (NPC).
NPC consists of a detection module and a correction module.
We conduct experiments on the text-retrieval benchmarks Natural Questions and TriviaQA, and the code-search benchmarks StaQC and SO-DS.
arXiv Detail & Related papers (2023-11-07T08:27:14Z)
- Label Noise: Correcting the Forward-Correction [0.0]
Training neural network classifiers on datasets with label noise poses a risk of overfitting them to the noisy labels.
To tackle this, we propose imposing a lower bound on the training loss to mitigate overfitting, as sketched below.
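The summary does not spell out how the bound is imposed; one well-known way to lower-bound a training loss is the flooding trick, sketched here as an assumption rather than the paper's exact method (`b` is a hypothetical flood level):

```python
import numpy as np

def flooded_loss(raw_loss, b=0.1):
    """Flooding-style lower bound: once the raw loss drops below the
    flood level b, the gradient direction flips, so training hovers
    around loss == b instead of memorizing (possibly mislabeled) examples.
    b = 0.1 is a hypothetical flood level, not a value from the paper.
    """
    return np.abs(raw_loss - b) + b
```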
arXiv Detail & Related papers (2023-07-24T19:41:19Z)
- Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation, which casts this self-supervised task as the estimation of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
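For reference, a minimal sketch of the binary NCE objective this entry alludes to, assuming one noise sample per data sample and per-sample log-densities as inputs (all names are illustrative):

```python
import numpy as np

def log_sigmoid(z):
    """Numerically stable log(sigmoid(z))."""
    return -np.logaddexp(0.0, -z)

def nce_loss(log_p_model_x, log_p_noise_x, log_p_model_y, log_p_noise_y):
    """Binary NCE: a logistic classifier separates data samples x from
    noise samples y using the logit log p_model - log p_noise, so
    minimizing this loss fits the energy-based model p_model to the
    data. Inputs are equal-length arrays of per-sample log-densities.
    """
    logit_x = log_p_model_x - log_p_noise_x    # pushed up for data
    logit_y = log_p_model_y - log_p_noise_y    # pushed down for noise
    return -(log_sigmoid(logit_x).mean() + log_sigmoid(-logit_y).mean())
```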
arXiv Detail & Related papers (2023-01-23T19:57:58Z)
- Deep Active Learning with Noise Stability [24.54974925491753]
Uncertainty estimation for unlabeled data is crucial to active learning.
We propose a novel algorithm that leverages noise stability to estimate data uncertainty.
Our method is generally applicable in various tasks, including computer vision, natural language processing, and structural data analysis.
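A hedged sketch of what a noise-stability uncertainty score might look like, assuming a `predict_fn(params, x)` interface and dict-of-arrays parameters (both hypothetical, not the paper's API):

```python
import numpy as np

def noise_stability_uncertainty(predict_fn, params, x, sigma=0.01, k=10, seed=0):
    """Perturb the model parameters with small Gaussian noise and
    measure how far the prediction for x drifts on average; unstable
    predictions are scored as more uncertain and queried for labels first.
    """
    rng = np.random.default_rng(seed)
    base = predict_fn(params, x)               # unperturbed prediction
    drifts = []
    for _ in range(k):
        noisy = {name: w + sigma * rng.standard_normal(w.shape)
                 for name, w in params.items()}
        drifts.append(np.linalg.norm(predict_fn(noisy, x) - base))
    return float(np.mean(drifts))
```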
arXiv Detail & Related papers (2022-05-26T13:21:01Z)
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from the assumption that the optimal noise distribution equals the data distribution can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- A Committee of Convolutional Neural Networks for Image Classification in the Concurrent Presence of Feature and Label Noise [0.0]
This work is the first attempt to address the concurrent occurrence of both types of noise.
We show experimentally that the margin by which committees outperform single models grows with the noise level.
We propose three committee selection algorithms that outperform a strong baseline algorithm.
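As a generic illustration of a committee prediction (soft voting over member probabilities), not the paper's three selection algorithms:

```python
import numpy as np

def committee_predict(member_probs):
    """Soft-voting committee: average the members' class-probability
    vectors and return the argmax class.
    member_probs: (n_members, n_classes) predictions for one image.
    """
    return int(np.argmax(member_probs.mean(axis=0)))
```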
arXiv Detail & Related papers (2020-04-19T00:22:11Z)
- Towards Noise-resistant Object Detection with Noisy Annotations [119.63458519946691]
Training deep object detectors requires a significant amount of human-annotated images with accurate object labels and bounding box coordinates.
Noisy annotations are much more easily accessible, but they can be detrimental to learning.
We address the challenging problem of training object detectors with noisy annotations, where the noise contains a mixture of label noise and bounding box noise.
arXiv Detail & Related papers (2020-03-03T01:32:16Z)