Pseudo Labels Regularization for Imbalanced Partial-Label Learning
- URL: http://arxiv.org/abs/2303.03946v1
- Date: Mon, 6 Mar 2023 15:21:55 GMT
- Title: Pseudo Labels Regularization for Imbalanced Partial-Label Learning
- Authors: Mingyu Xu, Zheng Lian
- Abstract summary: The combinatorial challenge of partial-label learning and long-tail learning lies in matching the drawn pseudo labels to a decent marginal prior distribution.
By penalizing the pseudo labels of head classes, our method achieves state-of-the-art results on standardized benchmarks.
- Score: 11.93067260471484
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Partial-label learning (PLL) is an important branch of weakly supervised
learning where the single ground truth resides in a set of candidate labels,
while research rarely considers label imbalance. A recent study on
imbalanced partial-label learning proposed that the combinatorial challenge of
partial-label learning and long-tail learning lies in matching the drawn
pseudo labels to a decent marginal prior distribution. However, we believe
that even if the pseudo labels match the prior distribution, the tail classes
will still be difficult to learn because their total weight is too small.
Therefore, we propose a pseudo-label regularization technique specially
designed for PLL. By penalizing the pseudo labels of head classes, our method
achieves state-of-the-art results on standardized benchmarks compared to
previous PLL methods.
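The abstract's core idea, penalizing head-class pseudo labels so that tail classes retain enough weight to be learned, can be illustrated with a minimal numpy sketch. This is an illustrative logit-adjustment-style penalty, not the paper's exact formulation; the function name, the `tau` temperature, and the use of log class frequency as the penalty are all assumptions.

```python
import numpy as np

def regularized_pseudo_labels(logits, candidate_mask, class_freq, tau=1.0):
    """Illustrative sketch (not the paper's exact method): build pseudo-label
    distributions restricted to each sample's candidate set, penalizing head
    classes by subtracting tau * log(class frequency) from the logits."""
    # Head classes (large class_freq) get a larger subtraction, i.e. a penalty.
    adjusted = logits - tau * np.log(class_freq)
    # Non-candidate labels are excluded by sending their logits to -inf.
    adjusted = np.where(candidate_mask, adjusted, -np.inf)
    # Numerically stable softmax over the candidate set.
    exp = np.exp(adjusted - adjusted.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)
```

With `class_freq = [0.9, 0.1]`, a head class that narrowly wins on raw logits can lose to the tail class after the penalty, which is the behavior the abstract describes.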
Related papers
- Pseudo-labelling meets Label Smoothing for Noisy Partial Label Learning [8.387189407144403]
Partial label learning (PLL) is a weakly-supervised learning paradigm where each training instance is paired with a set of candidate labels (partial label).
Noisy PLL (NPLL) relaxes this constraint by allowing some partial labels to not contain the true label, enhancing the practicality of the problem.
We present a minimalistic framework that initially assigns pseudo-labels to images by exploiting the noisy partial labels through a weighted nearest neighbour algorithm.
arXiv Detail & Related papers (2024-02-07T13:32:47Z) - Appeal: Allow Mislabeled Samples the Chance to be Rectified in Partial Label Learning [55.4510979153023]
In partial label learning (PLL), each instance is associated with a set of candidate labels among which only one is ground-truth.
To help these mislabeled samples "appeal," we propose the first appeal-based framework.
arXiv Detail & Related papers (2023-12-18T09:09:52Z) - Complementary Classifier Induced Partial Label Learning [54.61668156386079]
In partial label learning (PLL), each training sample is associated with a set of candidate labels, among which only one is valid.
In disambiguation, the existing works usually do not fully investigate the effectiveness of the non-candidate label set.
In this paper, we use the non-candidate labels to induce a complementary classifier, which naturally forms an adversarial relationship against the traditional classifier.
arXiv Detail & Related papers (2023-05-17T02:13:23Z) - Class-Distribution-Aware Pseudo Labeling for Semi-Supervised Multi-Label
Learning [97.88458953075205]
Pseudo-labeling has emerged as a popular and effective approach for utilizing unlabeled data.
This paper proposes a novel solution called Class-Aware Pseudo-Labeling (CAP) that performs pseudo-labeling in a class-aware manner.
arXiv Detail & Related papers (2023-05-04T12:52:18Z) - Dist-PU: Positive-Unlabeled Learning from a Label Distribution
Perspective [89.5370481649529]
We propose a label distribution perspective for PU learning in this paper.
Motivated by this perspective, we propose to pursue consistency between the predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-12-06T07:38:29Z) - SoLar: Sinkhorn Label Refinery for Imbalanced Partial-Label Learning [31.535219018410707]
Partial-label learning (PLL) is a peculiar weakly-supervised learning task where the training samples are generally associated with a set of candidate labels instead of single ground truth.
We propose SoLar, a novel framework that refines the disambiguated labels towards matching the marginal class prior distribution.
SoLar exhibits substantially superior results on standardized benchmarks compared to previous state-of-the-art methods.
arXiv Detail & Related papers (2022-09-21T14:00:16Z) - Instance-Dependent Partial Label Learning [69.49681837908511]
Partial label learning is a typical weakly supervised learning problem.
Most existing approaches assume that the incorrect labels in each training example are randomly picked as the candidate labels.
In this paper, we consider instance-dependent PLL and assume that each example is associated with a latent label distribution constituted by the real number of each label.
arXiv Detail & Related papers (2021-10-25T12:50:26Z) - Distribution-Aware Semantics-Oriented Pseudo-label for Imbalanced
Semi-Supervised Learning [80.05441565830726]
This paper addresses imbalanced semi-supervised learning, where heavily biased pseudo-labels can harm the model performance.
Motivated by this observation, we propose a general pseudo-labeling framework to address the bias.
We term the novel pseudo-labeling framework for imbalanced SSL as Distribution-Aware Semantics-Oriented (DASO) Pseudo-label.
arXiv Detail & Related papers (2021-06-10T11:58:25Z)
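Several entries above (SoLar in particular) refine pseudo labels towards a marginal class prior. A generic Sinkhorn-style normalization sketch in numpy shows the underlying mechanism: alternately rescale columns so total class mass matches the prior, and rows so each sample remains a distribution. This is a hedged sketch of the general technique, not SoLar's exact algorithm; the function name and iteration count are assumptions.

```python
import numpy as np

def sinkhorn_refine(probs, class_prior, n_iters=50):
    """Generic Sinkhorn-style refinement (illustrative, not SoLar's exact
    algorithm): push a strictly positive prediction matrix towards row sums
    of 1 and column sums matching class_prior * n_samples."""
    Q = probs.copy().astype(float)
    n = Q.shape[0]
    target_col = class_prior * n  # desired mass per class
    for _ in range(n_iters):
        # Rescale columns so class mass matches the marginal prior.
        Q *= target_col / Q.sum(axis=0, keepdims=True)
        # Rescale rows so each sample is again a probability distribution.
        Q /= Q.sum(axis=1, keepdims=True)
    return Q
```

For a strictly positive input matrix the alternating rescaling converges, so the refined pseudo labels approximately respect both the per-sample and the class-marginal constraints.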
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.