More Reliable Pseudo-labels, Better Performance: A Generalized Approach to Single Positive Multi-label Learning
- URL: http://arxiv.org/abs/2508.20381v1
- Date: Thu, 28 Aug 2025 03:07:57 GMT
- Title: More Reliable Pseudo-labels, Better Performance: A Generalized Approach to Single Positive Multi-label Learning
- Authors: Luong Tran, Thieu Vo, Anh Nguyen, Sang Dinh, Van Nguyen
- Abstract summary: Multi-label learning is a challenging computer vision task that requires assigning multiple categories to each image. We propose a novel loss function that effectively learns from diverse pseudo-labels while mitigating noise. Our framework significantly advances multi-label classification, achieving state-of-the-art results.
- Score: 12.20473357505283
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Multi-label learning is a challenging computer vision task that requires assigning multiple categories to each image. However, fully annotating large-scale datasets is often impractical due to high costs and effort, motivating the study of learning from partially annotated data. In the extreme case of Single Positive Multi-Label Learning (SPML), each image is provided with only one positive label, while all other labels remain unannotated. Traditional SPML methods that treat missing labels as unknown or negative tend to yield inaccuracies and false negatives, and integrating various pseudo-labeling strategies can introduce additional noise. To address these challenges, we propose the Generalized Pseudo-Label Robust Loss (GPR Loss), a novel loss function that effectively learns from diverse pseudo-labels while mitigating noise. Complementing this, we introduce a simple yet effective Dynamic Augmented Multi-focus Pseudo-labeling (DAMP) technique. Together, these contributions form the Adaptive and Efficient Vision-Language Pseudo-Labeling (AEVLP) framework. Extensive experiments on four benchmark datasets demonstrate that our framework significantly advances multi-label classification, achieving state-of-the-art results.
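To make the failure mode the abstract describes concrete, here is a minimal sketch of the naive "assume negative" binary cross-entropy baseline for SPML, in which every unannotated label is treated as a hard negative. This is an illustration of the critiqued baseline, not the paper's GPR Loss; the function name and probability values are assumptions for illustration.

```python
import math

def assume_negative_bce(probs, positive_idx):
    """Binary cross-entropy where the single observed positive is 1 and
    every unannotated label is treated as a hard negative (0).  A true
    but unobserved positive is then penalized, producing exactly the
    false negatives the abstract describes."""
    loss = 0.0
    for k, p in enumerate(probs):
        if k == positive_idx:
            loss += -math.log(p)          # observed positive label
        else:
            loss += -math.log(1.0 - p)    # unannotated, assumed negative
    return loss / len(probs)

# Class 1 is actually present but unannotated; the model's (correct)
# confidence p=0.8 on it is punished by the assume-negative term.
loss = assume_negative_bce([0.9, 0.8, 0.1, 0.05], positive_idx=0)
```

Note how most of the loss here comes from the `-log(1 - 0.8)` term on the unobserved true positive, which is the noise source that pseudo-labeling strategies (and the GPR Loss) aim to mitigate.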
Related papers
- Multi-Label Contrastive Learning: A Comprehensive Study [48.81069245141415]
Multi-label classification has emerged as a key area in both research and industry. Applying contrastive learning to multi-label classification presents unique challenges. We conduct an in-depth study of contrastive learning loss for multi-label classification across diverse settings.
arXiv Detail & Related papers (2024-11-27T20:20:06Z)
- Virtual Category Learning: A Semi-Supervised Learning Method for Dense Prediction with Extremely Limited Labels [63.16824565919966]
This paper proposes to use confusing samples proactively without label correction.
A Virtual Category (VC) is assigned to each confusing sample in such a way that it can safely contribute to the model optimisation.
Our intriguing findings highlight the usage of VC learning in dense vision tasks.
arXiv Detail & Related papers (2023-12-02T16:23:52Z)
- Vision-Language Pseudo-Labels for Single-Positive Multi-Label Learning [11.489541220229798]
In general multi-label learning, a model learns to predict multiple labels or categories for a single input image.
This is in contrast with standard multi-class image classification, where the task is predicting a single label from many possible labels for an image.
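The contrast between the two settings can be sketched with target encodings. The class names and the `None`-for-unannotated convention below are illustrative assumptions, not taken from the paper:

```python
# Multi-class classification: exactly one label per image,
# so the target is a single class index.
multiclass_target = 2                # e.g. "dog" out of N classes

# Multi-label classification: a binary vector, any number of ones.
multilabel_target = [1, 0, 1, 1, 0]  # e.g. {person, dog, car} present

# Single Positive Multi-Label (SPML): only ONE positive is revealed;
# every other entry is unannotated (encoded here as None).
spml_target = [1, None, None, None, None]

assert sum(multilabel_target) >= 1   # multi-label: one or more positives
assert spml_target.count(1) == 1     # SPML: exactly one observed positive
```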
arXiv Detail & Related papers (2023-10-24T16:36:51Z)
- Deep Partial Multi-Label Learning with Graph Disambiguation [27.908565535292723]
We propose a novel deep Partial multi-Label model with grAph-disambIguatioN (PLAIN).
Specifically, we introduce the instance-level and label-level similarities to recover label confidences.
At each training epoch, labels are propagated on the instance and label graphs to produce relatively accurate pseudo-labels.
arXiv Detail & Related papers (2023-05-10T04:02:08Z)
- Class-Distribution-Aware Pseudo Labeling for Semi-Supervised Multi-Label Learning [97.88458953075205]
Pseudo-labeling has emerged as a popular and effective approach for utilizing unlabeled data.
This paper proposes a novel solution called Class-Aware Pseudo-Labeling (CAP) that performs pseudo-labeling in a class-aware manner.
arXiv Detail & Related papers (2023-05-04T12:52:18Z)
- Reliable Representation Learning for Incomplete Multi-View Missing Multi-Label Classification [78.15629210659516]
In this paper, we propose an incomplete multi-view missing multi-label classification network named RANK. We break through the view-level weights inherent in existing methods and propose a quality-aware sub-network to dynamically assign quality scores to each view of each sample. Our model is not only able to handle complete multi-view multi-label data, but also works on datasets with missing instances and labels.
arXiv Detail & Related papers (2023-03-30T03:09:25Z)
- An Effective Approach for Multi-label Classification with Missing Labels [8.470008570115146]
We propose a pseudo-label based approach to reduce the cost of annotation without bringing additional complexity to the classification networks.
By designing a novel loss function, we are able to relax the requirement that each instance must contain at least one positive label.
We show that our method can handle the imbalance between positive labels and negative labels, while still outperforming existing missing-label learning approaches.
arXiv Detail & Related papers (2022-10-24T23:13:57Z)
- One Positive Label is Sufficient: Single-Positive Multi-Label Learning with Label Enhancement [71.9401831465908]
We investigate single-positive multi-label learning (SPMLL) where each example is annotated with only one relevant label.
A novel method named Single-positive MultI-label learning with Label Enhancement is proposed.
Experiments on benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-06-01T14:26:30Z)
- Acknowledging the Unknown for Multi-label Learning with Single Positive Labels [65.5889334964149]
Traditionally, all unannotated labels are assumed to be negative in single positive multi-label learning (SPML).
We propose entropy-maximization (EM) loss to maximize the entropy of predicted probabilities for all unannotated labels.
Considering the positive-negative label imbalance of unannotated labels, we propose asymmetric pseudo-labeling (APL) with asymmetric-tolerance strategies and a self-paced procedure to provide more precise supervision.
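The entropy-maximization idea stated above can be sketched as follows, assuming per-label sigmoid probabilities. This illustrates the stated principle only, not the authors' exact EM loss:

```python
import math

def entropy_maximization_loss(probs, annotated_idx):
    """Sketch of the EM idea: for each unannotated label, minimize the
    NEGATIVE binary entropy of its predicted probability, i.e. push the
    model to stay maximally uncertain rather than assume 'negative'."""
    def binary_entropy(p):
        p = min(max(p, 1e-7), 1.0 - 1e-7)  # numerical safety
        return -(p * math.log(p) + (1 - p) * math.log(1 - p))

    unannotated = [p for k, p in enumerate(probs) if k != annotated_idx]
    return -sum(binary_entropy(p) for p in unannotated) / len(unannotated)
```

Binary entropy peaks at p = 0.5, so this loss is minimized (at -log 2) when all unannotated predictions sit at 0.5; confident predictions on unannotated labels, in either direction, raise the loss.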
arXiv Detail & Related papers (2022-03-30T11:43:59Z)
- Incomplete Multi-View Weak-Label Learning with Noisy Features and Imbalanced Labels [4.800187500079582]
We propose a novel method to overcome the limitations of incomplete multi-view weak-label learning.
It embeds incomplete views and weak labels into a low-dimensional subspace with adaptive weights.
It adaptively learns view-wise importance for embedding to detect noisy views, and mitigates the label imbalance problem by focal loss.
arXiv Detail & Related papers (2022-01-04T10:49:30Z)
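The focal loss used for imbalance mitigation in the last summary is a standard technique (Lin et al.). A minimal binary sketch, with the commonly used default hyperparameters assumed for illustration:

```python
import math

def focal_loss(p, target, gamma=2.0, alpha=0.25):
    """Binary focal loss: the (1 - p_t)^gamma factor down-weights easy,
    well-classified examples so that rare positive labels are not
    drowned out by abundant easy negatives -- the imbalance-mitigation
    role described in the summary above."""
    p = min(max(p, 1e-7), 1.0 - 1e-7)        # numerical safety
    p_t = p if target == 1 else 1.0 - p      # probability of the true class
    weight = alpha if target == 1 else 1.0 - alpha
    return -weight * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy, confident negative contributes far less than a hard positive:
easy = focal_loss(0.01, target=0)   # well-classified negative
hard = focal_loss(0.10, target=1)   # badly-missed positive
assert hard > 100 * easy
```

With gamma = 0 and alpha = 0.5 this reduces to (half of) ordinary binary cross-entropy; increasing gamma sharpens the focus on hard examples.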
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.