Learning from Stochastic Labels
- URL: http://arxiv.org/abs/2302.00299v1
- Date: Wed, 1 Feb 2023 08:04:27 GMT
- Title: Learning from Stochastic Labels
- Authors: Meng Wei, Zhongnian Li, Yong Zhou, Qiaoyu Guo, Xinzheng Xu
- Abstract summary: Annotating multi-class instances is a crucial task in the field of machine learning.
In this paper, we propose a novel approach to learn from these labels.
- Score: 8.178975818137937
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Annotating multi-class instances is a crucial task in the field of machine
learning. Unfortunately, identifying the correct class label from a long
sequence of candidate labels is time-consuming and laborious. To alleviate this
problem, we design a novel labeling mechanism called the stochastic label. In
this setting, a stochastic label covers two cases: 1) a correct class label is
identified from a small number of randomly given candidate labels; 2) the
instance is annotated with a None label when the given labels do not contain
the correct class label. In this paper, we propose a novel approach to learn
from these stochastic labels. We obtain an unbiased estimator that utilizes
the limited supervised information in stochastic labels to train a multi-class
classifier. Additionally, the method is theoretically justified by deriving an
estimation error bound. Finally, we conduct extensive experiments on widely
used benchmark datasets to validate the superiority of our method by comparing
it with existing state-of-the-art methods.
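The two cases of the stochastic-label protocol described above can be sketched in code. This is a minimal illustrative simulation of the annotation process only, assuming uniform random candidate draws; `stochastic_annotate` and its parameters are hypothetical names, and the paper's actual unbiased risk estimator is not reproduced here.

```python
import numpy as np

def stochastic_annotate(y_true, num_classes, k, rng):
    """Simulate the stochastic-label protocol from the abstract:
    the annotator is shown k randomly drawn candidate labels.
    Case 1: the true label is among them, so it is identified.
    Case 2: it is not, so the instance receives a None label and
    every shown candidate is known to be incorrect."""
    candidates = rng.choice(num_classes, size=k, replace=False)
    identified = int(y_true) if y_true in candidates else None
    return identified, candidates

rng = np.random.default_rng(0)
label, shown = stochastic_annotate(y_true=4, num_classes=10, k=3, rng=rng)
# `label` is either 4 (identified) or None; with k=3 of 10 classes the
# identification probability per instance is k / num_classes = 0.3.
```

Note that even a None label carries supervision: the k shown candidates are then all known to be incorrect, which is the weak signal the paper's estimator exploits.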
Related papers
- Improving Multi-Label Contrastive Learning by Leveraging Label Distribution [13.276821681189166]
In multi-label learning, leveraging contrastive learning to learn better representations faces a key challenge: selecting positive and negative samples.
Previous studies selected positive and negative samples based on the overlap between labels and used them for label-wise loss balancing.
We propose a novel method that improves multi-label contrastive learning through label distribution.
arXiv Detail & Related papers (2025-01-31T14:00:02Z)
- Mixed Blessing: Class-Wise Embedding guided Instance-Dependent Partial Label Learning [53.64180787439527]
In partial label learning (PLL), every sample is associated with a candidate label set comprising the ground-truth label and several noisy labels.
For the first time, we create class-wise embeddings for each sample, which allow us to explore the relationship of instance-dependent noisy labels.
To reduce the high label ambiguity, we introduce the concept of class prototypes containing global feature information.
arXiv Detail & Related papers (2024-12-06T13:25:39Z)
- Determined Multi-Label Learning via Similarity-Based Prompt [12.428779617221366]
In multi-label classification, each training instance is associated with multiple class labels simultaneously.
To alleviate this problem, a novel labeling setting termed Determined Multi-Label Learning (DMLL) is proposed.
arXiv Detail & Related papers (2024-03-25T07:08:01Z)
- Towards Imbalanced Large Scale Multi-label Classification with Partially Annotated Labels [8.977819892091]
Multi-label classification is a widely encountered problem in daily life, where an instance can be associated with multiple classes.
In this work, we address the issue of label imbalance and investigate how to train neural networks using partial labels.
arXiv Detail & Related papers (2023-07-31T21:50:48Z)
- Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations [91.67511167969934]
Imprecise label learning (ILL) is a framework that unifies learning with various imprecise label configurations.
We demonstrate that ILL can seamlessly adapt to partial label learning, semi-supervised learning, noisy label learning, and, more importantly, a mixture of these settings.
arXiv Detail & Related papers (2023-05-22T04:50:28Z)
- Class-Distribution-Aware Pseudo Labeling for Semi-Supervised Multi-Label Learning [97.88458953075205]
Pseudo-labeling has emerged as a popular and effective approach for utilizing unlabeled data.
This paper proposes a novel solution called Class-Aware Pseudo-Labeling (CAP) that performs pseudo-labeling in a class-aware manner.
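Class-aware pseudo-labeling can be sketched as per-class selection rather than one global confidence threshold. The following is an assumed reading of the idea, not CAP's published rule: each class pseudo-labels only the top-scoring fraction of samples matching that class's estimated proportion; `class_aware_pseudo_labels` is a hypothetical name.

```python
import numpy as np

def class_aware_pseudo_labels(scores, class_proportions):
    """Illustrative class-aware pseudo-labeling: for each class j,
    mark positive the top-scoring round(p_j * n) unlabeled samples,
    where p_j is the estimated proportion of class j, instead of
    applying a single global threshold.
    `scores` has shape (n_samples, n_classes)."""
    n, c = scores.shape
    pseudo = np.zeros_like(scores, dtype=bool)
    for j in range(c):
        k = int(round(class_proportions[j] * n))
        if k > 0:
            top = np.argsort(scores[:, j])[::-1][:k]  # highest scores first
            pseudo[top, j] = True
    return pseudo
```

The design point is that a single threshold over-selects head classes and starves tail classes; tying the selection budget to each class's estimated proportion keeps the pseudo-label distribution closer to the true one.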
arXiv Detail & Related papers (2023-05-04T12:52:18Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
We propose a label distribution perspective for PU learning in this paper.
Motivated by this, we propose to pursue the label distribution consistency between predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
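The label-distribution consistency idea can be sketched as a simple penalty. This is only an illustration of the stated principle, not Dist-PU's actual objective; `label_dist_consistency` is a hypothetical name.

```python
import numpy as np

def label_dist_consistency(pred_probs, class_prior):
    """Illustrative label-distribution consistency term: in PU learning
    the positive-class prior pi_p is assumed known, so we penalise the
    gap between the mean predicted positive probability over all data
    and that prior."""
    return float((pred_probs.mean() - class_prior) ** 2)
```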
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- Acknowledging the Unknown for Multi-label Learning with Single Positive Labels [65.5889334964149]
Traditionally, all unannotated labels are assumed to be negative in single positive multi-label learning (SPML).
We propose entropy-maximization (EM) loss to maximize the entropy of predicted probabilities for all unannotated labels.
Considering the positive-negative label imbalance of unannotated labels, we propose asymmetric pseudo-labeling (APL) with asymmetric-tolerance strategies and a self-paced procedure to provide more precise supervision.
arXiv Detail & Related papers (2022-03-30T11:43:59Z)
- All Labels Are Not Created Equal: Enhancing Semi-supervision via Label Grouping and Co-training [32.45488147013166]
Pseudo-labeling is a key component in semi-supervised learning (SSL).
We propose SemCo, a method which leverages label semantics and co-training to address this problem.
We show that our method achieves state-of-the-art performance across various SSL tasks including 5.6% accuracy improvement on Mini-ImageNet dataset with 1000 labeled examples.
arXiv Detail & Related papers (2021-04-12T07:33:16Z)
- A Study on the Autoregressive and non-Autoregressive Multi-label Learning [77.11075863067131]
We propose a self-attention-based variational encoder model to jointly extract label-label and label-feature dependencies.
Our model can therefore be used to predict all labels in parallel while still including both label-label and label-feature dependencies.
arXiv Detail & Related papers (2020-12-03T05:41:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.