Multi-Complementary and Unlabeled Learning for Arbitrary Losses and Models
- URL: http://arxiv.org/abs/2001.04243v3
- Date: Thu, 23 Jul 2020 17:18:42 GMT
- Title: Multi-Complementary and Unlabeled Learning for Arbitrary Losses and Models
- Authors: Yuzhou Cao, Shuqi Liu and Yitian Xu
- Abstract summary: We propose a novel multi-complementary and unlabeled learning framework.
We first give an unbiased estimator of the classification risk from samples with multiple complementary labels.
We then further improve the estimator by incorporating unlabeled samples into the risk formulation.
- Score: 6.177038245239757
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A weakly-supervised learning framework named complementary-label learning
has been proposed recently, where each sample is equipped with a single
complementary label that denotes one of the classes the sample does not belong
to. However, existing complementary-label learning methods cannot learn from
easily accessible unlabeled samples or from samples with multiple complementary
labels, both of which are more informative. In this paper, to remove these
limitations, we propose a novel multi-complementary and unlabeled learning
framework that allows unbiased estimation of the classification risk from
samples with any number of complementary labels and from unlabeled samples, for
arbitrary loss functions and models. We first give an unbiased estimator of the
classification risk from samples with multiple complementary labels, and then
further improve the estimator by incorporating unlabeled samples into the risk
formulation. The estimation error bounds show that the proposed estimators
achieve the optimal parametric convergence rate. Finally, experiments on both
linear and deep models demonstrate the effectiveness of our methods.
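To make the risk-rewriting idea concrete, the sketch below implements the classical unbiased estimator for the simplest special case: a single complementary label drawn uniformly from the K-1 wrong classes, where the rewritten risk is R(f) = E[ sum_k loss(f(x), k) - (K-1) * loss(f(x), ybar) ]. This is an illustration of the rewriting principle, not the paper's multi-complementary estimator, which further handles any number of complementary labels and unlabeled samples; the function and variable names are illustrative.

```python
import numpy as np

def single_cl_risk(per_class_losses, comp_labels):
    """Unbiased risk estimate from single complementary labels, assuming each
    complementary label is drawn uniformly from the K-1 wrong classes:
      R_hat = mean_i [ sum_k loss(f(x_i), k) - (K - 1) * loss(f(x_i), ybar_i) ].

    per_class_losses: (n, K) array, entry [i, k] = loss(f(x_i), k).
    comp_labels:      (n,) array of complementary labels ybar_i in {0, ..., K-1}.
    """
    n, K = per_class_losses.shape
    total = per_class_losses.sum(axis=1)                # sum of losses over all classes
    comp = per_class_losses[np.arange(n), comp_labels]  # loss at the complementary label
    return float(np.mean(total - (K - 1) * comp))


# Toy usage: softmax cross-entropy per-class losses for K = 4 classes.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(8, 4))
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    losses = -np.log(probs)              # loss(f(x_i), k) = -log p_k(x_i)
    ybar = rng.integers(0, 4, size=8)    # observed complementary labels
    print(single_cl_risk(losses, ybar))
```

Like other rewritten risk estimators, this quantity can be negative on finite samples; the paper's formulation with multiple complementary labels and unlabeled data refines the estimate while keeping it unbiased.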
Related papers
- AllMatch: Exploiting All Unlabeled Data for Semi-Supervised Learning [5.0823084858349485]
We present a novel SSL algorithm named AllMatch, which achieves improved pseudo-label accuracy and a 100% utilization ratio for the unlabeled data.
The results demonstrate that AllMatch consistently outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2024-06-22T06:59:52Z)
- Learning with Complementary Labels Revisited: The Selected-Completely-at-Random Setting Is More Practical [66.57396042747706]
Complementary-label learning is a weakly supervised learning problem.
We propose a consistent approach that does not rely on the uniform distribution assumption.
We find that complementary-label learning can be expressed as a set of negative-unlabeled binary classification problems (see the sketch after this list).
arXiv Detail & Related papers (2023-11-27T02:59:17Z)
- Class-Distribution-Aware Pseudo Labeling for Semi-Supervised Multi-Label Learning [97.88458953075205]
Pseudo-labeling has emerged as a popular and effective approach for utilizing unlabeled data.
This paper proposes a novel solution called Class-Aware Pseudo-Labeling (CAP) that performs pseudo-labeling in a class-aware manner.
arXiv Detail & Related papers (2023-05-04T12:52:18Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
We take a label distribution perspective on PU learning in this paper, and propose to pursue consistency between the predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- One Positive Label is Sufficient: Single-Positive Multi-Label Learning with Label Enhancement [71.9401831465908]
We investigate single-positive multi-label learning (SPMLL) where each example is annotated with only one relevant label.
A novel method, Single-positive MultI-label learning with Label Enhancement, is proposed.
Experiments on benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-06-01T14:26:30Z)
- Learning with Proper Partial Labels [87.65718705642819]
Partial-label learning is a kind of weakly-supervised learning with inexact labels.
We introduce a proper partial-label learning framework and show that it includes many previous partial-label learning settings.
We then derive a unified unbiased estimator of the classification risk.
arXiv Detail & Related papers (2021-12-23T01:37:03Z)
- Minimax Active Learning [61.729667575374606]
Active learning aims to develop label-efficient algorithms by querying the most representative samples to be labeled by a human annotator.
Current active learning techniques either rely on model uncertainty to select the most uncertain samples or use clustering or reconstruction to choose the most diverse set of unlabeled examples.
We develop a semi-supervised minimax entropy-based active learning algorithm that leverages both uncertainty and diversity in an adversarial manner.
arXiv Detail & Related papers (2020-12-18T19:03:40Z)
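As a companion to the "Learning with Complementary Labels Revisited" entry above, the sketch below illustrates the negative-unlabeled decomposition it mentions: for each class k, samples whose complementary label equals k are known negatives of class k, all samples together serve as the unlabeled set, and an unbiased binary risk is formed given a class prior pi_k. This is a hedged illustration of the general negative-unlabeled rewriting, not that paper's exact estimator; the names and the margin-loss interface are assumptions made for the example.

```python
import numpy as np

def nu_risk(scores_unlabeled, scores_negative, prior, loss):
    """Unbiased negative-unlabeled (NU) risk for one binary sub-problem
    (class k vs. the rest), using the rewriting
      R = E_U[l(g(x), +1)] + (1 - pi_k) * (E_N[l(g(x), -1)] - E_N[l(g(x), +1)]).

    scores_unlabeled: scores g(x) on all samples (unlabeled w.r.t. class k).
    scores_negative:  scores g(x) on samples whose complementary label is k.
    prior:            class prior pi_k = P(y = k), assumed known or estimated.
    loss:             margin loss l(z, t) with target t in {+1, -1}.
    """
    risk_u_pos = np.mean(loss(scores_unlabeled, +1))
    risk_n_pos = np.mean(loss(scores_negative, +1))
    risk_n_neg = np.mean(loss(scores_negative, -1))
    return risk_u_pos + (1.0 - prior) * (risk_n_neg - risk_n_pos)


# Toy usage with the logistic loss l(z, t) = log(1 + exp(-t * z)).
if __name__ == "__main__":
    logistic = lambda z, t: np.log1p(np.exp(-t * z))
    rng = np.random.default_rng(1)
    print(nu_risk(rng.normal(size=100), rng.normal(size=30), prior=0.25, loss=logistic))
```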
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.