Prompt-driven efficient Open-set Semi-supervised Learning
- URL: http://arxiv.org/abs/2209.14205v1
- Date: Wed, 28 Sep 2022 16:25:08 GMT
- Title: Prompt-driven efficient Open-set Semi-supervised Learning
- Authors: Haoran Li, Chun-Mei Feng, Tao Zhou, Yong Xu and Xiaojun Chang
- Abstract summary: Open-set semi-supervised learning (OSSL), which investigates a more practical scenario where out-of-distribution (OOD) samples are contained only in the unlabeled data, has attracted growing interest.
We propose a prompt-driven efficient OSSL framework, called OpenPrompt, which propagates class information from labeled to unlabeled data with only a small number of trainable parameters.
- Score: 52.30303262499391
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Open-set semi-supervised learning (OSSL) has attracted growing interest;
it investigates a more practical scenario where out-of-distribution (OOD)
samples are contained only in the unlabeled data. Existing OSSL methods such as
OpenMatch learn an OOD detector to identify outliers, and they often update all
model parameters (i.e., full fine-tuning) to propagate class information from
labeled to unlabeled data. Recently, prompt learning has been developed to
bridge the gap between pre-training and fine-tuning, and it shows higher
computational efficiency on several downstream tasks. In this paper, we propose
a prompt-driven efficient OSSL framework, called OpenPrompt, which can
propagate class information from labeled to unlabeled data with only a small
number of trainable parameters. We propose a prompt-driven joint space learning
mechanism that detects OOD data by maximizing the distribution gap between ID
and OOD samples in the unlabeled data, which enables outliers to be detected in
a new way. Experimental results on three public datasets show that OpenPrompt
outperforms state-of-the-art methods with less than 1% of the trainable
parameters. More importantly, OpenPrompt achieves a 4% improvement in AUROC for
outlier detection over a fully supervised model on CIFAR10.
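The abstract gives no code, but its two main ingredients (a small set of trainable prompt parameters on an otherwise frozen backbone, and an objective that widens the distribution gap between ID and OOD scores on unlabeled data) can be sketched. Below is a minimal PyTorch sketch under stated assumptions: PromptedEncoder, gap_loss, the median-split heuristic for separating likely-ID from likely-OOD samples, and all dimensions are hypothetical illustrations, not the paper's actual implementation.

```python
# Minimal sketch of prompt-driven tuning with a distribution-gap objective.
# All names (PromptedEncoder, gap_loss) and the median-split heuristic are
# hypothetical illustrations, not the paper's actual method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PromptedEncoder(nn.Module):
    """Frozen transformer encoder steered by a few trainable prompt tokens."""
    def __init__(self, dim=64, n_prompts=4, n_classes=6):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        for p in self.backbone.parameters():
            p.requires_grad = False                 # backbone stays frozen
        self.prompts = nn.Parameter(torch.randn(1, n_prompts, dim) * 0.02)
        self.head = nn.Linear(dim, n_classes)       # prompts + head train

    def forward(self, tokens):                      # tokens: (B, T, dim)
        b = tokens.size(0)
        x = torch.cat([self.prompts.expand(b, -1, -1), tokens], dim=1)
        x = self.backbone(x)
        return self.head(x.mean(dim=1))             # pooled logits

def gap_loss(logits_unlabeled, margin=1.0):
    """Hypothetical surrogate: widen the gap between confident (likely ID)
    and unconfident (likely OOD) unlabeled samples via max-softmax scores."""
    scores = F.softmax(logits_unlabeled, dim=-1).max(dim=-1).values
    med = scores.median()
    id_like, ood_like = scores[scores >= med], scores[scores < med]
    return F.relu(margin - (id_like.mean() - ood_like.mean()))

model = PromptedEncoder()
opt = torch.optim.AdamW([p for p in model.parameters() if p.requires_grad], lr=1e-3)

x_lab, y_lab = torch.randn(8, 16, 64), torch.randint(0, 6, (8,))  # toy batches
x_unlab = torch.randn(32, 16, 64)

opt.zero_grad()
loss = F.cross_entropy(model(x_lab), y_lab) + gap_loss(model(x_unlab))
loss.backward()
opt.step()

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable fraction: {trainable / total:.2%}")
```

The final print is the point of the design: only the prompts and the classification head receive gradients, so the trainable fraction stays far below that of full fine-tuning.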
Related papers
- SCOMatch: Alleviating Overtrusting in Open-set Semi-supervised Learning [25.508200663171625]
Open-set semi-supervised learning (OSSL) uses practical open-set unlabeled data.
Prior OSSL methods suffer from the tendency to overtrust the labeled ID data.
We propose SCOMatch, a novel OSSL method that treats OOD samples as an additional class, forming a new SSL process.
arXiv Detail & Related papers (2024-09-26T03:47:34Z)
- Robust Semi-supervised Learning by Wisely Leveraging Open-set Data [48.67897991121204]
Open-set Semi-supervised Learning (OSSL) considers a realistic setting in which unlabeled data may come from classes unseen in the labeled set.
We propose Wise Open-set Semi-supervised Learning (WiseOpen), a generic OSSL framework that selectively leverages the open-set data for training the model.
arXiv Detail & Related papers (2024-05-11T10:22:32Z)
- LORD: Leveraging Open-Set Recognition with Unknown Data [10.200937444995944]
LORD is a framework to Leverage Open-set Recognition by exploiting unknown data.
We identify three model-agnostic training strategies that exploit background data and apply them to well-established classifiers.
arXiv Detail & Related papers (2023-08-24T06:12:41Z)
- Towards General and Efficient Active Learning [20.888364610175987]
Active learning aims to select the most informative samples to exploit limited annotation budgets.
We propose a novel general and efficient active learning (GEAL) method in this paper.
Our method can perform data selection on different datasets with a single-pass inference of the same model.
arXiv Detail & Related papers (2021-12-15T08:35:28Z)
- Trash to Treasure: Harvesting OOD Data with Cross-Modal Matching for Open-Set Semi-Supervised Learning [101.28281124670647]
Open-set semi-supervised learning (open-set SSL) investigates a challenging but practical scenario where out-of-distribution (OOD) samples are contained in the unlabeled data.
We propose a novel training mechanism that effectively exploits the presence of OOD data for enhanced feature learning.
Our approach substantially lifts the performance on open-set SSL and outperforms the state-of-the-art by a large margin.
arXiv Detail & Related papers (2021-08-12T09:14:44Z)
- OpenCoS: Contrastive Semi-supervised Learning for Handling Open-set Unlabeled Data [65.19205979542305]
Unlabeled data may include out-of-class samples in practice.
OpenCoS is a method for handling this realistic semi-supervised learning scenario.
arXiv Detail & Related papers (2021-06-29T06:10:05Z)
- OpenMatch: Open-set Consistency Regularization for Semi-supervised Learning with Outliers [71.08167292329028]
We propose a novel Open-set Semi-Supervised Learning (OSSL) approach called OpenMatch.
OpenMatch unifies FixMatch with novelty detection based on one-vs-all (OVA) classifiers (a minimal sketch of OVA-style scoring appears after this list).
It achieves state-of-the-art performance on three datasets, and even outperforms a fully supervised model in detecting outliers unseen in unlabeled data on CIFAR10.
arXiv Detail & Related papers (2021-05-28T23:57:15Z)
- Multi-Task Curriculum Framework for Open-Set Semi-Supervised Learning [54.85397562961903]
Semi-supervised learning (SSL) has been proposed to leverage unlabeled data for training powerful models when only limited labeled data is available.
We address a more complex novel scenario named open-set SSL, where out-of-distribution (OOD) samples are contained in unlabeled data.
Our method achieves state-of-the-art results by successfully eliminating the effect of OOD samples.
arXiv Detail & Related papers (2020-07-22T10:33:55Z)
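For the OVA-based detection referenced in the OpenMatch entry above: each known class gets its own binary in/out classifier, and a sample is scored as an outlier when the OVA classifier of its predicted class assigns it a low inlier probability. The sketch below is a hypothetical minimal PyTorch rendering of that idea; OVAHead, outlier_score, and the feature shapes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of one-vs-all (OVA) novelty scoring in the spirit of
# OpenMatch; names and shapes here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class OVAHead(nn.Module):
    """One binary (in vs. out) classifier per known class."""
    def __init__(self, dim, n_classes):
        super().__init__()
        self.fc = nn.Linear(dim, 2 * n_classes)    # 2 logits per class

    def forward(self, feats):                      # feats: (B, dim)
        logits = self.fc(feats).view(feats.size(0), 2, -1)
        return F.softmax(logits, dim=1)            # p(in)/p(out) per class

def outlier_score(p_ova, class_probs):
    """Score a sample via the inlier probability of its most likely class:
    low p(in) for the predicted class => more outlier-like."""
    pred = class_probs.argmax(dim=-1)              # (B,) predicted classes
    p_in = p_ova[torch.arange(p_ova.size(0)), 0, pred]
    return 1.0 - p_in                              # higher = more OOD-like

feats = torch.randn(16, 128)                       # stand-in backbone features
class_probs = F.softmax(torch.randn(16, 10), dim=-1)
head = OVAHead(128, 10)
scores = outlier_score(head(feats), class_probs)
print(scores.shape)  # torch.Size([16])
```

In OpenMatch itself this detector is trained jointly with FixMatch-style consistency regularization; the sketch shows only the scoring path.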
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.