Self-Supervised Prototypical Transfer Learning for Few-Shot
Classification
- URL: http://arxiv.org/abs/2006.11325v1
- Date: Fri, 19 Jun 2020 19:00:11 GMT
- Title: Self-Supervised Prototypical Transfer Learning for Few-Shot
Classification
- Authors: Carlos Medina, Arnout Devos, Matthias Grossglauser
- Abstract summary: Our self-supervised transfer learning approach ProtoTransfer outperforms state-of-the-art unsupervised meta-learning methods on few-shot tasks.
In few-shot experiments with domain shift, our approach even has comparable performance to supervised methods, but requires orders of magnitude fewer labels.
- Score: 11.96734018295146
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most approaches in few-shot learning rely on costly annotated data related to
the goal task domain during (pre-)training. Recently, unsupervised
meta-learning methods have exchanged the annotation requirement for a reduction
in few-shot classification performance. Simultaneously, in settings with
realistic domain shift, common transfer learning has been shown to outperform
supervised meta-learning. Building on these insights and on advances in
self-supervised learning, we propose a transfer learning approach which
constructs a metric embedding that clusters unlabeled prototypical samples and
their augmentations closely together. This pre-trained embedding is a starting
point for few-shot classification by summarizing class clusters and
fine-tuning. We demonstrate that our self-supervised prototypical transfer
learning approach ProtoTransfer outperforms state-of-the-art unsupervised
meta-learning methods on few-shot tasks from the mini-ImageNet dataset. In
few-shot experiments with domain shift, our approach even has comparable
performance to supervised methods, but requires orders of magnitude fewer
labels.
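To make the pre-training objective more concrete, below is a minimal sketch of the kind of loss the abstract describes, assuming a formulation in which every unlabeled image acts as its own prototype and its augmented views are embedded close to it and away from the other prototypes in the batch; the encoder interface, number of views, and distance-based readout are illustrative assumptions, not the authors' implementation. The second function shows the few-shot step of summarizing class clusters by support means before fine-tuning.

```python
# Minimal sketch (not the authors' code) of self-supervised prototypical pre-training:
# each unlabeled image is its own prototype; its augmented views must embed close to it.
import torch
import torch.nn.functional as F

def proto_pretrain_loss(encoder, images, augment, n_views=3):
    """images: (B, C, H, W) unlabeled batch; augment: assumed stochastic batch transform."""
    prototypes = encoder(images)                                     # (B, D): one prototype per image
    views = torch.stack([augment(images) for _ in range(n_views)])   # (V, B, C, H, W)
    queries = encoder(views.flatten(0, 1))                           # (V*B, D)
    dists = torch.cdist(queries, prototypes) ** 2                    # squared Euclidean distances
    labels = torch.arange(images.size(0), device=images.device).repeat(n_views)
    return F.cross_entropy(-dists, labels)                           # pull each view to its own prototype

def few_shot_logits(encoder, support_x, support_y, query_x, n_way):
    """Summarize each class by its mean support embedding, then classify queries by distance."""
    z_s, z_q = encoder(support_x), encoder(query_x)
    prototypes = torch.stack([z_s[support_y == c].mean(0) for c in range(n_way)])
    return -torch.cdist(z_q, prototypes) ** 2
```

In practice one would iterate `proto_pretrain_loss` over an unlabeled dataloader to train the encoder, then apply `few_shot_logits` (optionally with fine-tuning) to each downstream episode.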
Related papers
- GenCo: An Auxiliary Generator from Contrastive Learning for Enhanced
Few-Shot Learning in Remote Sensing [9.504503675097137]
We introduce a generator-based contrastive learning framework (GenCo) that pre-trains backbones and simultaneously explores variants of feature samples.
In fine-tuning, the auxiliary generator can be used to enrich limited labeled data samples in feature space.
We demonstrate the effectiveness of our method in improving few-shot learning performance on two key remote sensing datasets.
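As a rough illustration of how an auxiliary generator might "enrich limited labeled data samples in feature space", the sketch below uses a small hypothetical residual generator to produce extra variants of each labeled feature; the architecture, noise dimension, and function names are assumptions, not GenCo's actual design.

```python
# Illustrative sketch (not the GenCo code): a small generator perturbs labeled feature
# vectors to create extra synthetic samples for fine-tuning a few-shot classifier.
import torch
import torch.nn as nn

class FeatureGenerator(nn.Module):
    """Hypothetical generator: maps a labeled feature plus noise to a nearby variant."""
    def __init__(self, dim, noise_dim=32):
        super().__init__()
        self.noise_dim = noise_dim
        self.net = nn.Sequential(nn.Linear(dim + noise_dim, dim),
                                 nn.ReLU(),
                                 nn.Linear(dim, dim))

    def forward(self, feats):
        noise = torch.randn(feats.size(0), self.noise_dim, device=feats.device)
        return feats + self.net(torch.cat([feats, noise], dim=1))   # residual feature variant

def enrich_features(feats, labels, generator, n_extra=4):
    """Append n_extra generated variants of every labeled feature vector."""
    extra = torch.cat([generator(feats) for _ in range(n_extra)])
    return torch.cat([feats, extra]), torch.cat([labels, labels.repeat(n_extra)])
```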
arXiv Detail & Related papers (2023-07-27T03:59:19Z) - Unsupervised Meta-Learning via Few-shot Pseudo-supervised Contrastive
Learning [72.3506897990639]
We propose a simple yet effective unsupervised meta-learning framework, coined Pseudo-supervised Contrast (PsCo) for few-shot classification.
PsCo outperforms existing unsupervised meta-learning methods under various in-domain and cross-domain few-shot classification benchmarks.
arXiv Detail & Related papers (2023-03-02T06:10:13Z) - Simple Control Baselines for Evaluating Transfer Learning [1.0499611180329802]
We share an evaluation standard that aims to quantify and communicate transfer learning performance.
We provide an example empirical study investigating a few basic questions about self-supervised learning.
arXiv Detail & Related papers (2022-02-07T17:26:26Z) - Trainable Class Prototypes for Few-Shot Learning [5.481942307939029]
We propose trainable prototypes for distance measurement, instead of artificial ones, within the meta-training and task-training framework.
Also, to avoid the disadvantages brought by episodic meta-training, we adopt non-episodic meta-training based on self-supervised learning.
Our method achieves state-of-the-art performance in a variety of established few-shot tasks on the standard few-shot visual classification dataset.
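A minimal sketch of the trainable-prototype idea under stated assumptions: prototypes are initialized from support-set class means and then refined by gradient descent against a distance-based classifier on the support set, with the encoder kept frozen. The optimizer, step count, and frozen-encoder choice are illustrative, not the paper's exact procedure.

```python
# Sketch (assumed formulation, not the paper's code): prototypes are learnable
# parameters refined on the support set, rather than fixed class means.
import torch
import torch.nn.functional as F

def fit_trainable_prototypes(encoder, support_x, support_y, n_way, steps=50, lr=1e-2):
    with torch.no_grad():
        z = encoder(support_x)                                    # frozen support embeddings (N, D)
    protos = torch.stack([z[support_y == c].mean(0) for c in range(n_way)])
    protos = protos.clone().requires_grad_(True)                  # prototypes become trainable
    optimizer = torch.optim.SGD([protos], lr=lr)
    for _ in range(steps):
        logits = -torch.cdist(z, protos) ** 2                     # distance-based classifier
        loss = F.cross_entropy(logits, support_y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return protos.detach()
```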
arXiv Detail & Related papers (2021-06-21T04:19:56Z) - Few-shot Action Recognition with Prototype-centered Attentive Learning [88.10852114988829]
We propose a Prototype-centered Attentive Learning (PAL) model composed of two novel components.
First, a prototype-centered contrastive learning loss is introduced to complement the conventional query-centered learning objective.
Second, PAL integrates an attentive hybrid learning mechanism that can minimize the negative impacts of outliers.
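One hedged reading of a prototype-centered contrastive loss, complementing the usual query-centered objective: each prototype acts as the anchor and the softmax runs over queries instead of over classes. The exact form, normalization, and temperature below are assumptions rather than the PAL formulation.

```python
# Rough sketch (assumed form, not the PAL authors' code) of a prototype-centered
# contrastive term: each prototype is pulled toward the queries of its own class.
import torch
import torch.nn.functional as F

def prototype_centered_loss(prototypes, query_z, query_y, temperature=0.1):
    """prototypes: (C, D); query_z: (Q, D); query_y: (Q,) class ids in [0, C).
    Assumes every class has at least one query in the episode."""
    sims = F.normalize(prototypes, dim=1) @ F.normalize(query_z, dim=1).T / temperature  # (C, Q)
    loss = 0.0
    for c in range(prototypes.size(0)):
        own = query_y == c                         # queries belonging to prototype c
        log_p = sims[c].log_softmax(dim=0)         # distribution over all queries
        loss = loss - log_p[own].mean()            # raise probability of own-class queries
    return loss / prototypes.size(0)
```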
arXiv Detail & Related papers (2021-01-20T11:48:12Z) - A Survey on Contrastive Self-supervised Learning [0.0]
Self-supervised learning has gained popularity because of its ability to avoid the cost of annotating large-scale datasets.
Contrastive learning has recently become a dominant component in self-supervised learning methods for computer vision, natural language processing (NLP), and other domains.
This paper provides an extensive review of self-supervised methods that follow the contrastive approach.
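For readers unfamiliar with the contrastive approach the survey reviews, here is a generic InfoNCE-style sketch: two augmented views of the same image are pulled together while the other images in the batch serve as negatives. The symmetric form and temperature are common choices, not specifics taken from the survey.

```python
# Generic contrastive (InfoNCE-style) objective: positives are the two views of the
# same image; all other images in the batch act as negatives.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.5):
    """z1, z2: (B, D) embeddings of two augmented views of the same B images."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / temperature                 # (B, B) cross-view similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Diagonal entries are the positive pairs; average the loss over both view directions.
    return (F.cross_entropy(logits, targets) + F.cross_entropy(logits.T, targets)) / 2
```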
arXiv Detail & Related papers (2020-10-31T21:05:04Z) - Unsupervised Transfer Learning for Spatiotemporal Predictive Networks [90.67309545798224]
We study how to transfer knowledge from a zoo of models learned without supervision towards another network.
Our motivation is that models are expected to understand complex dynamics from different sources.
Our approach yields significant improvements on three benchmarks for spatiotemporal prediction, and benefits the target task even from less relevant pre-trained sources.
arXiv Detail & Related papers (2020-09-24T15:40:55Z) - Towards Cross-Granularity Few-Shot Learning: Coarse-to-Fine
Pseudo-Labeling with Visual-Semantic Meta-Embedding [13.063136901934865]
Few-shot learning aims at rapidly adapting to novel categories with only a handful of samples at test time.
In this paper, we advance the few-shot classification paradigm towards a more challenging scenario, i.e., cross-granularity few-shot classification.
We approximate the fine-grained data distribution by greedy clustering of each coarse-class into pseudo-fine-classes according to the similarity of image embeddings.
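A simplified sketch of that pseudo-labeling step: embeddings within each coarse class are clustered and the resulting clusters become pseudo-fine classes. K-means is used here as a stand-in for the paper's greedy clustering, and the number of clusters per coarse class is an arbitrary assumption.

```python
# Illustrative sketch (not the paper's exact procedure): split each coarse class into
# pseudo-fine classes by clustering its image embeddings.
import numpy as np
from sklearn.cluster import KMeans

def pseudo_fine_labels(embeddings, coarse_labels, fine_per_coarse=5, seed=0):
    """embeddings: (N, D) array; coarse_labels: (N,) ints. Returns (N,) pseudo-fine class ids."""
    pseudo = np.zeros(len(coarse_labels), dtype=int)
    next_id = 0
    for c in np.unique(coarse_labels):
        idx = np.where(coarse_labels == c)[0]
        k = min(fine_per_coarse, len(idx))            # cannot have more clusters than samples
        clusters = KMeans(n_clusters=k, random_state=seed).fit_predict(embeddings[idx])
        pseudo[idx] = clusters + next_id              # give each coarse class its own id range
        next_id += k
    return pseudo
```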
arXiv Detail & Related papers (2020-07-11T03:44:21Z) - UniT: Unified Knowledge Transfer for Any-shot Object Detection and
Segmentation [52.487469544343305]
Methods for object detection and segmentation rely on large-scale instance-level annotations for training.
We propose an intuitive and unified semi-supervised model that is applicable to a range of supervision levels.
arXiv Detail & Related papers (2020-06-12T22:45:47Z) - Learning Diverse Representations for Fast Adaptation to Distribution
Shift [78.83747601814669]
We present a method for learning multiple models, incorporating an objective that pressures each to learn a distinct way to solve the task.
We demonstrate our framework's ability to facilitate rapid adaptation to distribution shift.
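One hedged way to read "an objective that pressures each to learn a distinct way to solve the task" is a shared task loss plus a penalty on pairwise agreement between the models' predictive distributions; the sketch below follows that reading, which is an assumption rather than the paper's actual objective.

```python
# Hedged sketch (assumed form): several models share a task loss and are penalized
# for agreeing with one another, pushing each toward a distinct solution.
import torch
import torch.nn.functional as F

def diverse_ensemble_loss(models, x, y, diversity_weight=0.1):
    outputs = [m(x) for m in models]                       # one set of logits per model
    task_loss = sum(F.cross_entropy(o, y) for o in outputs) / len(outputs)
    agreement = 0.0
    for i in range(len(outputs)):
        for j in range(i + 1, len(outputs)):
            # Penalize similarity between the models' predictive distributions.
            agreement = agreement + F.cosine_similarity(
                outputs[i].softmax(dim=1), outputs[j].softmax(dim=1), dim=1).mean()
    n_pairs = len(outputs) * (len(outputs) - 1) / 2
    return task_loss + diversity_weight * agreement / n_pairs
```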
arXiv Detail & Related papers (2020-06-12T12:23:50Z) - Prototypical Contrastive Learning of Unsupervised Representations [171.3046900127166]
Prototypical Contrastive Learning (PCL) is an unsupervised representation learning method.
PCL implicitly encodes semantic structures of the data into the learned embedding space.
PCL outperforms state-of-the-art instance-wise contrastive learning methods on multiple benchmarks.
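A simplified sketch of a prototype-level contrastive term in the spirit of PCL: embeddings are clustered, the centroids serve as prototypes, and each sample is pulled toward its assigned prototype. The single-pass k-means, cluster count, and temperature are illustrative assumptions, not the paper's exact formulation.

```python
# Simplified sketch (assumed form) of a prototype-level contrastive term: cluster
# centroids act as prototypes and each sample is attracted to its own prototype.
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans

def proto_contrastive_loss(embeddings, n_clusters=100, temperature=0.1):
    """embeddings: (N, D) tensor; n_clusters must not exceed N."""
    z = F.normalize(embeddings, dim=1)
    km = KMeans(n_clusters=n_clusters).fit(z.detach().cpu().numpy())
    protos = F.normalize(torch.as_tensor(km.cluster_centers_,
                                         dtype=z.dtype, device=z.device), dim=1)
    assign = torch.as_tensor(km.labels_, dtype=torch.long, device=z.device)
    logits = z @ protos.T / temperature            # similarity to every prototype
    return F.cross_entropy(logits, assign)         # pull samples to their assigned prototype
```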
arXiv Detail & Related papers (2020-05-11T09:53:36Z)