Few-Shot Learning with a Strong Teacher
- URL: http://arxiv.org/abs/2107.00197v1
- Date: Thu, 1 Jul 2021 03:20:46 GMT
- Title: Few-Shot Learning with a Strong Teacher
- Authors: Han-Jia Ye, Lu Ming, De-Chuan Zhan, Wei-Lun Chao
- Abstract summary: Few-shot learning aims to train a strong classifier using limited labeled examples.
Many existing works take the meta-learning approach, sampling few-shot tasks in turn and optimizing the few-shot learner's performance on classifying the query examples.
We propose a novel objective to directly train the few-shot learner to perform like a strong classifier.
- Score: 36.35502703114652
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Few-shot learning (FSL) aims to train a strong classifier using limited
labeled examples. Many existing works take the meta-learning approach, sampling
few-shot tasks in turn and optimizing the few-shot learner's performance on
classifying the query examples. In this paper, we point out two potential
weaknesses of this approach. First, the sampled query examples may not provide
sufficient supervision for the few-shot learner. Second, the effectiveness of
meta-learning diminishes sharply with increasing shots (i.e., the number of
training examples per class). To resolve these issues, we propose a novel
objective to directly train the few-shot learner to perform like a strong
classifier. Concretely, we associate each sampled few-shot task with a strong
classifier, which is learned with ample labeled examples. The strong classifier
has a better generalization ability and we use it to supervise the few-shot
learner. We present an efficient way to construct the strong classifier, making
our proposed objective an easily plug-and-play term to existing meta-learning
based FSL methods. We validate our approach in combinations with many
representative meta-learning methods. On several benchmark datasets including
miniImageNet and tieredImageNet, our approach leads to a notable improvement
across a variety of tasks. More importantly, with our approach, meta-learning
based FSL methods can consistently outperform non-meta-learning based ones,
even in a many-shot setting, greatly strengthening their applicability.
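Read as a loss, the proposed objective augments the usual query-classification term with supervision from the per-task strong classifier. The sketch below is a minimal, hedged rendering of that idea, assuming a distillation-style formulation in which the few-shot learner's query logits are matched to the teacher's via a temperature-softened KL term; the function name, temperature, and mixing weight are illustrative assumptions rather than the paper's exact recipe.

```python
import torch.nn.functional as F

def strong_teacher_loss(fs_logits, teacher_logits, query_labels,
                        tau=4.0, alpha=0.5):
    """Hypothetical plug-and-play objective: standard query cross-entropy
    plus a term pushing the few-shot learner's query predictions toward
    those of a strong classifier trained with ample labels for the same
    classes. tau (softening temperature) and alpha (mixing weight) are
    illustrative, not values from the paper."""
    # Usual meta-learning term: classify the sampled query examples.
    ce = F.cross_entropy(fs_logits, query_labels)

    # Teacher term: match the strong classifier's softened predictions.
    kl = F.kl_div(
        F.log_softmax(fs_logits / tau, dim=-1),
        F.softmax(teacher_logits / tau, dim=-1),
        reduction="batchmean",
    ) * (tau ** 2)

    return (1.0 - alpha) * ce + alpha * kl
```

In an episodic training loop, fs_logits would come from the few-shot learner built on the sampled support set, while teacher_logits would come from the strong classifier fitted with ample labeled examples for the same classes.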
Related papers
- Meta-Adapter: An Online Few-shot Learner for Vision-Language Model [64.21017759533474]
Contrastive vision-language pre-training, known as CLIP, demonstrates remarkable potential in perceiving open-world visual concepts.
Few-shot learning methods based on CLIP typically require offline fine-tuning of the parameters on few-shot samples.
We propose the Meta-Adapter, a lightweight residual-style adapter, to refine the CLIP features guided by the few-shot samples in an online manner.
arXiv Detail & Related papers (2023-11-07T07:27:16Z)
- Meta Learning to Bridge Vision and Language Models for Multimodal Few-Shot Learning [38.37682598345653]
We introduce a multimodal meta-learning approach to bridge the gap between vision and language models.
We define a meta-mapper network, acting as a meta-learner, to efficiently bridge frozen large-scale vision and language models.
We evaluate our approach on recently proposed multimodal few-shot benchmarks, measuring how rapidly the model can bind novel visual concepts to words.
arXiv Detail & Related papers (2023-02-28T17:46:18Z)
- Multi-Modal Few-Shot Object Detection with Meta-Learning-Based Cross-Modal Prompting [77.69172089359606]
We study multi-modal few-shot object detection (FSOD) in this paper, using both few-shot visual examples and class semantic information for detection.
Our approach is motivated by the high-level conceptual similarity of (metric-based) meta-learning and prompt-based learning.
We comprehensively evaluate the proposed multi-modal FSOD models on multiple few-shot object detection benchmarks, achieving promising results.
arXiv Detail & Related papers (2022-04-16T16:45:06Z)
- Meta Navigator: Search for a Good Adaptation Policy for Few-shot Learning [113.05118113697111]
Few-shot learning aims to adapt knowledge learned from previous tasks to novel tasks with only a limited amount of labeled data.
Research literature on few-shot learning exhibits great diversity, while different algorithms often excel at different few-shot learning scenarios.
We present Meta Navigator, a framework that addresses this limitation by searching for a higher-level adaptation policy for few-shot learning.
arXiv Detail & Related papers (2021-09-13T07:20:01Z)
- Fast Few-Shot Classification by Few-Iteration Meta-Learning [173.32497326674775]
We introduce a fast optimization-based meta-learning method for few-shot classification.
Our strategy enables important aspects of the base learner objective to be learned during meta-training.
We perform a comprehensive experimental analysis, demonstrating the speed and effectiveness of our approach.
arXiv Detail & Related papers (2020-10-01T15:59:31Z)
- Few-Shot Image Classification via Contrastive Self-Supervised Learning [5.878021051195956]
We propose a new paradigm of unsupervised few-shot learning to address these deficiencies.
We solve the few-shot tasks in two phases, starting by meta-training a transferable feature extractor via contrastive self-supervised learning.
Our method achieves state-of-the-art performance on a variety of established few-shot tasks on the standard few-shot visual classification datasets.
arXiv Detail & Related papers (2020-08-23T02:24:31Z)
- Learning to Learn to Disambiguate: Meta-Learning for Few-Shot Word Sense Disambiguation [26.296412053816233]
We propose a meta-learning framework for few-shot word sense disambiguation.
The goal is to learn to disambiguate unseen words from only a few labeled instances.
We extend several popular meta-learning approaches to this scenario, and analyze their strengths and weaknesses.
arXiv Detail & Related papers (2020-04-29T17:33:31Z)
- Rethinking Few-Shot Image Classification: a Good Embedding Is All You Need? [72.00712736992618]
We show that a simple baseline, learning a supervised or self-supervised representation on the meta-training set, outperforms state-of-the-art few-shot learning methods.
An additional boost can be achieved through the use of self-distillation.
We believe that our findings motivate a rethinking of few-shot image classification benchmarks and the associated role of meta-learning algorithms.
arXiv Detail & Related papers (2020-03-25T17:58:42Z)
- Meta-Baseline: Exploring Simple Meta-Learning for Few-Shot Learning [79.25478727351604]
We explore a simple process: meta-learning over a whole-classification pre-trained model on its evaluation metric.
We observe this simple method achieves competitive performance to state-of-the-art methods on standard benchmarks.
arXiv Detail & Related papers (2020-03-09T20:06:36Z)
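For context on the Meta-Baseline entry above: meta-learning over a whole-classification pre-trained model on its evaluation metric amounts to a cosine nearest-centroid classifier over the pre-trained encoder. Below is a minimal sketch under that reading; the encoder interface, episode shapes, and fixed temperature are illustrative assumptions (the scaling factor can also be made learnable).

```python
import torch
import torch.nn.functional as F

def cosine_nearest_centroid_logits(encoder, support_x, support_y, query_x,
                                   n_way, temperature=10.0):
    """Cosine nearest-centroid episode head: class centroids come from
    support embeddings of a whole-classification pre-trained encoder, and
    query examples are scored by scaled cosine similarity. Train by applying
    cross-entropy to the returned logits and the query labels."""
    z_s = F.normalize(encoder(support_x), dim=-1)   # [n_support, d]
    z_q = F.normalize(encoder(query_x), dim=-1)     # [n_query, d]

    # One centroid per class from the few labeled support examples.
    centroids = torch.stack(
        [z_s[support_y == c].mean(dim=0) for c in range(n_way)]
    )
    centroids = F.normalize(centroids, dim=-1)       # [n_way, d]

    return temperature * z_q @ centroids.t()         # [n_query, n_way]
```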
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.