Budget-aware Few-shot Learning via Graph Convolutional Network
- URL: http://arxiv.org/abs/2201.02304v1
- Date: Fri, 7 Jan 2022 02:46:35 GMT
- Title: Budget-aware Few-shot Learning via Graph Convolutional Network
- Authors: Shipeng Yan, Songyang Zhang, Xuming He
- Abstract summary: This paper tackles the problem of few-shot learning, which aims to learn new visual concepts from a few examples.
A common problem setting in few-shot classification assumes a random sampling strategy for acquiring data labels.
We introduce a new budget-aware few-shot learning problem that aims to learn novel object categories while selecting informative examples to annotate.
- Score: 56.41899553037247
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper tackles the problem of few-shot learning, which aims to learn new
visual concepts from a few examples. A common problem setting in few-shot
classification assumes a random sampling strategy for acquiring data labels, which
is inefficient in practical applications. In this work, we introduce a new
budget-aware few-shot learning problem that not only aims to learn novel object
categories, but also needs to select informative examples to annotate in order
to achieve data efficiency.
We develop a meta-learning strategy for our budget-aware few-shot learning
task, which jointly learns a novel data selection policy based on a Graph
Convolutional Network (GCN) and an example-based few-shot classifier. Our
selection policy computes a context-sensitive representation for each unlabeled
example via graph message passing, which is then used to predict an informativeness
score for sequential selection. We validate our method by extensive experiments
on the mini-ImageNet, tiered-ImageNet and Omniglot datasets. The results show
our few-shot learning strategy outperforms baselines by a sizable margin, which
demonstrates the efficacy of our method.
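The selection mechanism described above can be illustrated with a minimal NumPy sketch. This is not the authors' actual architecture (which is learned end-to-end via meta-learning); the weight matrices `w1` and `w2` stand in for learned parameters, and the adjacency matrix is assumed given:

```python
import numpy as np

def gcn_informativeness_scores(features, adjacency, w1, w2):
    """One round of graph message passing (a minimal GCN layer),
    followed by a linear head that scores each example.

    features  : (n, d) embeddings of the unlabeled examples
    adjacency : (n, n) affinity matrix over the examples
    w1, w2    : hypothetical learned parameters
    """
    # Symmetrically normalize the adjacency with self-loops:
    # D^{-1/2} (A + I) D^{-1/2}
    a_hat = adjacency + np.eye(len(features))
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Message passing: each example aggregates its neighbors' features,
    # yielding a context-sensitive representation.
    hidden = np.maximum(a_norm @ features @ w1, 0.0)  # ReLU

    # Linear head predicts a scalar informativeness score per example.
    return hidden @ w2

def select_sequentially(features, adjacency, w1, w2, budget):
    """Greedy sequential selection: repeatedly pick the highest-scoring
    unlabeled example until the annotation budget is exhausted."""
    unlabeled = list(range(len(features)))
    chosen = []
    for _ in range(budget):
        sub = np.ix_(unlabeled, unlabeled)
        scores = gcn_informativeness_scores(
            features[unlabeled], adjacency[sub], w1, w2)
        best = unlabeled[int(np.argmax(scores))]
        chosen.append(best)
        unlabeled.remove(best)
    return chosen
```

Rescoring after each pick is what makes the selection sequential: once an example is chosen, the remaining candidates' context (and hence their scores) changes.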
Related papers
- Cold PAWS: Unsupervised class discovery and addressing the cold-start problem for semi-supervised learning [0.30458514384586394]
We propose a novel approach based on well-established self-supervised learning, clustering, and manifold learning techniques.
We test our approach using several publicly available datasets, namely CIFAR10, Imagenette, DeepWeeds, and EuroSAT.
We obtain superior performance for the datasets considered with a much simpler approach compared to other methods in the literature.
arXiv Detail & Related papers (2023-05-17T09:17:59Z)
- A Simple Yet Effective Pretraining Strategy for Graph Few-shot Learning [38.66690010054665]
We propose a simple transductive fine-tuning based framework as a new paradigm for graph few-shot learning.
For pretraining, we propose a supervised contrastive learning framework with data augmentation strategies specific for few-shot node classification.
arXiv Detail & Related papers (2022-03-29T22:30:00Z)
- A Simple Baseline for Low-Budget Active Learning [15.54250249254414]
We show that a simple k-means clustering algorithm can outperform state-of-the-art active learning methods on low budgets.
This method can be used as a simple baseline for low-budget active learning on image classification.
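The baseline admits a compact sketch: run k-means with k equal to the label budget, then annotate the example nearest each cluster center. This is an illustrative NumPy implementation, not the paper's exact code:

```python
import numpy as np

def kmeans_select(features, budget, n_iter=50, seed=0):
    """Low-budget active learning baseline: k-means with k = budget,
    then label the example closest to each cluster center."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), budget, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest center.
        d = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(axis=1)
        # Recompute centers; keep the old center if a cluster empties.
        for k in range(budget):
            if (assign == k).any():
                centers[k] = features[assign == k].mean(axis=0)
    # Pick the point closest to each final center (dedupe collisions).
    d = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return sorted(set(d.argmin(axis=0).tolist()))
```

The intuition is coverage: with very few labels, spreading annotations across the modes of the feature distribution beats uncertainty-based criteria that need a trained model to begin with.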
arXiv Detail & Related papers (2021-10-22T19:36:56Z)
- Meta Navigator: Search for a Good Adaptation Policy for Few-shot Learning [113.05118113697111]
Few-shot learning aims to adapt knowledge learned from previous tasks to novel tasks with only a limited amount of labeled data.
Research literature on few-shot learning exhibits great diversity, while different algorithms often excel at different few-shot learning scenarios.
We present Meta Navigator, a framework that attempts to solve the limitation in few-shot learning by seeking a higher-level strategy.
arXiv Detail & Related papers (2021-09-13T07:20:01Z)
- Few-shot Weakly-Supervised Object Detection via Directional Statistics [55.97230224399744]
We propose a probabilistic multiple instance learning approach for few-shot Common Object Localization (COL) and few-shot Weakly Supervised Object Detection (WSOD).
Our model simultaneously learns the distribution of the novel objects and localizes them via expectation-maximization steps.
Our experiments show that the proposed method, despite being simple, outperforms strong baselines in few-shot COL and WSOD, as well as large-scale WSOD tasks.
arXiv Detail & Related papers (2021-03-25T22:34:16Z)
- How to distribute data across tasks for meta-learning? [59.608652082495624]
We show that the optimal number of data points per task depends on the budget, but it converges to a unique constant value for large budgets.
Our results suggest a simple and efficient procedure for data collection.
arXiv Detail & Related papers (2021-03-15T15:38:47Z)
- Region Comparison Network for Interpretable Few-shot Image Classification [97.97902360117368]
Few-shot image classification has been proposed to effectively use only a limited number of labeled examples to train models for new classes.
We propose a metric learning based method named Region Comparison Network (RCN), which is able to reveal how few-shot learning works.
We also present a new way to generalize the interpretability from the level of tasks to categories.
arXiv Detail & Related papers (2020-09-08T07:29:05Z)
- Few-shot Classification via Adaptive Attention [93.06105498633492]
We propose a novel few-shot learning method via optimizing and fast adapting the query sample representation based on very few reference samples.
As demonstrated experimentally, the proposed model achieves state-of-the-art classification results on various benchmark few-shot classification and fine-grained recognition datasets.
arXiv Detail & Related papers (2020-08-06T05:52:59Z)
- Looking back to lower-level information in few-shot learning [4.873362301533825]
We propose the utilization of lower-level, supporting information, namely the feature embeddings of the hidden neural network layers, to improve classification accuracy.
Our experiments on two popular few-shot learning datasets, miniImageNet and tieredImageNet, show that our method can utilize the lower-level information in the network to improve state-of-the-art classification performance.
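One simple way to combine hidden-layer embeddings with final-layer features is to concatenate their normalized versions and classify by nearest class prototype. This sketch assumes per-layer feature arrays are already extracted; it illustrates the general idea, not the paper's specific method:

```python
import numpy as np

def concat_embeddings(per_layer_feats):
    """Concatenate L2-normalized embeddings from several network layers,
    so lower-level information supplements the final-layer features."""
    normed = [f / np.linalg.norm(f, axis=1, keepdims=True)
              for f in per_layer_feats]
    return np.concatenate(normed, axis=1)

def nearest_prototype(support_feats, support_labels, query_feats):
    """Prototype (nearest class mean) classification on the
    combined multi-layer representation."""
    classes = np.unique(support_labels)
    protos = np.stack([support_feats[support_labels == c].mean(axis=0)
                       for c in classes])
    d = ((query_feats[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return classes[d.argmin(axis=1)]
```

Per-layer normalization keeps any single layer's feature scale from dominating the concatenated distance.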
arXiv Detail & Related papers (2020-05-27T20:32:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.