Hierarchical Attention Network for Few-Shot Object Detection via
Meta-Contrastive Learning
- URL: http://arxiv.org/abs/2208.07039v2
- Date: Tue, 16 Aug 2022 07:48:27 GMT
- Title: Hierarchical Attention Network for Few-Shot Object Detection via
Meta-Contrastive Learning
- Authors: Dongwoo Park, Jong-Min Lee
- Abstract summary: Few-shot object detection (FSOD) aims to classify and detect objects of novel categories given only a few annotated images.
We propose a hierarchical attention network with sequentially large receptive fields to fully exploit the query and support images.
Our method brings 2.3, 1.0, 1.3, 3.4 and 2.4% AP improvements for 1-30 shot object detection on the COCO dataset.
- Score: 4.952681349410351
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Few-shot object detection (FSOD) aims to classify and detect objects of
novel categories given only a few annotated images. Existing meta-learning
methods insufficiently exploit features between support and query images owing
to structural limitations. We propose a hierarchical attention network with
sequentially large receptive fields to fully exploit the query and support
images. In addition, meta-learning alone does not distinguish categories well,
because it only determines whether the support and query images match; such
metric-based matching is not directly optimized for classification. We
therefore propose a contrastive learning method, called meta-contrastive
learning, which directly supports the goal of the meta-learning strategy.
Together, these components establish a new state-of-the-art network with
significant margins: our method brings 2.3, 1.0, 1.3, 3.4 and 2.4% AP
improvements for 1-30 shot object detection on the COCO dataset. Our code is
available at: https://github.com/infinity7428/hANMCL
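The meta-contrastive idea described above, pulling a query embedding toward same-class support embeddings and pushing it away from the rest, can be illustrated with an InfoNCE-style loss. This is a minimal NumPy sketch under stated assumptions: the function name, tensor shapes, and temperature value are illustrative, not the authors' actual implementation.

```python
import numpy as np

def meta_contrastive_loss(query, supports, positive_mask, tau=0.07):
    """InfoNCE-style contrastive loss: pull the query embedding toward
    same-class support embeddings (positives) and away from the rest.
    query: (d,) vector; supports: (n, d) matrix; positive_mask: (n,) bools.
    (Hypothetical sketch, not the paper's implementation.)"""
    q = query / np.linalg.norm(query)
    s = supports / np.linalg.norm(supports, axis=1, keepdims=True)
    logits = s @ q / tau                        # temperature-scaled cosine sims
    log_probs = logits - np.log(np.exp(logits).sum())  # log-softmax over supports
    return -log_probs[positive_mask].mean()     # average over positives

# A query aligned with its positive support incurs a small loss;
# one aligned with a negative incurs a large loss.
supports = np.array([[1.0, 0.0], [0.0, 1.0]])
mask = np.array([True, False])
print(meta_contrastive_loss(np.array([1.0, 0.0]), supports, mask))  # small
print(meta_contrastive_loss(np.array([0.0, 1.0]), supports, mask))  # large
```

Minimizing such a loss directly separates categories in the embedding space, which is the gap the abstract identifies in plain support-query matching.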
Related papers
- Class Anchor Margin Loss for Content-Based Image Retrieval [97.81742911657497]
We propose a novel repeller-attractor loss within the metric learning paradigm that directly optimizes the L2 metric without the need to generate pairs.
We evaluate the proposed objective in the context of few-shot and full-set training on the CBIR task, by using both convolutional and transformer architectures.
arXiv Detail & Related papers (2023-06-01T12:53:10Z) - Meta-DETR: Image-Level Few-Shot Detection with Inter-Class Correlation
Exploitation [100.87407396364137]
We design Meta-DETR, which (i) is the first image-level few-shot detector, and (ii) introduces a novel inter-class correlational meta-learning strategy.
Experiments over multiple few-shot object detection benchmarks show that the proposed Meta-DETR outperforms state-of-the-art methods by large margins.
arXiv Detail & Related papers (2022-07-30T13:46:07Z) - Dynamic Relevance Learning for Few-Shot Object Detection [6.550840743803705]
We propose a dynamic relevance learning model, which utilizes the relationship between all support images and Regions of Interest (RoIs) on the query images to construct a dynamic graph convolutional network (GCN).
The proposed model achieves the best overall performance, demonstrating its effectiveness in learning more generalized features.
arXiv Detail & Related papers (2021-08-04T18:29:42Z) - Meta-DETR: Few-Shot Object Detection via Unified Image-Level
Meta-Learning [39.50529982746885]
Few-shot object detection aims at detecting novel objects with only a few annotated examples.
This paper presents a novel meta-detector framework, namely Meta-DETR, which eliminates region-wise prediction.
It instead meta-learns object localization and classification at image level in a unified and complementary manner.
arXiv Detail & Related papers (2021-03-22T11:14:00Z) - DetCo: Unsupervised Contrastive Learning for Object Detection [64.22416613061888]
Unsupervised contrastive learning has achieved great success in learning image representations with CNNs.
We present a novel contrastive learning approach, named DetCo, which fully explores the contrasts between the global image and local image patches.
DetCo consistently outperforms its supervised counterpart by 1.6/1.2/1.0 AP on Mask R-CNN-C4/FPN/RetinaNet with a 1x schedule.
arXiv Detail & Related papers (2021-02-09T12:47:20Z) - Learning to Focus: Cascaded Feature Matching Network for Few-shot Image
Recognition [38.49419948988415]
Deep networks can learn to accurately recognize objects of a category by training on a large number of images.
Low-shot image recognition, a meta-learning challenge, arises when only a few annotated images are available for learning a recognition model for a category.
Our method, called Cascaded Feature Matching Network (CFMN), is proposed to solve this problem.
Experiments for few-shot learning on two standard datasets, miniImageNet and Omniglot, have confirmed the effectiveness of our method.
arXiv Detail & Related papers (2021-01-13T11:37:28Z) - SCAN: Learning to Classify Images without Labels [73.69513783788622]
We advocate a two-step approach where feature learning and clustering are decoupled.
A self-supervised task from representation learning is employed to obtain semantically meaningful features.
We obtain promising results on ImageNet, and outperform several semi-supervised learning methods in the low-data regime.
arXiv Detail & Related papers (2020-05-25T18:12:33Z) - Rethinking Few-Shot Image Classification: a Good Embedding Is All You
Need? [72.00712736992618]
We show that a simple baseline, learning a supervised or self-supervised representation on the meta-training set, outperforms state-of-the-art few-shot learning methods.
An additional boost can be achieved through the use of self-distillation.
We believe that our findings motivate a rethinking of few-shot image classification benchmarks and the associated role of meta-learning algorithms.
arXiv Detail & Related papers (2020-03-25T17:58:42Z) - Meta-Baseline: Exploring Simple Meta-Learning for Few-Shot Learning [79.25478727351604]
We explore a simple process: meta-learning on top of a whole-classification pre-trained model, using its evaluation metric.
We observe this simple method achieves competitive performance to state-of-the-art methods on standard benchmarks.
arXiv Detail & Related papers (2020-03-09T20:06:36Z)
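Among the related papers, DetCo contrasts a global image representation with local image patches. The global-vs-local pairs such a loss compares can be sketched as follows; this is a toy NumPy illustration with assumed shapes and patch size, not DetCo's actual pipeline.

```python
import numpy as np

def global_local_similarities(feat, patch=2):
    """Mean-pool non-overlapping patches of a feature map and return the
    cosine similarity of each patch embedding to the global (whole-map)
    embedding -- the global-vs-local pairs a DetCo-style loss contrasts.
    feat: (h, w, c) feature map. (Illustrative sketch only.)"""
    h, w, c = feat.shape
    g = feat.mean(axis=(0, 1))                  # global embedding
    g = g / np.linalg.norm(g)
    sims = []
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            p = feat[i:i + patch, j:j + patch].mean(axis=(0, 1))
            sims.append(float(p @ g / np.linalg.norm(p)))
    return sims

rng = np.random.default_rng(0)
feat = rng.random((4, 4, 8))                    # toy 4x4 feature map, 8 channels
print(global_local_similarities(feat))          # four patch-to-global similarities
```

A contrastive objective would then treat patches and the global view of the same image as positives, and views of other images as negatives.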
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.