Meta-tuning Loss Functions and Data Augmentation for Few-shot Object Detection
- URL: http://arxiv.org/abs/2304.12161v1
- Date: Mon, 24 Apr 2023 15:14:16 GMT
- Title: Meta-tuning Loss Functions and Data Augmentation for Few-shot Object Detection
- Authors: Berkan Demirel, Orhun Buğra Baran, Ramazan Gokberk Cinbis
- Abstract summary: Few-shot object detection is an emerging topic in the area of few-shot learning and object detection.
We propose a training scheme that allows learning inductive biases that can boost few-shot detection.
The proposed approach yields interpretable loss functions, as opposed to highly parametric and complex few-shot meta-models.
- Score: 7.262048441360132
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-shot object detection, the problem of modelling novel object detection
categories with few training instances, is an emerging topic in the area of
few-shot learning and object detection. Contemporary techniques can be divided
into two groups: fine-tuning based and meta-learning based approaches. While
meta-learning approaches aim to learn dedicated meta-models for mapping samples
to novel class models, fine-tuning approaches tackle few-shot detection in a
simpler manner, by adapting the detection model to novel classes through
gradient based optimization. Despite their simplicity, fine-tuning based
approaches typically yield competitive detection results. Based on this
observation, we focus on the role of loss functions and augmentations as the
force driving the fine-tuning process, and propose to tune their dynamics
through meta-learning principles. The proposed training scheme, therefore,
allows learning inductive biases that can boost few-shot detection, while
keeping the advantages of fine-tuning based approaches. In addition, the
proposed approach yields interpretable loss functions, as opposed to highly
parametric and complex few-shot meta-models. The experimental results highlight
the merits of the proposed scheme, with significant improvements over the
strong fine-tuning based few-shot detection baselines on benchmark Pascal VOC
and MS-COCO datasets, in terms of both standard and generalized few-shot
performance metrics.
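As a rough illustration of the idea, the sketch below meta-learns the parameters of a simple parametric loss through one differentiable inner fine-tuning step, so that the outer (query) loss shapes the fine-tuning dynamics. The loss form, the toy episode construction, and all hyperparameters are illustrative assumptions rather than the paper's actual formulation, and the augmentation side of the method is omitted.
```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

phi = torch.zeros(2, requires_grad=True)      # meta-learned loss parameters
model = torch.nn.Linear(16, 5)                # toy stand-in for a detector head
meta_opt = torch.optim.Adam([phi], lr=1e-3)   # outer optimizer updates phi only

def parametric_loss(logits, targets, phi):
    # Softplus keeps the learned weight and temperature positive, so the
    # resulting loss stays interpretable (a scaled, tempered cross-entropy).
    weight, temp = F.softplus(phi[0]), F.softplus(phi[1]) + 1e-3
    return weight * F.cross_entropy(logits / temp, targets)

for episode in range(100):
    # A toy few-shot episode: support set drives inner fine-tuning,
    # query set provides the outer meta-objective.
    xs, ys = torch.randn(10, 16), torch.randint(0, 5, (10,))
    xq, yq = torch.randn(20, 16), torch.randint(0, 5, (20,))

    # Inner loop: one differentiable fine-tuning step under the learned loss.
    w, b = model.weight, model.bias
    inner = parametric_loss(xs @ w.t() + b, ys, phi)
    gw, gb = torch.autograd.grad(inner, (w, b), create_graph=True)
    w2, b2 = w - 0.1 * gw, b - 0.1 * gb       # "fast" fine-tuned weights

    # Outer loop: a plain query loss; its gradient flows back into phi
    # through the inner update (the model's own grads are not stepped here).
    meta_loss = F.cross_entropy(xq @ w2.t() + b2, yq)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```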
Related papers
- Fast Hierarchical Learning for Few-Shot Object Detection [57.024072600597464]
Transfer learning approaches have recently achieved promising results on the few-shot detection task.
These approaches suffer from the "catastrophic forgetting" issue caused by fine-tuning the base detector.
We tackle the aforementioned issues in this work.
arXiv Detail & Related papers (2022-10-10T20:31:19Z)
- Adaptive Meta-learner via Gradient Similarity for Few-shot Text Classification [11.035878821365149]
We propose a novel Adaptive Meta-learner via Gradient Similarity (AMGS) to improve the model generalization ability to a new task.
Experimental results on several benchmarks demonstrate that the proposed AMGS consistently improves few-shot text classification performance.
arXiv Detail & Related papers (2022-09-10T16:14:53Z)
- Gradient-Based Meta-Learning Using Uncertainty to Weigh Loss for Few-Shot Learning [5.691930884128995]
Model-Agnostic Meta-Learning (MAML) is one of the most successful meta-learning techniques for few-shot learning.
A new method is proposed in which the task-specific learner adaptively learns to select parameters that minimize the loss on new tasks.
Method 1 generates weights by comparing meta-loss differences, improving accuracy when there are few classes.
Method 2 introduces the homoscedastic uncertainty of each task to weigh multiple losses based on the original gradient descent (see the sketch below).
arXiv Detail & Related papers (2022-08-17T08:11:51Z)
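The uncertainty-based weighting in the entry above is closely related to the well-known homoscedastic-uncertainty scheme of Kendall et al., where each task loss L_i is combined as exp(-s_i) * L_i + s_i with a learned log-variance s_i. Below is a minimal sketch of that standard scheme, assumed to match the cited paper's variant in spirit only.
```python
import torch

class UncertaintyWeightedLoss(torch.nn.Module):
    """Combines task losses as sum_i exp(-s_i) * L_i + s_i, where
    s_i = log(sigma_i^2) is a learned log-variance per task."""
    def __init__(self, num_tasks: int):
        super().__init__()
        self.log_vars = torch.nn.Parameter(torch.zeros(num_tasks))

    def forward(self, losses):
        total = 0.0
        for s, loss in zip(self.log_vars, losses):
            # Higher learned uncertainty down-weights the task's loss;
            # the additive s term penalizes inflating the uncertainty.
            total = total + torch.exp(-s) * loss + s
        return total

# Usage: the log-variances are optimized jointly with the model parameters.
weighter = UncertaintyWeightedLoss(num_tasks=2)
l1, l2 = torch.tensor(0.8), torch.tensor(1.5)
print(weighter([l1, l2]))
```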
- Meta-DETR: Image-Level Few-Shot Detection with Inter-Class Correlation Exploitation [100.87407396364137]
We design Meta-DETR, which (i) is the first image-level few-shot detector, and (ii) introduces a novel inter-class correlational meta-learning strategy.
Experiments over multiple few-shot object detection benchmarks show that the proposed Meta-DETR outperforms state-of-the-art methods by large margins.
arXiv Detail & Related papers (2022-07-30T13:46:07Z)
- Plug-and-Play Few-shot Object Detection with Meta Strategy and Explicit Localization Inference [78.41932738265345]
This paper proposes a plug-and-play detector that can accurately detect objects of novel categories without a fine-tuning process.
We introduce two explicit inferences into the localization process to reduce its dependence on annotated data.
It shows a significant lead in efficiency, precision, and recall under varied evaluation protocols.
arXiv Detail & Related papers (2021-10-26T03:09:57Z)
- Few-shot Action Recognition with Prototype-centered Attentive Learning [88.10852114988829]
The Prototype-centered Attentive Learning (PAL) model is composed of two novel components.
First, a prototype-centered contrastive learning loss is introduced to complement the conventional query-centered learning objective (see the sketch after this list).
Second, PAL integrates an attentive hybrid learning mechanism that can minimize the negative impacts of outliers.
arXiv Detail & Related papers (2021-01-20T11:48:12Z)
- Few-shot Classification via Adaptive Attention [93.06105498633492]
We propose a novel few-shot learning method via optimizing and fast adapting the query sample representation based on very few reference samples.
As demonstrated experimentally, the proposed model achieves state-of-the-art classification results on various benchmark few-shot classification and fine-grained recognition datasets.
arXiv Detail & Related papers (2020-08-06T05:52:59Z)
- One-Shot Object Detection without Fine-Tuning [62.39210447209698]
We introduce a two-stage model consisting of a first stage Matching-FCOS network and a second stage Structure-Aware Relation Module.
We also propose novel training strategies that effectively improve detection performance.
Our method exceeds the state-of-the-art one-shot performance consistently on multiple datasets.
arXiv Detail & Related papers (2020-05-08T01:59:23Z)
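As referenced in the PAL entry above, here is a minimal sketch of a prototype-centered contrastive term: instead of normalizing over class prototypes for each query (the conventional query-centered objective), it normalizes over queries for each prototype. The similarity measure and loss form are assumptions for illustration; PAL's exact formulation may differ.
```python
import torch
import torch.nn.functional as F

def prototype_centered_loss(queries, q_labels, prototypes):
    """queries: (Q, D) embeddings, q_labels: (Q,), prototypes: (C, D).
    For each prototype, same-class queries are treated as positives in a
    softmax over all queries (the transpose of the usual objective)."""
    sims = prototypes @ queries.t()        # (C, Q) similarity logits
    log_p = F.log_softmax(sims, dim=1)     # normalize over queries, not classes
    mask = (q_labels.unsqueeze(0) ==
            torch.arange(prototypes.size(0)).unsqueeze(1))
    # Average log-probability of the positive queries for each prototype.
    loss = -(log_p * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
    return loss.mean()

# Toy usage: 3 classes, 8-dim embeddings, 12 query samples.
protos = F.normalize(torch.randn(3, 8), dim=1)
q = F.normalize(torch.randn(12, 8), dim=1)
y = torch.randint(0, 3, (12,))
print(prototype_centered_loss(q, y, protos))
```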
This list is automatically generated from the titles and abstracts of the papers on this site.