GRSDet: Learning to Generate Local Reverse Samples for Few-shot Object
Detection
- URL: http://arxiv.org/abs/2312.16571v2
- Date: Fri, 29 Dec 2023 07:51:16 GMT
- Title: GRSDet: Learning to Generate Local Reverse Samples for Few-shot Object
Detection
- Authors: Hefei Mei, Taijin Zhao, Shiyuan Tang, Heqian Qiu, Lanxiao Wang,
Minjian Zhang, Fanman Meng, Hongliang Li
- Abstract summary: Few-shot object detection (FSOD) aims to achieve object detection using only a few novel class training samples.
Most existing methods adopt a transfer-learning strategy to construct the novel class distribution.
We propose generating local reverse samples (LRSamples) in Prototype Reference Frames to adaptively adjust the center position and boundary range of the novel class distribution.
- Score: 15.998148904793426
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-shot object detection (FSOD) aims to achieve object detection using
only a few novel class training samples. Most existing methods adopt a
transfer-learning strategy, constructing the novel class distribution by
transferring base class knowledge. However, this direct transfer easily results
in confusion between the novel class and other similar categories in the
decision space. To address the problem, we propose generating local reverse
samples (LRSamples) in Prototype Reference Frames to adaptively adjust the
center position and boundary range of the novel class distribution to learn
more discriminative novel class samples for FSOD. Firstly, we propose a Center
Calibration Variance Augmentation (CCVA) module, which contains the selection
rule of LRSamples, the generator of LRSamples, and augmentation on the
calibrated distribution centers. Specifically, we design an intra-class feature
converter (IFC) as the generator of CCVA to learn the selection rule. By
transferring the knowledge of IFC from the base training to fine-tuning, the
IFC generates plentiful novel samples to calibrate the novel class
distribution. Moreover, we propose a Feature Density Boundary Optimization
(FDBO) module to adaptively adjust the importance of samples depending on their
distance from the decision boundary. It emphasizes samples in the
high-density area of a similar class (nearer the decision boundary) and
down-weights samples in its low-density area (farther from the decision
boundary), thus optimizing a clearer decision boundary for each
category. We conduct extensive experiments to demonstrate the effectiveness of
our proposed method. Our method achieves consistent improvement on the Pascal
VOC and MS COCO datasets based on DeFRCN and MFDC baselines.
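The abstract describes the two modules only at a high level. The sketch below is a hypothetical illustration of both ideas, not the authors' implementation: `alpha` and `tau` are invented knobs, and the mirroring rule and distance-based weighting are guesses at the mechanisms the abstract names.

```python
import torch

def generate_lrsamples(novel_feats, prototype, alpha=0.5):
    # Express each novel feature in the prototype reference frame and
    # mirror it to the opposite ("reverse") side of the prototype, which
    # both shifts the estimated center and widens the boundary range.
    offsets = novel_feats - prototype        # local coordinates
    return prototype - alpha * offsets       # mirrored LRSamples

def fdbo_weights(feats, own_proto, similar_proto, tau=10.0):
    # FDBO-style reweighting: samples whose distance to a similar class
    # prototype is barely larger than to their own prototype sit near
    # the decision boundary and receive higher weight.
    d_own = torch.cdist(feats, own_proto.unsqueeze(0)).squeeze(1)
    d_sim = torch.cdist(feats, similar_proto.unsqueeze(0)).squeeze(1)
    margin = d_sim - d_own                   # small margin = near boundary
    return torch.softmax(-margin / tau, dim=0)
```

In fine-tuning, the generated LRSamples would augment the few novel features before the classifier is trained, and the weights would scale per-sample losses; both steps are assumptions about how the abstract's modules fit together.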
Related papers
- Adaptive Margin Global Classifier for Exemplar-Free Class-Incremental Learning [3.4069627091757178]
Existing methods mainly focus on handling biased learning.
We introduce a Distribution-Based Global Classifier (DBGC) to avoid the bias factors of existing methods, such as data imbalance and sampling.
More importantly, the compromised distributions of old classes are simulated via a simple operation, variance enlarging (VE).
This loss is proven equivalent to an Adaptive Margin Softmax Cross Entropy (AMarX) loss.
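The summary does not give the AMarX form; the generic additive-margin softmax cross entropy below only illustrates the family of losses it belongs to (the margin and scale values are invented for the example).

```python
import torch
import torch.nn.functional as F

def margin_softmax_ce(logits, targets, margin=0.3, scale=16.0):
    # Subtract a margin from each sample's target-class logit before a
    # scaled softmax; an *adaptive* variant would derive `margin` from
    # the simulated (variance-enlarged) old-class distributions.
    adjusted = logits.clone()
    adjusted[torch.arange(logits.size(0)), targets] -= margin
    return F.cross_entropy(scale * adjusted, targets)
```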
arXiv Detail & Related papers (2024-09-20T07:07:23Z)
- Rethinking Few-shot 3D Point Cloud Semantic Segmentation [62.80639841429669]
This paper revisits few-shot 3D point cloud semantic segmentation (FS-PCS).
We focus on two significant issues in the state-of-the-art: foreground leakage and sparse point distribution.
To address these issues, we introduce a standardized FS-PCS setting, upon which a new benchmark is built.
arXiv Detail & Related papers (2024-03-01T15:14:47Z)
- Proposal Distribution Calibration for Few-Shot Object Detection [65.19808035019031]
In few-shot object detection (FSOD), the two-step training paradigm is widely adopted to mitigate the severe sample imbalance.
Unfortunately, the extreme data scarcity aggravates the proposal distribution bias, hindering the RoI head from evolving toward novel classes.
We introduce a simple yet effective proposal distribution calibration (PDC) approach to neatly enhance the localization and classification abilities of the RoI head.
arXiv Detail & Related papers (2022-12-15T05:09:11Z)
- Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning [122.62311703151215]
Divide and Contrast (DaC) aims to connect the good ends of both worlds while bypassing their limitations.
DaC divides the target data into source-like and target-specific samples, where either group of samples is treated with tailored goals.
We further align the source-like domain with the target-specific samples using a memory bank-based Maximum Mean Discrepancy (MMD) loss to reduce the distribution mismatch.
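As a reference point for that alignment term, a generic RBF-kernel MMD loss looks like the sketch below; treating `y` as features drawn from a memory bank is an assumption about DaC's usage, not its actual code.

```python
import torch

def mmd_rbf(x, y, sigma=1.0):
    # Squared MMD between two feature batches under an RBF kernel;
    # minimizing it pulls the two empirical distributions together.
    def k(a, b):
        return torch.exp(-torch.cdist(a, b).pow(2) / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()
```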
arXiv Detail & Related papers (2022-11-12T09:21:49Z)
- Prediction Calibration for Generalized Few-shot Semantic Segmentation [101.69940565204816]
Generalized Few-shot Semantic Segmentation (GFSS) aims to segment each image pixel into either base classes with abundant training examples or novel classes with only a handful of (e.g., 1-5) training images per class.
We build a cross-attention module that guides the classifier's final prediction using the fused multi-level features.
Our PCN outperforms the state-of-the-art alternatives by large margins.
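The cross-attention described above could, very roughly, look like the generic block below; the dimensions, head count, and residual connection are assumptions, not PCN's design.

```python
import torch
import torch.nn as nn

class CrossAttentionGuide(nn.Module):
    # Queries come from per-pixel features; keys/values come from the
    # fused multi-level features that guide the final prediction.
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, pixel_feats, fused_feats):
        # pixel_feats: (B, N, C), fused_feats: (B, M, C)
        guided, _ = self.attn(pixel_feats, fused_feats, fused_feats)
        return guided + pixel_feats
```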
arXiv Detail & Related papers (2022-10-15T13:30:12Z)
- Adaptive Distribution Calibration for Few-Shot Learning with Hierarchical Optimal Transport [78.9167477093745]
We propose a novel distribution calibration method by learning the adaptive weight matrix between novel samples and base classes.
Experimental results on standard benchmarks demonstrate that our proposed plug-and-play model outperforms competing approaches.
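The paper's hierarchical optimal transport is more involved than this, but a plain entropic-OT (Sinkhorn) plan between novel samples and base-class prototypes conveys where such an adaptive weight matrix can come from; the uniform marginals and `eps` are assumptions of this sketch.

```python
import torch

def sinkhorn(cost, eps=0.1, iters=50):
    # Entropic optimal transport with uniform marginals; the resulting
    # plan row-weights base classes per novel sample.
    n, m = cost.shape
    K = torch.exp(-cost / eps)           # Gibbs kernel
    u = torch.full((n,), 1.0 / n)
    v = torch.full((m,), 1.0 / m)
    for _ in range(iters):
        u = (1.0 / n) / (K @ v)
        v = (1.0 / m) / (K.T @ u)
    return u[:, None] * K * v[None, :]   # transport plan ~ weight matrix
```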
arXiv Detail & Related papers (2022-10-09T02:32:57Z)
- Learnable Distribution Calibration for Few-Shot Class-Incremental Learning [122.2241120474278]
Few-shot class-incremental learning (FSCIL) faces challenges of memorizing old class distributions and estimating new class distributions given few training samples.
We propose a learnable distribution calibration (LDC) approach, with the aim to systematically solve these two challenges using a unified framework.
arXiv Detail & Related papers (2022-10-01T09:40:26Z)
- Powering Finetuning in Few-shot Learning: Domain-Agnostic Feature Adaptation with Rectified Class Prototypes [32.622613524622075]
Finetuning is designed to reduce biases in the novel-class feature distributions.
By powering finetuning with DCM and SS, we achieve state-of-the-art results on Meta-Dataset.
arXiv Detail & Related papers (2022-04-07T21:29:12Z)