MR-GDINO: Efficient Open-World Continual Object Detection
- URL: http://arxiv.org/abs/2412.15979v2
- Date: Mon, 23 Dec 2024 16:55:57 GMT
- Title: MR-GDINO: Efficient Open-World Continual Object Detection
- Authors: Bowen Dong, Zitong Huang, Guanglei Yang, Lei Zhang, Wangmeng Zuo
- Abstract summary: We propose an open-world continual object detection task requiring detectors to generalize to old, new, and unseen categories.
We present a challenging yet practical OW-COD benchmark to assess detection abilities.
To mitigate forgetting in unseen categories, we propose MR-GDINO, a strong, efficient and scalable baseline via memory and retrieval mechanisms.
- Score: 58.066277387205325
- Abstract: Open-world (OW) recognition and detection models show strong zero- and few-shot adaptation abilities, inspiring their use as initializations in continual learning methods to improve performance. Despite promising results on seen classes, such OW abilities on unseen classes are largely degraded by catastrophic forgetting. To tackle this challenge, we propose an open-world continual object detection task, requiring detectors to generalize to old, new, and unseen categories in continual learning scenarios. Based on this task, we present a challenging yet practical OW-COD benchmark to assess detection abilities. The goal is to motivate OW detectors to simultaneously preserve learned classes, adapt to new classes, and maintain open-world capabilities under few-shot adaptation. To mitigate forgetting in unseen categories, we propose MR-GDINO, a strong, efficient, and scalable baseline built on memory and retrieval mechanisms within a highly scalable memory pool. Experimental results show that existing continual detectors suffer from severe forgetting on both seen and unseen categories. In contrast, MR-GDINO largely mitigates forgetting with only 0.1% activated extra parameters, achieving state-of-the-art performance on old, new, and unseen categories.
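As a rough sketch of how such a memory-and-retrieval mechanism could operate, the snippet below pairs each learned task with a key embedding and a small set of extra parameters, retrieving them only when a query is similar enough. The `MemoryPool` class, cosine-similarity retrieval, and the 0.7 threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class MemoryPool:
    """Hypothetical memory pool: each entry pairs a task-level key
    embedding with the small adaptation parameters tuned for that task."""

    def __init__(self, sim_threshold: float = 0.7):
        self.keys = []            # normalized key embeddings, one per task
        self.params = []          # per-task adaptation parameters
        self.sim_threshold = sim_threshold

    def add(self, key: np.ndarray, adapter_params: dict) -> None:
        # Memorize a new task. Old entries are never overwritten, so
        # earlier tasks and the frozen zero-shot weights stay intact.
        self.keys.append(key / np.linalg.norm(key))
        self.params.append(adapter_params)

    def retrieve(self, query: np.ndarray):
        # Return the best-matching adaptation parameters, or None when no
        # stored task is similar enough; None means "fall back to the
        # original open-world detector", preserving unseen-class ability.
        if not self.keys:
            return None
        q = query / np.linalg.norm(query)
        sims = np.stack(self.keys) @ q
        best = int(np.argmax(sims))
        return self.params[best] if sims[best] >= self.sim_threshold else None
```

Since retrieval activates at most one small parameter set per query, only a tiny fraction of extra parameters is ever active, which matches the spirit of the 0.1% figure reported above.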
Related papers
- OpenNet: Incremental Learning for Autonomous Driving Object Detection with Balanced Loss [3.761247766448379]
Experimental results on the CODA dataset show that the proposed method outperforms existing methods.
arXiv Detail & Related papers (2023-11-25T06:02:50Z)
- Fast Hierarchical Learning for Few-Shot Object Detection [57.024072600597464]
Transfer learning approaches have recently achieved promising results on the few-shot detection task.
These approaches suffer from the "catastrophic forgetting" issue caused by fine-tuning the base detector.
We tackle the aforementioned issues in this work.
arXiv Detail & Related papers (2022-10-10T20:31:19Z)
- Novel Class Discovery without Forgetting [72.52222295216062]
We identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting.
We propose a machine learning model to incrementally discover novel categories of instances from unlabeled data.
We introduce experimental protocols based on CIFAR-10, CIFAR-100 and ImageNet-1000 to measure the trade-off between knowledge retention and novel class discovery.
arXiv Detail & Related papers (2022-07-21T17:54:36Z)
- SHELS: Exclusive Feature Sets for Novelty Detection and Continual Learning Without Class Boundaries [22.04165296584446]
We introduce a Sparse High-level-Exclusive, Low-level-Shared feature representation (SHELS).
SHELS encourages learning exclusive sets of high-level features and essential, shared low-level features.
We show that using SHELS for novelty detection results in statistically significant improvements over state-of-the-art OOD detection approaches.
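As a toy illustration of the exclusive-high-level-features idea, one could penalize different classes for co-activating the same high-level units; the penalty below is an illustrative assumption, not the loss used in the SHELS paper.

```python
import numpy as np

def exclusivity_penalty(high_feats: np.ndarray, labels: np.ndarray) -> float:
    """Discourage different classes from co-activating the same high-level
    units, so each class keeps an 'exclusive' feature set (toy version)."""
    class_means = [np.abs(high_feats[labels == c]).mean(axis=0)
                   for c in np.unique(labels)]
    penalty = 0.0
    for i in range(len(class_means)):
        for j in range(i + 1, len(class_means)):
            # overlap between the mean activation profiles of two classes
            penalty += float(class_means[i] @ class_means[j])
    return penalty
```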
arXiv Detail & Related papers (2022-06-28T03:09:55Z)
- FOSTER: Feature Boosting and Compression for Class-Incremental Learning [52.603520403933985]
Deep neural networks suffer from catastrophic forgetting when learning new categories.
We propose FOSTER, a novel two-stage learning paradigm that empowers the model to learn new categories adaptively.
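A minimal boost-then-compress sketch in this two-stage spirit is shown below; the module shapes, the frozen-old-backbone choice, and the KL-distillation step are assumptions for illustration, not FOSTER's actual design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

feat_dim, num_classes = 128, 10
old_backbone = nn.Linear(32, feat_dim)   # trained on earlier classes
new_backbone = nn.Linear(32, feat_dim)   # added for the new classes
joint_head = nn.Linear(2 * feat_dim, num_classes)
student = nn.Sequential(nn.Linear(32, feat_dim), nn.Linear(feat_dim, num_classes))

# Stage 1 (boosting): freeze old features, train the new backbone alongside.
for p in old_backbone.parameters():
    p.requires_grad = False

def teacher_logits(x):
    feats = torch.cat([old_backbone(x), new_backbone(x)], dim=-1)
    return joint_head(feats)

def compression_loss(x, temperature=2.0):
    # Stage 2 (compression): distill the expanded teacher into one student.
    with torch.no_grad():                # teacher is fixed in this stage
        t = F.softmax(teacher_logits(x) / temperature, dim=-1)
    s = F.log_softmax(student(x) / temperature, dim=-1)
    return F.kl_div(s, t, reduction="batchmean")
```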
arXiv Detail & Related papers (2022-04-10T11:38:33Z)
- Learning Bayesian Sparse Networks with Full Experience Replay for Continual Learning [54.7584721943286]
Continual Learning (CL) methods aim to enable machine learning models to learn new tasks without catastrophic forgetting of those that have been previously mastered.
Existing CL approaches often keep a buffer of previously-seen samples, perform knowledge distillation, or use regularization techniques towards this goal.
We propose to activate and select only sparse sets of neurons for learning current and past tasks at any stage.
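A toy version of per-task sparse neuron selection is sketched below; the fixed random masks are an assumption for illustration, whereas the paper selects neurons through a Bayesian sparsity formulation.

```python
import numpy as np

HIDDEN = 512
SPARSITY = 0.05                          # fraction of neurons active per task

def task_mask(task_id: int) -> np.ndarray:
    # Each task uses only a small, fixed subset of neurons, so updates for
    # a new task cannot disturb neurons that earlier tasks rely on.
    g = np.random.default_rng(task_id)   # deterministic mask per task
    mask = np.zeros(HIDDEN, dtype=bool)
    mask[g.choice(HIDDEN, int(SPARSITY * HIDDEN), replace=False)] = True
    return mask

def forward(x: np.ndarray, weights: np.ndarray, task_id: int) -> np.ndarray:
    h = np.maximum(x @ weights, 0.0)     # ReLU layer
    return h * task_mask(task_id)        # only the selected neurons fire
```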
arXiv Detail & Related papers (2022-02-21T13:25:03Z)
- Towards Generalized and Incremental Few-Shot Object Detection [9.033533653482529]
A novel Incremental Few-Shot Object Detection (iFSOD) method is proposed to enable effective continual learning from few-shot samples.
Specifically, a Double-Branch Framework (DBF) is proposed to decouple the feature representations of base and novel (few-shot) classes.
We conduct experiments on both Pascal VOC and MS-COCO, which demonstrate that our method can effectively solve the problem of incremental few-shot detection.
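A minimal sketch of such a decoupled head follows; the dimensions and the frozen-base-branch choice are illustrative assumptions, not the iFSOD architecture.

```python
import torch
import torch.nn as nn

class DoubleBranchHead(nn.Module):
    """Frozen branch scores base classes; a separate trainable branch
    handles novel few-shot classes, so few-shot finetuning cannot erode
    base-class representations (toy version)."""

    def __init__(self, feat_dim=256, num_base=60, num_novel=20):
        super().__init__()
        self.base_branch = nn.Linear(feat_dim, num_base)
        self.novel_branch = nn.Linear(feat_dim, num_novel)
        for p in self.base_branch.parameters():
            p.requires_grad = False      # base-class knowledge stays fixed

    def forward(self, roi_feats):
        # Concatenate per-class scores from both branches.
        return torch.cat([self.base_branch(roi_feats),
                          self.novel_branch(roi_feats)], dim=-1)
```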
arXiv Detail & Related papers (2021-09-23T12:38:09Z)
- Incrementally Zero-Shot Detection by an Extreme Value Analyzer [0.0]
This paper introduces a novel strategy for both zero-shot learning and class-incremental learning in real-world object detection.
We propose a novel extreme value analyzer to simultaneously detect objects from old seen, new seen, and unseen classes.
Experiments demonstrate the efficacy of our model in detecting objects from both seen and unseen classes, outperforming alternative models on the Pascal VOC and MS-COCO datasets.
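A generic extreme-value-theory recipe in this spirit fits a distribution to the tail of class-center distances and flags far-tail detections as unseen; the Weibull choice and the 0.95 threshold below are assumptions, not the paper's exact analyzer.

```python
import numpy as np
from scipy import stats

def fit_tail_model(correct_distances: np.ndarray, tail_size: int = 20):
    # Fit a Weibull distribution to the largest class-center distances of
    # correctly detected training objects (the "extreme values").
    tail = np.sort(correct_distances)[-tail_size:]
    return stats.weibull_min.fit(tail)   # (shape, loc, scale)

def is_unseen(distance: float, tail_params, p_threshold: float = 0.95) -> bool:
    # Flag a detection as unseen when its distance lies far into the tail
    # of the fitted extreme-value model.
    shape, loc, scale = tail_params
    return stats.weibull_min.cdf(distance, shape, loc, scale) > p_threshold
```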
arXiv Detail & Related papers (2021-03-23T15:06:30Z)
- SID: Incremental Learning for Anchor-Free Object Detection via Selective and Inter-Related Distillation [16.281712605385316]
Incremental learning requires a model to continually learn new tasks from streaming data.
Traditional fine-tuning of a well-trained deep neural network on a new task will dramatically degrade performance on the old task.
We propose a novel incremental learning paradigm called Selective and Inter-related Distillation (SID).
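A rough sketch of what selective and inter-related distillation terms might look like is given below; the selection mask and the instance-similarity formulation are assumptions for illustration, not the paper's definitions.

```python
import torch
import torch.nn.functional as F

def selective_distill(student_out, teacher_out, select_mask):
    # "Selective": match the teacher only at locations deemed worth
    # imitating (the mask is assumed given here).
    return F.mse_loss(student_out[select_mask], teacher_out[select_mask])

def inter_related_distill(student_feats, teacher_feats):
    # "Inter-related": match pairwise similarities between instances rather
    # than raw features, transferring relational structure across tasks.
    def rel(f):
        f = F.normalize(f.flatten(1), dim=1)
        return f @ f.t()                 # instance-similarity matrix
    return F.mse_loss(rel(student_feats), rel(teacher_feats))
```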
arXiv Detail & Related papers (2020-12-31T04:12:06Z)
- Incremental Object Detection via Meta-Learning [77.55310507917012]
We propose a meta-learning approach that learns to reshape model gradients, such that information across incremental tasks is optimally shared.
In comparison to existing meta-learning methods, our approach is task-agnostic, allows incremental addition of new classes, and scales to high-capacity models for object detection.
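A toy rendering of gradient reshaping via a learned preconditioner is sketched below; it is a schematic assumption, not the paper's meta-learning procedure.

```python
import torch

DIM = 64
precond = torch.ones(DIM, requires_grad=True)   # learned gradient scaling
weights = torch.randn(DIM, requires_grad=True)

def inner_update(task_grad: torch.Tensor, lr: float = 0.1) -> torch.Tensor:
    # Reshape the raw task gradient with the learned preconditioner before
    # applying it, so one task's update can be steered to help the others.
    return weights - lr * (precond * task_grad)

# Meta-step (schematic): evaluate the updated weights on a held-out task and
# backpropagate that loss into `precond`, e.g.
#   meta_loss = held_out_loss(inner_update(task_grad))
#   meta_loss.backward()                # gradients flow into precond
```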
arXiv Detail & Related papers (2020-03-17T13:40:00Z)