Organizing Background to Explore Latent Classes for Incremental Few-shot Semantic Segmentation
- URL: http://arxiv.org/abs/2405.19568v1
- Date: Wed, 29 May 2024 23:22:12 GMT
- Title: Organizing Background to Explore Latent Classes for Incremental Few-shot Semantic Segmentation
- Authors: Lianlei Shan, Wenzhang Zhou, Wei Li, Xingyu Ding
- Abstract summary: The goal of incremental Few-shot Semantic Segmentation (iFSS) is to extend pre-trained segmentation models to new classes via a few annotated images.
We propose a network called OINet, i.e., the background embedding space **O**rganization and prototype **I**nherit Network.
- Score: 7.570798966278471
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The goal of incremental Few-shot Semantic Segmentation (iFSS) is to extend pre-trained segmentation models to new classes via a few annotated images, without access to the old training data. While incrementally learning novel classes, the data distribution of the old classes is disrupted, leading to catastrophic forgetting. Meanwhile, the novel classes have only a few samples, making it impossible for models to learn satisfactory representations of them. For the iFSS problem, we propose a network called OINet, i.e., the background embedding space **O**rganization and prototype **I**nherit Network. Specifically, when training the base classes, OINet uses multiple classification heads for the background and sets multiple sub-class prototypes to reserve embedding space for the latent novel classes. When incrementally learning novel classes, we propose a strategy that selects the sub-class prototypes best matching the novel classes currently being learned and lets the novel classes inherit the selected prototypes' embedding space. This operation registers the novel classes in the embedding space using few samples without affecting the distribution of the base classes. Results on Pascal-VOC and COCO show that OINet achieves a new state of the art.
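A minimal PyTorch sketch may make the two mechanisms concrete: multiple background heads organize the embedding space during base training, and a novel class later inherits its best-matching sub-class prototype. The prototype count, feature shapes, and cosine matching rule below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

# Illustrative sketch of OINet's two mechanisms; all names, shapes,
# and the cosine matching rule are assumptions, not the paper's code.

D, K = 256, 10  # feature dimension, number of background sub-classes
sub_prototypes = F.normalize(torch.randn(K, D), dim=1)  # learnable in practice

def background_logits(pixel_feats: torch.Tensor) -> torch.Tensor:
    """Score background pixels against every sub-class prototype.

    pixel_feats: [N, D] features of pixels labeled background. Training a
    softmax over the [N, K] output acts as K background classification
    heads, carving the background embedding space into reserved regions.
    """
    return F.normalize(pixel_feats, dim=1) @ sub_prototypes.t()

def inherit_prototype(novel_feats: torch.Tensor) -> torch.Tensor:
    """Pick the sub-class prototype that best matches a novel class.

    novel_feats: [M, D] features from the few annotated novel-class pixels.
    The novel class inherits the closest prototype's slot, so it registers
    in already-reserved space instead of disturbing the base classes.
    """
    novel_mean = F.normalize(novel_feats.mean(dim=0, keepdim=True), dim=1)
    best = (novel_mean @ sub_prototypes.t()).argmax(dim=1)
    return sub_prototypes[best].squeeze(0).clone()

# Usage: initialize the novel class's classifier weight from the match.
w_novel = inherit_prototype(torch.randn(5, D))  # e.g., 5-shot support features
```

In practice the prototypes would be learned jointly with the segmentation backbone; the point is only that the novel class lands in space reserved in advance.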
Related papers
- Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning [65.57123249246358]
We propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL.
We train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces.
Our prototype complement strategy synthesizes old classes' new features without using any old-class instances (sketched below).
arXiv Detail & Related papers (2024-03-18T17:58:13Z)
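As a rough illustration of the adapter-per-task idea summarized above, a sketch follows; the bottleneck design, dimensions, and concatenation ensemble are assumptions rather than the paper's architecture.

```python
import torch
import torch.nn as nn

# Rough sketch of an expandable adapter ensemble in the spirit of the
# EASE summary above; bottleneck design and dimensions are assumptions.

class Adapter(nn.Module):
    """Lightweight bottleneck adapter: one task-specific subspace."""
    def __init__(self, dim: int = 768, hidden: int = 64):
        super().__init__()
        self.down, self.up = nn.Linear(dim, hidden), nn.Linear(hidden, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(torch.relu(self.down(x)))  # residual adaptation

adapters = nn.ModuleList()  # grows by one adapter per incremental task

def add_task() -> None:
    """Attach a fresh adapter; only it is trained for the new task."""
    adapters.append(Adapter())

def ensemble_features(frozen_feats: torch.Tensor) -> torch.Tensor:
    """Concatenate all task subspaces over frozen pre-trained features."""
    return torch.cat([a(frozen_feats) for a in adapters], dim=-1)

add_task()                                        # task 1
feats = ensemble_features(torch.randn(4, 768))    # [4, 768 * num_tasks]
# Old-class prototypes in a new subspace would be synthesized from class
# similarities (the "prototype complement" above) rather than old data.
```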
- ProxyDet: Synthesizing Proxy Novel Classes via Classwise Mixup for Open-Vocabulary Object Detection [7.122652901894367]
Open-vocabulary object detection (OVOD) aims to recognize novel objects whose categories are not included in the training set.
We present a novel yet simple technique that improves generalization across the overall distribution of novel classes (sketched below).
arXiv Detail & Related papers (2023-12-12T13:45:56Z)
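The classwise mixup named in the ProxyDet title can be pictured as a convex mix of two base-class prototypes; the Beta-sampled ratio and prototype-level mixing below are assumptions about the general recipe, not the paper's exact formulation.

```python
import torch

# Sketch of classwise mixup for synthesizing proxy novel classes; the
# Beta-sampled ratio and prototype-level mixing are assumptions.

def classwise_mixup(proto_a: torch.Tensor, proto_b: torch.Tensor,
                    alpha: float = 1.0) -> torch.Tensor:
    """Convexly mix two base-class prototypes into a proxy-novel one."""
    lam = torch.distributions.Beta(alpha, alpha).sample()
    return lam * proto_a + (1.0 - lam) * proto_b

base_protos = torch.randn(80, 512)        # e.g., 80 base classes, D = 512
i, j = torch.randint(0, 80, (2,)).tolist()
proxy = classwise_mixup(base_protos[i], base_protos[j])
# Training the detector on such proxies covers the region between base
# classes, where unseen novel categories tend to fall.
```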
- Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration [67.69532794049445]
We find that existing methods tend to misclassify samples of new classes as base classes, which leads to poor performance on the new classes.
We propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes (sketched below).
arXiv Detail & Related papers (2023-12-08T18:24:08Z)
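The calibration summarized above can be pictured as pulling a new class's few-shot prototype toward similarity-weighted base prototypes, with no gradient updates; the mixing weight and temperature below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

# Sketch of training-free prototype calibration in the spirit of TEEN;
# the mixing weight alpha and temperature tau are assumptions.

def calibrate(new_proto: torch.Tensor, base_protos: torch.Tensor,
              alpha: float = 0.5, tau: float = 16.0) -> torch.Tensor:
    """new_proto: [D]; base_protos: [B, D]. Returns the calibrated [D]."""
    sims = F.cosine_similarity(new_proto.unsqueeze(0), base_protos)  # [B]
    weights = torch.softmax(tau * sims, dim=0)
    drift = weights @ base_protos        # similarity-weighted base mixture
    return alpha * new_proto + (1.0 - alpha) * drift  # no training step
```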
- Class-incremental Novel Class Discovery [76.35226130521758]
We study the new task of class-incremental Novel Class Discovery (class-iNCD).
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-07-18T13:49:27Z)
- Incremental Few-Shot Learning via Implanting and Compressing [13.122771115838523]
Incremental Few-Shot Learning requires a model to continually learn novel classes from only a few examples.
We propose a two-step learning strategy referred to as **Implanting** and **Compressing**.
Specifically, in the **Implanting** step, we mimic the data distribution of the novel classes with the assistance of the data-abundant base set.
In the **Compressing** step, we adapt the feature extractor to precisely represent each novel class, enhancing intra-class compactness (a sketch follows this entry).
arXiv Detail & Related papers (2022-03-19T11:04:43Z)
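A loose sketch of the two steps named above, under stated assumptions: Implanting borrows distribution statistics from the nearest data-abundant base classes, and Compressing adds an intra-class compactness objective; neither choice is claimed to be the paper's exact procedure.

```python
import torch

# Loose sketch of Implanting and Compressing; borrowing variances from
# nearest base classes and the squared-distance compactness loss are
# assumptions, not the paper's exact procedure.

def implant_distribution(novel_mean: torch.Tensor, base_means: torch.Tensor,
                         base_vars: torch.Tensor, k: int = 2) -> torch.Tensor:
    """Approximate a novel class's variance from its k nearest base classes."""
    dists = torch.cdist(novel_mean.unsqueeze(0), base_means).squeeze(0)  # [B]
    nearest = dists.topk(k, largest=False).indices
    return base_vars[nearest].mean(dim=0)   # borrowed per-dimension variance

def compactness_loss(novel_feats: torch.Tensor) -> torch.Tensor:
    """Pull each novel-class feature toward its class mean (Compressing)."""
    center = novel_feats.mean(dim=0, keepdim=True)
    return ((novel_feats - center) ** 2).sum(dim=1).mean()
```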
- Forward Compatible Few-Shot Class-Incremental Learning [71.2459746681805]
A machine learning model should recognize new classes without forgetting old ones.
Current methods handle incremental learning retrospectively.
We propose ForwArd Compatible Training (FACT) for FSCIL.
arXiv Detail & Related papers (2022-03-14T09:36:35Z)
- Few-Shot Object Detection via Association and DIscrimination [83.8472428718097]
Few-shot object detection via Association and DIscrimination builds up a discriminative feature space for each novel class with two integral steps.
Experiments on the Pascal VOC and MS-COCO datasets demonstrate that FADI achieves new SOTA performance, significantly improving the baseline in any shot/split, up to +18.7.
arXiv Detail & Related papers (2021-11-23T05:04:06Z)
- Learning Adaptive Embedding Considering Incremental Class [55.21855842960139]
Class-Incremental Learning (CIL) aims to train a reliable model on streaming data in which unknown classes emerge sequentially.
Different from traditional closed-set learning, CIL has two main challenges: 1) novel class detection and 2) model update.
After novel classes are detected, the model needs to be updated without re-training on the entire previous data (the detection step is sketched below).
arXiv Detail & Related papers (2020-08-31T04:11:24Z)
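The first challenge above, novel class detection, can be illustrated as a simple distance test against known-class prototypes; the cosine criterion and threshold are assumptions, not the paper's detector.

```python
import torch
import torch.nn.functional as F

# Illustrative novel-class detection: a sample far from every known
# prototype is flagged as novel. Criterion and threshold are assumptions.

def is_novel(feat: torch.Tensor, prototypes: torch.Tensor,
             threshold: float = 0.5) -> bool:
    """feat: [D]; prototypes: [K, D] for the K known classes."""
    sims = F.cosine_similarity(feat.unsqueeze(0), prototypes)  # [K]
    return bool(sims.max() < threshold)  # no known class is close enough
```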