Forward Compatible Few-Shot Class-Incremental Learning
- URL: http://arxiv.org/abs/2203.06953v1
- Date: Mon, 14 Mar 2022 09:36:35 GMT
- Title: Forward Compatible Few-Shot Class-Incremental Learning
- Authors: Da-Wei Zhou, Fu-Yun Wang, Han-Jia Ye, Liang Ma, Shiliang Pu, De-Chuan
Zhan
- Abstract summary: A machine learning model should recognize new classes without forgetting old ones.
Current methods handle incremental learning retrospectively.
We propose ForwArd Compatible Training (FACT) for FSCIL.
- Score: 71.2459746681805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Novel classes frequently arise in our dynamically changing world, e.g., new
users in an authentication system, and a machine learning model should
recognize new classes without forgetting old ones. This scenario becomes more
challenging when new class instances are insufficient, which is called few-shot
class-incremental learning (FSCIL). Current methods handle incremental learning
retrospectively by making the updated model similar to the old one. By
contrast, we suggest learning prospectively to prepare for future updates, and
propose ForwArd Compatible Training (FACT) for FSCIL. Forward compatibility
requires future new classes to be easily incorporated into the current model
based on the current stage data, and we seek to realize it by reserving
embedding space for future new classes. In detail, we assign virtual prototypes
to squeeze the embedding of known classes and reserve for new ones. Besides, we
forecast possible new classes and prepare for the updating process. The virtual
prototypes allow the model to accept possible future updates and act as
proxies scattered across the embedding space to build a stronger classifier
during inference. FACT efficiently incorporates new classes with forward
compatibility while resisting forgetting of old ones. Extensive
experiments validate FACT's state-of-the-art performance. Code is available at:
https://github.com/zhoudw-zdw/CVPR22-Fact
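To make the virtual-prototype idea concrete, below is a minimal sketch (assuming PyTorch) of a cosine classifier that keeps real prototypes for the known classes plus extra virtual prototypes that reserve embedding space for future ones. The class and function names, the pseudo-label rule, and the loss weights are illustrative assumptions, not the paper's implementation; see the repository above for the official code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ForwardCompatibleHead(nn.Module):
    """Cosine classifier with real prototypes for known classes plus
    'virtual' prototypes that reserve embedding space for future classes.
    (Illustrative sketch, not the official FACT implementation.)"""

    def __init__(self, feat_dim: int, num_known: int, num_virtual: int):
        super().__init__()
        self.num_known = num_known
        self.known = nn.Parameter(0.02 * torch.randn(num_known, feat_dim))
        self.virtual = nn.Parameter(0.02 * torch.randn(num_virtual, feat_dim))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between L2-normalized features and all prototypes.
        protos = torch.cat([self.known, self.virtual], dim=0)
        return F.normalize(feats, dim=-1) @ F.normalize(protos, dim=-1).t()

def fact_style_loss(logits, labels, num_known, temperature=16.0, reserve_weight=0.1):
    """Cross-entropy over known classes plus a simplified 'reserve' term that
    also pulls each instance toward its nearest virtual prototype, so the
    reserved slots squeeze the known-class embeddings and stay usable later.
    (temperature, reserve_weight, and the pseudo-label rule are assumptions.)"""
    logits = temperature * logits
    ce = F.cross_entropy(logits[:, :num_known], labels)
    virtual_logits = logits[:, num_known:]
    pseudo = virtual_logits.detach().argmax(dim=-1)  # forecast a plausible "new" class
    return ce + reserve_weight * F.cross_entropy(virtual_logits, pseudo)

# Usage: 60 base classes, 40 slots reserved for future sessions.
head = ForwardCompatibleHead(feat_dim=512, num_known=60, num_virtual=40)
feats = torch.randn(8, 512)               # stand-in for backbone features
labels = torch.randint(0, 60, (8,))
loss = fact_style_loss(head(feats), labels, num_known=60)
loss.backward()
```

When a real incremental session arrives, a reserved slot can be initialized from the mean embedding of the few new shots, which is what lets the model absorb new classes without distorting old decision boundaries.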
Related papers
- Class-Incremental Learning with CLIP: Adaptive Representation Adjustment and Parameter Fusion [10.322832012497722]
Class-incremental learning is a challenging problem, where the goal is to train a model that can classify data from an increasing number of classes over time.
Vision-language pre-trained models such as CLIP demonstrate good generalization ability.
However, further adaptation to downstream tasks by simply fine-tuning the model leads to severe forgetting.
Most existing works with pre-trained models assume that the forgetting of old classes is uniform when the model acquires new knowledge.
arXiv Detail & Related papers (2024-07-19T09:20:33Z)
- Organizing Background to Explore Latent Classes for Incremental Few-shot Semantic Segmentation [7.570798966278471]
Incremental Few-shot Semantic Segmentation (iFSS) extends pre-trained segmentation models to new classes via a few annotated images.
We propose a network called OINet, i.e., the background embedding space Organization and prototype Inherit Network.
arXiv Detail & Related papers (2024-05-29T23:22:12Z)
- Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning [65.57123249246358]
We propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL.
We train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces.
Our prototype complement strategy synthesizes old classes' new features without using any old class instance.
arXiv Detail & Related papers (2024-03-18T17:58:13Z)
- Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration [67.69532794049445]
We find that existing methods tend to misclassify samples of new classes into base classes, which leads to poor performance on the new classes.
We propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes (a simplified sketch of this calibration appears after this list).
arXiv Detail & Related papers (2023-12-08T18:24:08Z)
- Class-Incremental Learning: A Survey [84.30083092434938]
Class-Incremental Learning (CIL) enables the learner to incorporate the knowledge of new classes incrementally.
As new classes arrive, CIL tends to catastrophically forget the characteristics of former classes, and its performance drastically degrades.
We provide a rigorous and unified evaluation of 17 methods in benchmark image classification tasks to find out the characteristics of different algorithms.
arXiv Detail & Related papers (2023-02-07T17:59:05Z)
- Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks [59.12108527904171]
A model should recognize new classes and maintain discriminability over old classes.
The task of recognizing few-shot new classes without forgetting old classes is called few-shot class-incremental learning (FSCIL).
We propose a new paradigm for FSCIL based on meta-learning by LearnIng Multi-phase Incremental Tasks (LIMIT).
arXiv Detail & Related papers (2022-03-31T13:46:41Z)
- Learning Adaptive Embedding Considering Incremental Class [55.21855842960139]
Class-Incremental Learning (CIL) aims to train a reliable model on streaming data in which unknown classes emerge sequentially.
Different from traditional closed-set learning, CIL has two main challenges: 1) novel class detection, and 2) model expansion.
After the novel classes are detected, the model needs to be updated without re-training on the entire previous data.
arXiv Detail & Related papers (2020-08-31T04:11:24Z)
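Several of the entries above (FACT, TEEN, LIMIT) share a nearest-prototype view of FSCIL: each new class is represented by the mean embedding of its few shots, and test samples are assigned to the most similar prototype. The sketch below, assuming PyTorch, illustrates that shared recipe together with a TEEN-style calibration step as described in its summary; the function names and the fusion weight alpha are assumptions for the example, not any paper's exact procedure.

```python
import torch
import torch.nn.functional as F

def class_prototypes(feats, labels, num_classes):
    """Mean embedding per class from a few labeled shots
    (assumes every class has at least one shot)."""
    protos = torch.stack([feats[labels == c].mean(dim=0) for c in range(num_classes)])
    return F.normalize(protos, dim=-1)

def calibrate(new_protos, base_protos, alpha=0.5, tau=16.0):
    """TEEN-style, training-free calibration (simplified): pull each new-class
    prototype toward base prototypes weighted by similarity, countering the
    bias of misclassifying new-class samples into base classes."""
    weights = F.softmax(tau * new_protos @ base_protos.t(), dim=-1)
    fused = alpha * new_protos + (1.0 - alpha) * weights @ base_protos
    return F.normalize(fused, dim=-1)

def predict(feats, protos):
    """Nearest-prototype classification by cosine similarity."""
    return (F.normalize(feats, dim=-1) @ protos.t()).argmax(dim=-1)

# Usage: 60 base classes (5 shots each), then a 5-way 5-shot session.
base = class_prototypes(torch.randn(300, 512), torch.arange(60).repeat(5), 60)
new = calibrate(class_prototypes(torch.randn(25, 512), torch.arange(5).repeat(5), 5), base)
preds = predict(torch.randn(10, 512), torch.cat([base, new], dim=0))
```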
This list is automatically generated from the titles and abstracts of the papers on this site.