Memorizing Complementation Network for Few-Shot Class-Incremental
Learning
- URL: http://arxiv.org/abs/2208.05610v1
- Date: Thu, 11 Aug 2022 02:32:41 GMT
- Title: Memorizing Complementation Network for Few-Shot Class-Incremental
Learning
- Authors: Zhong Ji, Zhishen Hou, Xiyao Liu, Yanwei Pang, Xuelong Li
- Abstract summary: We propose a Memorizing Complementation Network (MCNet) that ensembles multiple models so that their different memorized knowledge complements each other in novel tasks.
We develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss that pushes the novel samples away not only from each other in the current task but also from the old distribution.
- Score: 109.4206979528375
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-shot Class-Incremental Learning (FSCIL) aims at learning new concepts
continually with only a few samples, which makes it prone to catastrophic forgetting
and overfitting. The inaccessibility of old classes and the scarcity of novel samples
make it formidable to strike a trade-off between retaining old knowledge and learning
novel concepts. Inspired by the observation that different models memorize different
knowledge when learning novel concepts, we propose a Memorizing Complementation
Network (MCNet) that ensembles multiple models so that their memorized knowledge
complements each other on novel tasks. Additionally, to update the model with few
novel samples, we develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss that
pushes the novel samples away not only from each other in the current task but also
from the old distribution. Extensive experiments on three benchmark datasets, i.e.,
CIFAR100, miniImageNet and CUB200, demonstrate the superiority of the proposed method.
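
A minimal sketch of the PSHT idea, for intuition only: the abstract states that novel
embeddings are pushed away both from each other and from the old distribution, but it
does not give the exact formulation, so the margin, the hard-mining rule, the use of
old-class prototypes as extra negatives, and all names below are illustrative
assumptions rather than the authors' implementation.

```python
# Hedged sketch: a hard-mining triplet loss whose negatives include both
# in-batch samples and prototypes of previously learned classes.
# All design details here are assumptions; this is NOT the PSHT loss as
# defined in the paper.
import torch
import torch.nn.functional as F


def prototype_triplet_loss(embeddings, labels, old_prototypes, margin=0.5):
    """embeddings: (N, D) features of the current few-shot batch
    labels: (N,) class ids; old_prototypes: (C, D) mean features of old classes."""
    dist = torch.cdist(embeddings, embeddings)                 # (N, N) pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)          # (N, N) same-class mask

    # Hardest positive: farthest sample sharing the anchor's class.
    pos_dist = (dist * same.float()).max(dim=1).values

    # Hardest in-batch negative: closest sample from a different novel class.
    neg_batch = dist.masked_fill(same, float("inf")).min(dim=1).values

    # Hardest old-class negative: closest prototype of the old distribution.
    neg_proto = torch.cdist(embeddings, old_prototypes).min(dim=1).values

    # Negatives are drawn from both the current task and the old distribution.
    neg_dist = torch.minimum(neg_batch, neg_proto)
    return F.relu(pos_dist - neg_dist + margin).mean()
```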
Related papers
- Adaptive Discovering and Merging for Incremental Novel Class Discovery [37.54881512688964]
We introduce a new paradigm called Adaptive Discovering and Merging (ADM) to discover novel categories adaptively in the incremental stage.
Our AMM also benefits the class-incremental learning (class-IL) task by alleviating the catastrophic forgetting problem.
arXiv Detail & Related papers (2024-03-06T00:17:03Z)
- Learning Prompt with Distribution-Based Feature Replay for Few-Shot Class-Incremental Learning [56.29097276129473]
We propose a simple yet effective framework, named Learning Prompt with Distribution-based Feature Replay (LP-DiF).
To prevent the learnable prompt from forgetting old knowledge in the new session, we propose a pseudo-feature replay approach.
When progressing to a new session, pseudo-features are sampled from old-class distributions and combined with training images of the current session to optimize the prompt (a hedged sketch of this replay step appears after this list).
arXiv Detail & Related papers (2024-01-03T07:59:17Z)
- TLCE: Transfer-Learning Based Classifier Ensembles for Few-Shot Class-Incremental Learning [5.753740302289126]
Few-shot class-incremental learning (FSCIL) struggles to incrementally recognize novel classes from few examples.
We propose TLCE, which ensembles multiple pre-trained models to improve separation of novel and old classes.
arXiv Detail & Related papers (2023-12-07T11:16:00Z)
- MINI: Mining Implicit Novel Instances for Few-Shot Object Detection [73.5061386065382]
Mining Implicit Novel Instances (MINI) is a novel framework to mine implicit novel instances as auxiliary training samples.
MINI achieves new state-of-the-art performance across all shot and split settings.
arXiv Detail & Related papers (2022-05-06T17:26:48Z)
- Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks [59.12108527904171]
A model should recognize new classes and maintain discriminability over old classes.
The task of recognizing few-shot new classes without forgetting old classes is called few-shot class-incremental learning (FSCIL).
We propose a new paradigm for FSCIL based on meta-learning by LearnIng Multi-phase Incremental Tasks (LIMIT).
arXiv Detail & Related papers (2022-03-31T13:46:41Z)
- Online Deep Metric Learning via Mutual Distillation [9.363111089877625]
Deep metric learning aims to transform input data into an embedding space, where similar samples are close while dissimilar samples are far apart from each other.
Existing solutions either retrain the model from scratch or require the replay of old samples during the training.
This paper proposes a complete online deep metric learning framework based on mutual distillation for both one-task and multi-task scenarios.
arXiv Detail & Related papers (2022-03-10T07:24:36Z)
- Unsupervised Transfer Learning for Spatiotemporal Predictive Networks [90.67309545798224]
We study how to transfer knowledge from a zoo of models learned without supervision to another network.
Our motivation is that models are expected to understand complex dynamics from different sources.
Our approach yields significant improvements on three benchmarks for spatiotemporal prediction, and benefits the target network even from less relevant source models.
arXiv Detail & Related papers (2020-09-24T15:40:55Z)
- Few-Shot Class-Incremental Learning [68.75462849428196]
We focus on a challenging but practical few-shot class-incremental learning (FSCIL) problem.
FSCIL requires CNN models to incrementally learn new classes from very few labelled samples, without forgetting the previously learned ones.
We represent the knowledge using a neural gas (NG) network, which can learn and preserve the topology of the feature manifold formed by different classes.
arXiv Detail & Related papers (2020-04-23T03:38:33Z)
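
The distribution-based pseudo-feature replay summarized in the LP-DiF entry above can be
sketched as follows. The summary only says that pseudo-features are sampled from old-class
distributions and mixed with current-session data to optimize the prompt; the
diagonal-Gaussian class model, the sampling counts, the cross-entropy objective, and all
names below are illustrative assumptions, not the LP-DiF implementation.

```python
# Hedged sketch: replaying pseudo-features sampled from stored per-class
# statistics alongside current-session features. Everything here is an
# assumption made for illustration.
import torch
import torch.nn.functional as F


def sample_pseudo_features(class_stats, n_per_class=16):
    """class_stats: dict {class_id: (mean (D,), std (D,))} estimated in old sessions."""
    feats, labels = [], []
    for cls, (mean, std) in class_stats.items():
        feats.append(mean + std * torch.randn(n_per_class, mean.shape[0]))
        labels.append(torch.full((n_per_class,), cls, dtype=torch.long))
    return torch.cat(feats), torch.cat(labels)


def replay_step(classifier, new_feats, new_labels, class_stats):
    """One optimization step mixing current-session features with replayed ones."""
    old_feats, old_labels = sample_pseudo_features(class_stats)
    feats = torch.cat([new_feats, old_feats])
    labels = torch.cat([new_labels, old_labels])
    logits = classifier(feats)   # any module mapping features to class logits
    return F.cross_entropy(logits, labels)
```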