FeTrIL: Feature Translation for Exemplar-Free Class-Incremental Learning
- URL: http://arxiv.org/abs/2211.13131v2
- Date: Tue, 28 Nov 2023 15:41:46 GMT
- Title: FeTrIL: Feature Translation for Exemplar-Free Class-Incremental Learning
- Authors: Grégoire Petit, Adrian Popescu, Hugo Schindler, David Picard,
Bertrand Delezoide
- Abstract summary: A balance between stability and plasticity of the incremental process is needed to obtain good accuracy for past as well as new classes.
Existing exemplar-free class-incremental methods focus either on successive fine-tuning of the model, thus favoring plasticity, or on using a feature extractor fixed after the initial incremental state, thus favoring stability.
We introduce a method which combines a fixed feature extractor and a pseudo-feature generator to improve the stability-plasticity balance.
- Score: 40.74872446895684
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Exemplar-free class-incremental learning is very challenging due to the
negative effect of catastrophic forgetting. A balance between stability and
plasticity of the incremental process is needed in order to obtain good
accuracy for past as well as new classes. Existing exemplar-free
class-incremental methods focus either on successive fine-tuning of the model,
thus favoring plasticity, or on using a feature extractor fixed after the
initial incremental state, thus favoring stability. We introduce a method which
combines a fixed feature extractor and a pseudo-feature generator to improve
the stability-plasticity balance. The generator uses a simple yet effective
geometric translation of new class features to create representations of past
classes, made of pseudo-features. Translating the features requires storing only
the centroid representations of past classes to produce their
pseudo-features. Actual features of new classes and pseudo-features of past
classes are fed into a linear classifier which is trained incrementally to
discriminate between all classes. The incremental process is much faster with
the proposed method than with mainstream ones, which update the entire deep
model. Experiments are performed with three challenging datasets and different
incremental settings. A comparison with ten existing methods shows that our
method outperforms the others in most cases.
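As a concrete illustration of the translation step, the sketch below shifts the actual features of a new class by the difference between a stored past-class centroid and the new-class centroid. This is a minimal numpy sketch with illustrative names and dimensions, not the authors' implementation:

    import numpy as np

    def pseudo_features(new_feats, new_centroid, past_centroid):
        # Translate new-class features so they are centered on a past-class
        # centroid: f_pseudo = f + mu_past - mu_new. Only the past-class
        # centroid needs to be stored, as described in the abstract.
        return new_feats + (past_centroid - new_centroid)

    rng = np.random.default_rng(0)
    new_feats = rng.normal(size=(100, 512))    # actual features of a new class
    new_centroid = new_feats.mean(axis=0)      # centroid of the new class
    past_centroid = rng.normal(size=512)       # stored centroid of a past class
    past_pseudo = pseudo_features(new_feats, new_centroid, past_centroid)
    # past_pseudo keeps the within-class geometry of the new class but sits at
    # the location of the past class; together with real new-class features it
    # can train the incremental linear classifier.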
Related papers
- Efficient Non-Exemplar Class-Incremental Learning with Retrospective Feature Synthesis [21.348252135252412]
Current Non-Exemplar Class-Incremental Learning (NECIL) methods mitigate forgetting by storing a single prototype per class.
We propose a more efficient NECIL method that replaces prototypes with synthesized retrospective features for old classes.
Our method significantly improves the efficiency of non-exemplar class-incremental learning and achieves state-of-the-art performance.
arXiv Detail & Related papers (2024-11-03T07:19:11Z)
- Early Preparation Pays Off: New Classifier Pre-tuning for Class Incremental Semantic Segmentation [13.62129805799111]
Class incremental semantic segmentation aims to preserve old knowledge while learning new tasks.
It is impeded by catastrophic forgetting and background shift issues.
We propose a new classifier pre-tuning (NeST) method applied before the formal training process.
arXiv Detail & Related papers (2024-07-19T09:19:29Z)
- PASS++: A Dual Bias Reduction Framework for Non-Exemplar Class-Incremental Learning [49.240408681098906]
Class-incremental learning (CIL) aims to recognize new classes incrementally while maintaining the discriminability of old classes.
Most existing CIL methods are exemplar-based, i.e., they store part of the old data for retraining.
We present a simple and novel dual bias reduction framework that employs self-supervised transformation (SST) in input space and prototype augmentation (protoAug) in deep feature space.
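As a rough sketch of the protoAug side (assuming pseudo-features are drawn as Gaussian perturbations around stored prototypes; the noise scale and sampling scheme here are illustrative assumptions, not the paper's exact recipe):

    import numpy as np

    def proto_aug(prototypes, n_per_class, scale, rng):
        # Sample pseudo-features around each stored old-class prototype.
        # Gaussian noise with a single shared `scale` is an assumed choice.
        feats, labels = [], []
        for label, proto in enumerate(prototypes):
            noise = scale * rng.normal(size=(n_per_class, proto.shape[0]))
            feats.append(proto + noise)
            labels.append(np.full(n_per_class, label))
        return np.concatenate(feats), np.concatenate(labels)

    rng = np.random.default_rng(0)
    old_prototypes = rng.normal(size=(5, 512))  # one stored prototype per old class
    pseudo_x, pseudo_y = proto_aug(old_prototypes, n_per_class=20, scale=0.1, rng=rng)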
arXiv Detail & Related papers (2024-07-19T05:03:16Z)
- Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration [67.69532794049445]
We find that existing methods tend to misclassify samples of new classes as base classes, which leads to poor performance on the new classes.
We propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes.
arXiv Detail & Related papers (2023-12-08T18:24:08Z)
- FeCAM: Exploiting the Heterogeneity of Class Distributions in Exemplar-Free Continual Learning [21.088762527081883]
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits the rehearsal of data from previous tasks.
Recent approaches that incrementally learn the classifier while freezing the feature extractor after the first task have gained much attention.
We explore prototypical networks for CIL, which generate new class prototypes using the frozen feature extractor and classify the features based on the Euclidean distance to the prototypes.
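The prototypical baseline described here reduces to nearest-class-mean classification with a frozen extractor; a minimal sketch using the Euclidean distance mentioned in the summary (names are illustrative):

    import numpy as np

    def ncm_predict(features, prototypes):
        # Assign each feature to the class whose prototype is nearest in
        # Euclidean distance; prototypes come from the frozen extractor.
        d = np.linalg.norm(features[:, None, :] - prototypes[None, :, :], axis=-1)
        return d.argmin(axis=1)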
arXiv Detail & Related papers (2023-09-25T11:54:33Z)
- Non-exemplar Class-incremental Learning by Random Auxiliary Classes Augmentation and Mixed Features [37.51376572211081]
Non-exemplar class-incremental learning refers to classifying new and old classes without storing samples of old classes.
We propose an effective non-exemplar method called RAMF, consisting of Random Auxiliary classes augmentation and Mixed Features.
arXiv Detail & Related papers (2023-04-16T06:33:43Z)
- On the Stability-Plasticity Dilemma of Class-Incremental Learning [50.863180812727244]
A primary goal of class-incremental learning is to strike a balance between stability and plasticity.
This paper aims to shed light on how effectively recent class-incremental learning algorithms address the stability-plasticity trade-off.
arXiv Detail & Related papers (2023-04-04T09:34:14Z)
- PlaStIL: Plastic and Stable Memory-Free Class-Incremental Learning [49.0417577439298]
Plasticity and stability are needed in class-incremental learning in order to learn from new data while preserving past knowledge.
We propose a method which has a similar number of parameters but distributes them differently to find a better balance between plasticity and stability.
arXiv Detail & Related papers (2022-09-14T12:53:00Z)
- Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning [81.10531943939365]
Few-shot class-incremental learning aims to recognize new classes given few samples without forgetting the old classes.
We propose a novel incremental prototype learning scheme that adapts the feature representation to various generated incremental episodes.
Experiments on three benchmark datasets demonstrate above-par incremental performance, outperforming state-of-the-art methods by margins of 13%, 17%, and 11%, respectively.
arXiv Detail & Related papers (2021-07-19T14:31:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.