Self-Sustaining Representation Expansion for Non-Exemplar
Class-Incremental Learning
- URL: http://arxiv.org/abs/2203.06359v2
- Date: Wed, 16 Mar 2022 07:02:32 GMT
- Title: Self-Sustaining Representation Expansion for Non-Exemplar
Class-Incremental Learning
- Authors: Kai Zhu, Wei Zhai, Yang Cao, Jiebo Luo, Zheng-Jun Zha
- Abstract summary: Non-exemplar class-incremental learning aims to recognize both the old and new classes when old class samples cannot be stored.
Our scheme consists of a structure reorganization strategy that fuses main-branch expansion and side-branch updating to maintain the old features.
A prototype selection mechanism is proposed to enhance the discrimination between the old and new classes by selectively incorporating new samples into the distillation process.
- Score: 138.35405462309456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-exemplar class-incremental learning aims to recognize both the
old and new classes when old class samples cannot be saved. It is a challenging
task since
representation optimization and feature retention can only be achieved under
supervision from new classes. To address this problem, we propose a novel
self-sustaining representation expansion scheme. Our scheme consists of a
structure reorganization strategy that fuses main-branch expansion and
side-branch updating to maintain the old features, and a main-branch
distillation scheme to transfer the invariant knowledge. Furthermore, a
prototype selection mechanism is proposed to enhance the discrimination between
the old and new classes by selectively incorporating new samples into the
distillation process. Extensive experiments on three benchmarks demonstrate
significant incremental performance, outperforming the state-of-the-art methods
by a margin of 3%, 3% and 6%, respectively.
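To make the moving parts concrete, here is a minimal sketch of how the main-branch distillation and prototype selection described above could interact. It is a plausible reading of the abstract rather than the authors' implementation: `old_net` (a frozen copy of the previous backbone), `new_net`, `old_prototypes`, and the threshold `tau` are all illustrative names.

```python
import torch
import torch.nn.functional as F

def selective_distillation_loss(new_net, old_net, x, old_prototypes, tau=0.5):
    """Main-branch distillation with prototype selection (hypothetical sketch).

    Only samples whose frozen-model features lie close to some old-class
    prototype (cosine similarity above tau) take part in distillation, so
    samples that clearly belong to new classes do not constrain the
    old representation.
    """
    with torch.no_grad():
        f_old = F.normalize(old_net(x), dim=1)           # frozen previous features
    f_new = F.normalize(new_net(x), dim=1)               # current trainable features

    protos = F.normalize(old_prototypes, dim=1)          # (num_old_classes, d)
    sim_to_old = (f_old @ protos.t()).max(dim=1).values  # nearest-prototype similarity

    keep = sim_to_old > tau                              # prototype selection mask
    if not keep.any():
        return f_new.sum() * 0.0                         # differentiable zero
    # Cosine feature distillation restricted to the selected subset.
    return (1.0 - (f_new[keep] * f_old[keep]).sum(dim=1)).mean()
```

The selection mask keeps distillation focused on samples that still resemble old classes, which is one way to read "selectively incorporating new samples into the distillation process."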
Related papers
- Efficient Non-Exemplar Class-Incremental Learning with Retrospective Feature Synthesis [21.348252135252412]
Current Non-Exemplar Class-Incremental Learning (NECIL) methods mitigate forgetting by storing a single prototype per class.
We propose a more efficient NECIL method that replaces prototypes with synthesized retrospective features for old classes.
Our method significantly improves the efficiency of non-exemplar class-incremental learning and achieves state-of-the-art performance.
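As a rough, hypothetical illustration of feature-level replay in NECIL (the general idea, not this paper's specific synthesis procedure), old-class supervision can be recreated by sampling pseudo-features around stored per-class statistics; `prototypes` and `stds` below are assumed to be saved at the end of each task.

```python
import torch

def synthesize_old_features(prototypes, stds, n_per_class):
    """Sample pseudo-features for old classes from per-class Gaussians.

    prototypes: (C_old, d) class means saved after earlier tasks.
    stds:       (C_old, d) per-class feature standard deviations.
    Returns (C_old * n_per_class, d) features with matching labels.
    """
    C, d = prototypes.shape
    noise = torch.randn(C, n_per_class, d)
    feats = prototypes.unsqueeze(1) + noise * stds.unsqueeze(1)
    labels = torch.arange(C).repeat_interleave(n_per_class)
    return feats.reshape(-1, d), labels
```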
arXiv Detail & Related papers (2024-11-03T07:19:11Z)
- Strike a Balance in Continual Panoptic Segmentation [60.26892488010291]
We introduce past-class backtrace distillation to balance the stability of existing knowledge with the adaptability to new information.
We also introduce a class-proportional memory strategy, which aligns the class distribution in the replay sample set with that of the historical training data.
We present a new method named Balanced Continual Panoptic Segmentation (BalConpas).
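A class-proportional replay rule is simple to sketch. The snippet below is a hypothetical rendering of the memory strategy described above, not the BalConpas code; `history_counts` and `budget` are illustrative names.

```python
import random
from collections import defaultdict

def class_proportional_sample(samples, history_counts, budget):
    """Pick a replay set whose class mix matches the historical distribution.

    samples:        list of (item, class_id) candidates available for replay.
    history_counts: {class_id: number of examples seen so far in training}.
    budget:         total replay-set size.
    """
    by_class = defaultdict(list)
    for item, c in samples:
        by_class[c].append(item)

    total = sum(history_counts.values())
    replay = []
    for c, pool in by_class.items():
        quota = round(budget * history_counts.get(c, 0) / total)
        replay.extend(random.sample(pool, min(quota, len(pool))))
    return replay
```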
arXiv Detail & Related papers (2024-07-23T09:58:20Z)
- Tendency-driven Mutual Exclusivity for Weakly Supervised Incremental Semantic Segmentation [56.1776710527814]
Weakly Incremental Learning for Semantic Segmentation (WILSS) leverages a pre-trained segmentation model to segment new classes using cost-effective and readily available image-level labels.
A prevailing way to solve WILSS is the generation of seed areas for each new class, serving as a form of pixel-level supervision.
We propose an innovative, tendency-driven relationship of mutual exclusivity, meticulously tailored to govern the behavior of the seed areas.
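The exclusivity constraint itself is easy to picture in code. The sketch below is a hypothetical reduction of the idea (not the paper's tendency-driven formulation): each pixel may belong to at most one class's seed area, namely the class that scores highest there.

```python
import torch

def mutually_exclusive_seeds(score_maps, threshold=0.5):
    """score_maps: (C, H, W) per-class seed scores in [0, 1].

    A pixel joins class c's seed area only if c scores highest there and
    clears the threshold, so seeds of different classes never overlap.
    """
    best_score, best_class = score_maps.max(dim=0)   # (H, W) each
    keep = best_score > threshold
    seeds = torch.zeros_like(score_maps, dtype=torch.bool)
    ys, xs = keep.nonzero(as_tuple=True)
    seeds[best_class[ys, xs], ys, xs] = True
    return seeds
```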
arXiv Detail & Related papers (2024-04-18T08:23:24Z)
- CEAT: Continual Expansion and Absorption Transformer for Non-Exemplar Class-Incremental Learning [34.59310641291726]
In real-world applications, dynamic scenarios require the models to possess the capability to learn new tasks continuously without forgetting the old knowledge.
We propose a new architecture named the Continual Expansion and Absorption Transformer (CEAT).
The model learns novel knowledge by extending expanded-fusion layers in parallel with the frozen previous parameters.
To improve the learning ability of the model, we design a novel prototype contrastive loss that reduces the overlap between old and new classes in the feature space.
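A prototype contrastive loss of the kind described can be sketched as follows; this is a hypothetical simplification rather than the CEAT implementation, with `temperature` as an assumed hyperparameter.

```python
import torch
import torch.nn.functional as F

def prototype_contrastive_loss(features, labels, prototypes, temperature=0.1):
    """features:   (B, d) embeddings of the current batch.
    labels:     (B,) class ids indexing rows of `prototypes`.
    prototypes: (C, d) one prototype per class, old and new.

    Cross-entropy over prototype similarities pulls each feature toward
    its own class prototype and pushes it from the rest, shrinking the
    overlap between old and new classes in feature space.
    """
    f = F.normalize(features, dim=1)
    p = F.normalize(prototypes, dim=1)
    logits = f @ p.t() / temperature   # (B, C) scaled cosine similarities
    return F.cross_entropy(logits, labels)
```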
arXiv Detail & Related papers (2024-03-11T12:40:12Z)
- Non-exemplar Class-incremental Learning by Random Auxiliary Classes Augmentation and Mixed Features [37.51376572211081]
Non-exemplar class-incremental learning refers to classifying new and old classes without storing samples of old classes.
We propose an effective non-exemplar method called RAMF, consisting of Random Auxiliary classes augmentation and Mixed Features.
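Feature mixing of this flavor is often a one-liner. The sketch below is a hypothetical, manifold-mixup-style illustration and not necessarily RAMF's exact recipe.

```python
import torch

def mix_features(feats_a, feats_b, alpha=0.2):
    """Convexly combine two batches of features (manifold-mixup style).

    Returns the mixed features and the mixing coefficient, which can
    also be used to interpolate the corresponding one-hot targets.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    return lam * feats_a + (1 - lam) * feats_b, lam
```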
arXiv Detail & Related papers (2023-04-16T06:33:43Z)
- Self-Paced Imbalance Rectification for Class Incremental Learning [6.966383162917331]
We propose a self-paced imbalance rectification scheme, which dynamically maintains the incremental balance during the representation learning phase.
Experiments on three benchmarks demonstrate stable incremental performance, significantly outperforming the state-of-the-art methods.
arXiv Detail & Related papers (2022-02-08T07:58:13Z)
- Weakly Supervised Semantic Segmentation via Alternative Self-Dual Teaching [82.71578668091914]
This paper establishes a compact learning framework that embeds the classification and mask-refinement components into a unified deep model.
We propose a novel alternative self-dual teaching (ASDT) mechanism to encourage high-quality knowledge interaction.
arXiv Detail & Related papers (2021-12-17T11:56:56Z)
- Long-tail Recognition via Compositional Knowledge Transfer [60.03764547406601]
We introduce a novel strategy for long-tail recognition that addresses the tail classes' few-shot problem.
Our objective is to transfer knowledge acquired from information-rich common classes to semantically similar, and yet data-hungry, rare classes.
Experiments show that our approach can achieve significant performance boosts on rare classes while maintaining robust common class performance.
arXiv Detail & Related papers (2021-12-13T15:48:59Z)
- Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning [81.10531943939365]
Few-shot class-incremental learning aims to recognize new classes from only a few samples without forgetting the old classes.
We propose a novel incremental prototype learning scheme that adapts the feature representation to various generated incremental episodes.
Experiments on three benchmark datasets demonstrate above-par incremental performance, outperforming state-of-the-art methods by a margin of 13%, 17% and 11%, respectively.
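Generated incremental episodes can be simulated from base-class data alone; the sketch below is a hypothetical version of such episode sampling (names like `data_by_class` are illustrative), not the authors' code.

```python
import random

def generate_incremental_episode(data_by_class, n_way=5, k_shot=5):
    """Sample a pseudo incremental session from base-class data.

    data_by_class: {class_id: list of samples} for the base classes.
    Picks n_way classes and k_shot samples each, mimicking the few-shot
    sessions the model will face later so the representation can adapt
    to them in advance.
    """
    classes = random.sample(sorted(data_by_class), n_way)
    return {c: random.sample(data_by_class[c], k_shot) for c in classes}
```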
arXiv Detail & Related papers (2021-07-19T14:31:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.