Discriminative Distillation to Reduce Class Confusion in Continual
Learning
- URL: http://arxiv.org/abs/2108.05187v1
- Date: Wed, 11 Aug 2021 12:46:43 GMT
- Title: Discriminative Distillation to Reduce Class Confusion in Continual
Learning
- Authors: Changhong Zhong, Zhiying Cui, Ruixuan Wang, and Wei-Shi Zheng
- Abstract summary: Class confusion may play a role in downgrading the classification performance during continual learning.
We propose a discriminative distillation strategy to help the classifier learn the discriminative features between confusing classes.
- Score: 57.715862676788156
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Successful continual learning of new knowledge would enable intelligent
systems to recognize more and more classes of objects. However, current
intelligent systems often fail to correctly recognize previously learned
classes of objects when updated to learn new classes. It is widely believed
that such downgraded performance is solely due to the catastrophic forgetting
of previously learned knowledge. In this study, we argue that the class
confusion phenomenon may also play a role in downgrading the classification
performance during continual learning, i.e., the high similarity between new
classes and any previously learned classes would also cause the classifier to
make mistakes in recognizing these old classes, even if the knowledge of these
old classes is not forgotten. To alleviate the class confusion issue, we
propose a discriminative distillation strategy to help the classifier learn
the discriminative features between confusing classes during continual
learning. Experiments on multiple natural image classification tasks support
that the proposed distillation strategy, when combined with existing methods,
is effective in further improving continual learning.
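The abstract does not include code, but the general idea can be illustrated with a short sketch. The following is a hypothetical PyTorch example, not the authors' implementation: the choice of picking "confusable" old classes by cosine similarity of classifier weights, and restricting the knowledge-distillation term to those classes, are assumptions made here for illustration, and names such as `old_model`, `new_model`, and `confusing_ids` are invented for this sketch.

```python
# Illustrative sketch only (assumed, not the authors' released code):
# distill the old model's logits, but only over old classes that are
# most likely to be confused with the newly added classes.
import torch
import torch.nn.functional as F

def confusing_class_ids(class_weights_old, class_weights_new, top_k=5):
    """Pick old classes whose classifier weights are most similar to any new class."""
    sim = F.cosine_similarity(
        class_weights_old.unsqueeze(1),   # (n_old, 1, d)
        class_weights_new.unsqueeze(0),   # (1, n_new, d)
        dim=-1,
    )                                     # (n_old, n_new) similarity matrix
    scores, _ = sim.max(dim=1)            # best new-class match for each old class
    return scores.topk(top_k).indices     # the most "confusable" old classes

def distillation_step(x, y, old_model, new_model, confusing_ids, T=2.0, lam=1.0):
    """Cross-entropy on current-task data plus a KD term on confusable old classes."""
    with torch.no_grad():
        old_logits = old_model(x)          # teacher logits over old classes
    new_logits = new_model(x)              # student logits over old + new classes
                                            # (old classes assumed to come first)
    ce = F.cross_entropy(new_logits, y)

    # Distill only the logits of the confusable old classes, softened by temperature T.
    t = F.softmax(old_logits[:, confusing_ids] / T, dim=1)
    s = F.log_softmax(new_logits[:, confusing_ids] / T, dim=1)
    kd = F.kl_div(s, t, reduction="batchmean") * (T * T)

    return ce + lam * kd
```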
Related papers
- Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration [67.69532794049445]
We find a tendency for existing methods to misclassify samples of new classes into base classes, which leads to poor performance on new classes.
We propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes.
arXiv Detail & Related papers (2023-12-08T18:24:08Z) - Class-Incremental Learning: A Survey [84.30083092434938]
Class-Incremental Learning (CIL) enables the learner to incorporate the knowledge of new classes incrementally.
CIL models tend to catastrophically forget the characteristics of former classes, and their performance drastically degrades.
We provide a rigorous and unified evaluation of 17 methods in benchmark image classification tasks to find out the characteristics of different algorithms.
arXiv Detail & Related papers (2023-02-07T17:59:05Z) - Multi-Granularity Regularized Re-Balancing for Class Incremental
Learning [32.52884416761171]
Deep learning models suffer from catastrophic forgetting when learning new tasks.
Data imbalance between old and new classes is a key issue that leads to performance degradation of the model.
We propose an assumption-agnostic method, Multi-Granularity Regularized re-Balancing, to address this problem.
arXiv Detail & Related papers (2022-06-30T11:04:51Z) - Continual Learning with Bayesian Model based on a Fixed Pre-trained
Feature Extractor [55.9023096444383]
Current deep learning models are characterised by catastrophic forgetting of old knowledge when learning new classes.
Inspired by the process of learning new knowledge in human brains, we propose a Bayesian generative model for continual learning.
arXiv Detail & Related papers (2022-04-28T08:41:51Z) - Long-tail Recognition via Compositional Knowledge Transfer [60.03764547406601]
We introduce a novel strategy for long-tail recognition that addresses the tail classes' few-shot problem.
Our objective is to transfer knowledge acquired from information-rich common classes to semantically similar, and yet data-hungry, rare classes.
Experiments show that our approach can achieve significant performance boosts on rare classes while maintaining robust common class performance.
arXiv Detail & Related papers (2021-12-13T15:48:59Z) - Preserving Earlier Knowledge in Continual Learning with the Help of All
Previous Feature Extractors [63.21036904487014]
Continual learning of new knowledge over time is one desirable capability for intelligent systems to recognize more and more classes of objects.
We propose a simple yet effective fusion mechanism by including all the previously learned feature extractors into the intelligent model (a rough sketch of this fusion idea appears after this related-papers list).
Experiments on multiple classification tasks show that the proposed approach can effectively reduce the forgetting of old knowledge, achieving state-of-the-art continual learning performance.
arXiv Detail & Related papers (2021-04-28T07:49:24Z) - Essentials for Class Incremental Learning [43.306374557919646]
Class-incremental learning results on CIFAR-100 and ImageNet improve over the state-of-the-art by a large margin, while keeping the approach simple.
arXiv Detail & Related papers (2021-02-18T18:01:06Z)