RECALL: Rehearsal-free Continual Learning for Object Classification
- URL: http://arxiv.org/abs/2209.14774v1
- Date: Thu, 29 Sep 2022 13:36:28 GMT
- Title: RECALL: Rehearsal-free Continual Learning for Object Classification
- Authors: Markus Knauer, Maximilian Denninger and Rudolph Triebel
- Abstract summary: Convolutional neural networks show remarkable results in classification but struggle with learning new things on the fly.
We present a novel rehearsal-free approach, where a deep neural network is continually learning new unseen object categories.
Our approach is called RECALL, as the network recalls categories by calculating logits for old categories before training new ones.
- Score: 24.260824374268314
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Convolutional neural networks show remarkable results in classification but
struggle with learning new things on the fly. We present a novel rehearsal-free
approach, where a deep neural network is continually learning new unseen object
categories without saving any data of prior sequences. Our approach is called
RECALL, as the network recalls categories by calculating logits for old
categories before training new ones. These recalled logits are then used
during training to keep the outputs for the old categories from changing. For
each new sequence, a new head is added to accommodate the new categories. To
mitigate forgetting, we present a regularization strategy in which the
classification loss is replaced by a regression loss.
Moreover, for the known categories, we propose a Mahalanobis loss that
incorporates the class variances to account for the changing densities between known and unknown
categories. Finally, we present a novel dataset for continual learning,
especially suited for object recognition on a mobile robot (HOWS-CL-25),
including 150,795 synthetic images of 25 household object categories. Our
approach RECALL outperforms the current state of the art on CORe50 and
iCIFAR-100 and reaches the best performance on HOWS-CL-25.
Related papers
- NC-NCD: Novel Class Discovery for Node Classification [28.308556235456766]
Novel Class Discovery (NCD) involves identifying new categories within unlabeled data by utilizing knowledge acquired from previously established categories.
Existing NCD methods often struggle to maintain a balance between the performance of old and new categories.
We introduce, for the first time, a more practical NCD scenario for node classification (NC-NCD).
We propose a novel self-training framework with prototype replay and distillation, called SWORD, adapted to our NC-NCD setting.
arXiv Detail & Related papers (2024-07-25T07:10:08Z) - CBR - Boosting Adaptive Classification By Retrieval of Encrypted Network Traffic with Out-of-distribution [9.693391036125908]
A common approach is to use machine learning or deep learning-based solutions trained on a fixed number of classes.
One way to handle unknown classes is to retrain the model; however, retraining models every time they become obsolete is both resource- and time-consuming.
In this paper, we introduce Adaptive Classification By Retrieval (CBR), a novel approach for encrypted network traffic classification.
arXiv Detail & Related papers (2024-03-17T13:14:09Z) - Fixed Random Classifier Rearrangement for Continual Learning [0.5439020425819]
In visual classification scenarios, neural networks inevitably forget the knowledge of old tasks after learning new ones.
We propose a continual learning algorithm named Fixed Random Classifier Rearrangement (FRCR).
arXiv Detail & Related papers (2024-02-23T09:43:58Z) - Category Adaptation Meets Projected Distillation in Generalized Continual Category Discovery [0.9349784561232036]
Generalized Continual Category Discovery (GCCD) tackles learning from sequentially arriving, partially labeled datasets.
We introduce a novel technique integrating a learnable projector with feature distillation, thus enhancing model adaptability without sacrificing past knowledge.
We demonstrate that while each component offers modest benefits individually, their combination, dubbed CAMP, significantly improves the balance between learning new information and retaining old knowledge.
arXiv Detail & Related papers (2023-08-23T13:02:52Z) - Dynamic Conceptional Contrastive Learning for Generalized Category Discovery [76.82327473338734]
Generalized category discovery (GCD) aims to automatically cluster partially labeled data.
Unlabeled data contain instances that are not only from known categories of the labeled data but also from novel categories.
One effective approach to GCD is applying self-supervised learning to learn discriminative representations for unlabeled data.
We propose a Dynamic Conceptional Contrastive Learning framework, which can effectively improve clustering accuracy.
arXiv Detail & Related papers (2023-03-30T14:04:39Z) - Novel Class Discovery without Forgetting [72.52222295216062]
We identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting.
We propose a machine learning model to incrementally discover novel categories of instances from unlabeled data.
We introduce experimental protocols based on CIFAR-10, CIFAR-100 and ImageNet-1000 to measure the trade-off between knowledge retention and novel class discovery.
arXiv Detail & Related papers (2022-07-21T17:54:36Z) - Class-incremental Novel Class Discovery [76.35226130521758]
We study the new task of class-incremental Novel Class Discovery (class-iNCD).
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-07-18T13:49:27Z) - FOSTER: Feature Boosting and Compression for Class-Incremental Learning [52.603520403933985]
Deep neural networks suffer from catastrophic forgetting when learning new categories.
We propose a novel two-stage learning paradigm FOSTER, empowering the model to learn new categories adaptively.
arXiv Detail & Related papers (2022-04-10T11:38:33Z) - Move-to-Data: A new Continual Learning approach with Deep CNNs, Application for image-class recognition [0.0]
The model must be pre-trained in a "training recording phase" and then adjusted to newly arriving data.
We propose a fast continual learning layer at the end of the neural network.
arXiv Detail & Related papers (2020-06-12T13:04:58Z) - Semantic Drift Compensation for Class-Incremental Learning [48.749630494026086]
Class-incremental learning of deep networks sequentially increases the number of classes to be classified.
We propose a new method to estimate the drift of features, called semantic drift, and compensate for it without the need for any exemplars; a hedged sketch appears after this list.
arXiv Detail & Related papers (2020-04-01T13:31:19Z) - Equalization Loss for Long-Tailed Object Recognition [109.91045951333835]
State-of-the-art object detection methods still perform poorly on large vocabulary and long-tailed datasets.
We propose a simple but effective loss, named equalization loss, to tackle the problem of long-tailed rare categories.
Our method achieves AP gains of 4.1% and 4.8% for the rare and common categories on the challenging LVIS benchmark.
arXiv Detail & Related papers (2020-03-11T09:14:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.