Evolving Dictionary Representation for Few-shot Class-incremental
Learning
- URL: http://arxiv.org/abs/2305.01885v1
- Date: Wed, 3 May 2023 04:30:34 GMT
- Title: Evolving Dictionary Representation for Few-shot Class-incremental
Learning
- Authors: Xuejun Han, Yuhong Guo
- Abstract summary: We tackle a challenging and practical continual learning scenario named few-shot class-incremental learning (FSCIL).
In FSCIL, labeled data are given for classes in a base session, but only very limited labeled instances are available for new incremental classes.
We propose deep dictionary learning, a hybrid learning architecture that combines dictionary learning and visual representation learning.
- Score: 34.887690018011675
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: New objects are continuously emerging in the dynamically changing world, and a
real-world artificial intelligence system should be capable of continual and
effectual adaptation to newly emerging classes without forgetting old ones. In
view of this, in this paper we tackle a challenging and practical continual
learning scenario named few-shot class-incremental learning (FSCIL), in which
labeled data are given for classes in a base session but only very limited labeled
instances are available for new incremental classes. To address this problem,
we propose a novel and succinct approach by introducing deep dictionary
learning, a hybrid learning architecture that combines dictionary learning and
visual representation learning to provide a better space for characterizing
different classes. We simultaneously optimize the dictionary and the feature
extraction backbone in the base session, while finetuning only the dictionary
in the incremental sessions for adaptation to novel classes, which alleviates
forgetting of the base classes compared to finetuning the entire model. To
further facilitate future adaptation, we also incorporate multiple pseudo
classes into the base session training so that certain space projected by the
dictionary can be reserved for future new concepts. Extensive experimental
results on CIFAR100, miniImageNet and CUB200 validate the effectiveness of our
approach compared to other SOTA methods.
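The base/incremental split described in the abstract can be illustrated with a toy sketch. The following NumPy example is a hypothetical, minimal illustration of the idea only (not the paper's implementation): a linear map stands in for the feature backbone, features are coded by least squares over dictionary atoms, the base session updates both the backbone and the dictionary, and the incremental session adapts only the dictionary while the backbone stays frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(z, D):
    # Least-squares code a = argmin_a ||a @ D - z||^2, via the normal equations.
    return np.linalg.solve(D @ D.T, D @ z)

def residual(z, D):
    a = encode(z, D)
    return a, a @ D - z            # code and reconstruction residual

dim, n_atoms, lr = 8, 4, 0.01
W = rng.normal(size=(dim, dim))    # stand-in for the feature extraction backbone
D = rng.normal(size=(n_atoms, dim))  # dictionary atoms (one per row)

# --- Base session: jointly optimize backbone W and dictionary D ---
x = rng.normal(size=dim)           # a base-class input
base_err_init = np.linalg.norm(residual(W @ x, D)[1])
for _ in range(100):
    z = W @ x
    a, r = residual(z, D)
    D -= lr * np.outer(a, r)       # gradient of ||a @ D - z||^2 in D, code a held fixed
    W -= lr * np.outer(-2 * r, x)  # gradient in W via dL/dz = -2r and z = W @ x
base_err_final = np.linalg.norm(residual(W @ x, D)[1])

# --- Incremental session: freeze the backbone, finetune only the dictionary ---
x_new = rng.normal(size=dim)       # a novel-class input
z_new = W @ x_new                  # W is frozen from here on
inc_err_init = np.linalg.norm(residual(z_new, D)[1])
for _ in range(100):
    a, r = residual(z_new, D)
    D -= lr * np.outer(a, r)       # only the dictionary adapts to the novel class
inc_err_final = np.linalg.norm(residual(z_new, D)[1])
```

Because only the small dictionary changes in the incremental step, the representation learned in the base session is untouched, which is the intuition behind reduced forgetting compared to finetuning the whole model.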
Related papers
- Few Shot Class Incremental Learning using Vision-Language models [24.930246674021525]
In this study, we introduce an innovative few-shot class incremental learning (FSCIL) framework that utilizes a language regularizer and a subspace regularizer.
Our proposed framework not only empowers the model to embrace novel classes with limited data, but also ensures the preservation of performance on base classes.
arXiv Detail & Related papers (2024-05-02T06:52:49Z)
- Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning [65.57123249246358]
We propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL.
We train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces.
Our prototype complement strategy synthesizes old classes' new features without using any old class instance.
arXiv Detail & Related papers (2024-03-18T17:58:13Z)
- Continuously Learning New Words in Automatic Speech Recognition [56.972851337263755]
We propose a self-supervised continual learning approach to recognize new words.
We use a memory-enhanced Automatic Speech Recognition model from previous work.
We show that with this approach, performance on the new words improves as they occur more frequently.
arXiv Detail & Related papers (2024-01-09T10:39:17Z)
- Learning to Name Classes for Vision and Language Models [57.0059455405424]
Large-scale vision and language models can achieve impressive zero-shot recognition performance by mapping class-specific text queries to image content.
We propose to leverage available data to learn, for each class, an optimal word embedding as a function of the visual content.
By learning new word embeddings on an otherwise frozen model, we are able to retain zero-shot capabilities for new classes, easily adapt models to new datasets, and adjust potentially erroneous, non-descriptive or ambiguous class names.
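The frozen-model idea in this summary can be sketched in miniature. Below is a hypothetical NumPy illustration (clustered random vectors stand in for frozen encoder features; this is not the paper's code): only the per-class embeddings are trained with a cross-entropy objective, while the "backbone" features never change.

```python
import numpy as np

rng = np.random.default_rng(1)

n_classes, feat_dim, lr = 3, 16, 0.5

# Frozen backbone outputs: 20 clustered feature vectors per class.
centers = rng.normal(size=(n_classes, feat_dim))
labels = np.repeat(np.arange(n_classes), 20)
feats = centers[labels] + 0.1 * rng.normal(size=(len(labels), feat_dim))

# Learnable per-class "name" embeddings; the backbone is never updated.
E = 0.01 * rng.normal(size=(n_classes, feat_dim))

def accuracy(E):
    return (feats @ E.T).argmax(axis=1).__eq__(labels).mean()

acc_init = accuracy(E)
for _ in range(200):
    logits = feats @ E.T                        # similarity scores
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)           # softmax probabilities
    p[np.arange(len(labels)), labels] -= 1.0    # dL/dlogits for cross-entropy
    E -= lr * (p.T @ feats) / len(labels)       # gradient step on embeddings only
acc_final = accuracy(E)
```

Since nothing but the small embedding table `E` changes, the frozen model's behavior on all other classes is preserved by construction, which mirrors the retained zero-shot capability described above.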
arXiv Detail & Related papers (2023-04-04T14:34:44Z)
- Improving Feature Generalizability with Multitask Learning in Class Incremental Learning [12.632121107536843]
Many deep learning applications, like keyword spotting, require the incorporation of new concepts (classes) over time, referred to as Class Incremental Learning (CIL).
The major challenge in CIL is catastrophic forgetting, i.e., preserving as much of the old knowledge as possible while learning new tasks.
We propose multitask learning during base model training to improve the feature generalizability.
Our approach enhances the average incremental learning accuracy by up to 5.5%, which enables more reliable and accurate keyword spotting over time.
arXiv Detail & Related papers (2022-04-26T07:47:54Z)
- Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks [59.12108527904171]
A model should recognize new classes and maintain discriminability over old classes.
The task of recognizing few-shot new classes without forgetting old classes is called few-shot class-incremental learning (FSCIL).
We propose a new paradigm for FSCIL based on meta-learning, by LearnIng Multi-phase Incremental Tasks (LIMIT).
arXiv Detail & Related papers (2022-03-31T13:46:41Z)
- Better Language Model with Hypernym Class Prediction [101.8517004687825]
Class-based language models (LMs) have been long devised to address context sparsity in $n$-gram LMs.
In this study, we revisit this approach in the context of neural LMs.
arXiv Detail & Related papers (2022-03-21T01:16:44Z)
- Novel Class Discovery in Semantic Segmentation [104.30729847367104]
We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS).
It aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes.
In NCDSS, we need to distinguish objects from the background and to handle the existence of multiple classes within an image.
We propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels.
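Entropy-based filtering of noisy pseudo-labels, the mechanism this summary refers to, can be sketched as follows. This is a hypothetical NumPy illustration on simulated softmax outputs, not the EUMS implementation: predictions with low entropy (high certainty) are kept for self-training, the rest are discarded.

```python
import numpy as np

rng = np.random.default_rng(2)

def entropy(p, eps=1e-12):
    # Shannon entropy of each softmax distribution (one per row).
    return -(p * np.log(p + eps)).sum(axis=1)

# Simulated per-pixel softmax predictions over 4 novel classes: varying the
# logit scale makes some pixels confident (low entropy), others ambiguous.
n_pixels, n_classes = 1000, 4
logits = rng.normal(size=(n_pixels, n_classes)) * rng.uniform(0.1, 5.0, size=(n_pixels, 1))
p = np.exp(logits - logits.max(axis=1, keepdims=True))
p /= p.sum(axis=1, keepdims=True)

h = entropy(p)
threshold = np.quantile(h, 0.5)          # keep the more certain half of the pixels
keep = h <= threshold
pseudo_labels = p.argmax(axis=1)[keep]   # self-train only on low-entropy pseudo-labels
```

Thresholding on entropy rather than on the maximum probability takes the full shape of each predicted distribution into account, which is why it is a common choice for uncertainty modeling.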
arXiv Detail & Related papers (2021-12-03T13:31:59Z)
- Few-Shot Incremental Learning with Continually Evolved Classifiers [46.278573301326276]
Few-shot class-incremental learning (FSCIL) aims to design machine learning algorithms that can continually learn new concepts from a few data points.
The difficulty lies in that limited data from new classes not only lead to significant overfitting issues but also exacerbate the notorious catastrophic forgetting problems.
We propose Continually Evolved Classifiers (CEC), which employ a graph model to propagate context information between classifiers for adaptation.
arXiv Detail & Related papers (2021-04-07T10:54:51Z)
- Deep Semantic Dictionary Learning for Multi-label Image Classification [3.3989824361632337]
We present an innovative path toward solving multi-label image classification by treating it as a dictionary learning task.
A novel end-to-end model named Deep Semantic Dictionary Learning (DSDL) is designed.
Our codes and models have been released.
arXiv Detail & Related papers (2020-12-23T06:22:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.