S3C: Self-Supervised Stochastic Classifiers for Few-Shot
Class-Incremental Learning
- URL: http://arxiv.org/abs/2307.02246v1
- Date: Wed, 5 Jul 2023 12:41:46 GMT
- Title: S3C: Self-Supervised Stochastic Classifiers for Few-Shot
Class-Incremental Learning
- Authors: Jayateja Kalla and Soma Biswas
- Abstract summary: Few-shot class-incremental learning (FSCIL) aims to learn progressively about new classes with very few labeled samples, without forgetting the knowledge of already learnt classes.
FSCIL suffers from two major challenges: (i) over-fitting on the new classes due to the limited amount of data, and (ii) catastrophic forgetting of the old classes due to the unavailability of data from these classes in the incremental stages.
- Score: 22.243176199188238
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Few-shot class-incremental learning (FSCIL) aims to learn
progressively about new classes with very few labeled samples, without
forgetting the knowledge of already learnt classes. FSCIL suffers from two
major challenges: (i) over-fitting on the new classes due to the limited
amount of data, and (ii) catastrophic forgetting of the old classes, since no
data from these classes is available in the incremental stages. In this work,
we propose a self-supervised stochastic classifier (S3C) to counter both of
these challenges in FSCIL. The stochasticity of the classifier weights (or
class prototypes) mitigates the adverse effect not only of the small number
of samples of the new classes, but also of the absence of samples from
previously learnt classes during the incremental steps. This is complemented
by the self-supervision component, which helps to learn features from the
base classes that generalize well to the unseen classes encountered in the
future, thus reducing catastrophic forgetting. Extensive evaluation on three
benchmark datasets using multiple evaluation metrics shows the effectiveness
of the proposed framework. We also experiment on two additional realistic
FSCIL scenarios, namely where the amount of annotated data available for each
new class can differ, and where the number of base classes is much smaller,
and show that the proposed S3C performs significantly better than the
state-of-the-art in all these challenging scenarios.
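The abstract names the two components but not their mechanics. Below is a
minimal PyTorch-style sketch of both ideas as they are commonly realized: a
cosine classifier whose per-class weights are Gaussian-perturbed via the
reparameterization trick, and a rotation-prediction self-supervised task that
expands the label space by a factor of four. The module name, temperature,
initialization, and joint (class, rotation) labeling are illustrative
assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticClassifier(nn.Module):
    """Cosine classifier with Gaussian-perturbed weights via the
    reparameterization trick: each class prototype is treated as a
    distribution N(mu, sigma^2) rather than a single point."""

    def __init__(self, feat_dim: int, num_classes: int, temperature: float = 16.0):
        super().__init__()
        self.mu = nn.Parameter(0.01 * torch.randn(num_classes, feat_dim))
        # Log-parameterization keeps the standard deviation positive.
        self.log_sigma = nn.Parameter(torch.full((num_classes, feat_dim), -4.0))
        self.temperature = temperature

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        if self.training:
            eps = torch.randn_like(self.mu)
            weights = self.mu + torch.exp(self.log_sigma) * eps  # w = mu + sigma * eps
        else:
            weights = self.mu  # deterministic mean prototypes at test time
        # Cosine similarity between normalized features and prototypes.
        logits = F.linear(F.normalize(features, dim=-1),
                          F.normalize(weights, dim=-1))
        return self.temperature * logits


def rotation_ssl_batch(images: torch.Tensor, labels: torch.Tensor):
    """Expand a batch with 0/90/180/270-degree rotations (assumes square
    images) so that a C-class problem is trained over 4*C joint
    (class, rotation) labels; predicting the rotation is the
    self-supervised signal."""
    rotated = torch.cat([torch.rot90(images, k, dims=(-2, -1)) for k in range(4)])
    rot_ids = torch.arange(4, device=labels.device).repeat_interleave(images.size(0))
    joint_labels = labels.repeat(4) * 4 + rot_ids
    return rotated, joint_labels
```

Sampling a fresh weight perturbation each training step acts like
augmentation in classifier space, which is what makes the few new-class
samples, and the absent old-class samples, less damaging; at test time only
the mean prototypes are used. With 60 base classes the classifier above would
be instantiated with num_classes=4 * 60 to cover the joint label space.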
Related papers
- Covariance-based Space Regularization for Few-shot Class Incremental Learning [25.435192867105552]
Few-shot Class Incremental Learning (FSCIL) requires the model to continually learn new classes with limited labeled data.
Due to the limited data in incremental sessions, models are prone to overfitting on new classes and to catastrophic forgetting of base classes.
Recent advancements resort to prototype-based approaches to constrain the base class distribution and learn discriminative representations of new classes.
arXiv Detail & Related papers (2024-11-02T08:03:04Z)
- Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration [67.69532794049445]
We find that existing methods tend to misclassify samples of new classes as base classes, which leads to poor performance on the new classes.
We propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes (see the first sketch after this list).
arXiv Detail & Related papers (2023-12-08T18:24:08Z)
- Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants [166.916517335816]
In this paper, we offer a unified solution to the misalignment dilemma in the three tasks.
We propose the neural collapse terminus, a fixed structure with maximal equiangular inter-class separation for the whole label space (see the second sketch after this list).
Our method holds the neural collapse optimality in an incremental fashion regardless of data imbalance or data scarcity.
arXiv Detail & Related papers (2023-08-03T13:09:59Z)
- Class-Imbalanced Complementary-Label Learning via Weighted Loss [8.934943507699131]
Complementary-label learning (CLL) is widely used in weakly supervised classification.
It faces a significant challenge in real-world datasets when confronted with class-imbalanced training samples.
We propose a novel problem setting that enables learning from class-imbalanced complementary labels for multi-class classification.
arXiv Detail & Related papers (2022-09-28T16:02:42Z)
- Self-Supervised Class Incremental Learning [51.62542103481908]
Existing Class Incremental Learning (CIL) methods are based on a supervised classification framework sensitive to data labels.
When updated on new class data, they suffer from catastrophic forgetting: the model cannot clearly discern old class data from the new.
In this paper, we explore the performance of Self-Supervised representation learning in Class Incremental Learning (SSCIL) for the first time.
arXiv Detail & Related papers (2021-11-18T06:58:19Z)
- Generalized and Incremental Few-Shot Learning by Explicit Learning and Calibration without Forgetting [86.56447683502951]
We propose a three-stage framework that explicitly and effectively addresses these challenges.
We evaluate the proposed framework on four challenging benchmark datasets for image and video few-shot classification.
arXiv Detail & Related papers (2021-08-18T14:21:43Z)
- Entropy-Based Uncertainty Calibration for Generalized Zero-Shot Learning [49.04790688256481]
The goal of generalized zero-shot learning (GZSL) is to recognise both seen and unseen classes.
Most GZSL methods typically learn to synthesise visual representations from semantic information on the unseen classes.
We propose a novel framework that leverages dual variational autoencoders with a triplet loss to learn discriminative latent features.
arXiv Detail & Related papers (2021-01-09T05:21:27Z)
- M2m: Imbalanced Classification via Major-to-minor Translation [79.09018382489506]
In most real-world scenarios, labeled training datasets are highly class-imbalanced, and deep neural networks struggle to generalize to a balanced testing criterion.
In this paper, we explore a novel yet simple way to alleviate this issue by augmenting less-frequent classes via translating samples from more-frequent classes.
Our experimental results on a variety of class-imbalanced datasets show that the proposed method improves the generalization on minority classes significantly compared to other existing re-sampling or re-weighting methods.
arXiv Detail & Related papers (2020-04-01T13:21:17Z)
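The TEEN entry above says the calibration is training-free and targets
new-class discriminability. A minimal sketch in that spirit follows, assuming
the calibrated prototype is a convex combination of the new-class prototype
and base prototypes weighted by softmax-scaled cosine similarity; alpha and
tau are illustrative hyperparameters, not values from the paper.

```python
import torch
import torch.nn.functional as F


def calibrate_new_prototypes(new_protos: torch.Tensor,
                             base_protos: torch.Tensor,
                             alpha: float = 0.5,
                             tau: float = 16.0) -> torch.Tensor:
    """Training-free calibration of new-class prototypes (TEEN-style sketch).

    Each new prototype is pulled toward the base prototypes it most
    resembles, reducing the tendency to misclassify new-class samples
    as base classes."""
    new_n = F.normalize(new_protos, dim=-1)      # (N_new, D)
    base_n = F.normalize(base_protos, dim=-1)    # (N_base, D)
    sim = new_n @ base_n.t()                     # cosine similarities
    weights = torch.softmax(tau * sim, dim=-1)   # (N_new, N_base)
    base_component = weights @ base_n            # similarity-weighted base mix
    return alpha * new_n + (1.0 - alpha) * base_component
```

No gradients or extra training data are needed, which is what "training-free"
buys in the incremental sessions.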
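The Neural Collapse Terminus entry describes a fixed classifier with maximal
equiangular inter-class separation over the whole label space; the standard
construction for such a structure is a simplex equiangular tight frame (ETF).
A minimal sketch of that construction follows; the QR-based initialization
and the dimension check are my assumptions, not details from the paper.

```python
import torch
import torch.nn.functional as F


def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Fixed prototypes forming a simplex equiangular tight frame: K unit
    vectors whose pairwise cosine similarities all equal -1/(K-1), the
    maximal equiangular separation possible."""
    assert feat_dim >= num_classes, "need feat_dim >= K for this QR construction"
    # Partial orthogonal matrix U (feat_dim x K) with orthonormal columns.
    u, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    center = torch.eye(num_classes) - torch.ones(num_classes, num_classes) / num_classes
    etf = (num_classes / (num_classes - 1)) ** 0.5 * (u @ center)
    return etf.t()  # (K, feat_dim): one fixed, never-trained prototype per class


# Sanity check: every off-diagonal cosine similarity equals -1/(K-1).
protos = simplex_etf(num_classes=10, feat_dim=64)
gram = F.normalize(protos, dim=-1) @ F.normalize(protos, dim=-1).t()
```

Because the structure is fixed for the whole label space up front, incremental
sessions only fit features to pre-assigned prototypes, which is how such a
terminus can sidestep misalignment under data imbalance or scarcity.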