SSUL: Semantic Segmentation with Unknown Label for Exemplar-based
Class-Incremental Learning
- URL: http://arxiv.org/abs/2106.11562v1
- Date: Tue, 22 Jun 2021 06:40:26 GMT
- Title: SSUL: Semantic Segmentation with Unknown Label for Exemplar-based
Class-Incremental Learning
- Authors: Sungmin Cha, Beomyoung Kim, Youngjoon Yoo and Taesup Moon
- Abstract summary: We consider a class-incremental semantic segmentation (CISS) problem.
We propose a new method, dubbed SSUL-M (Semantic Segmentation with Unknown Label with Memory), by carefully combining several techniques tailored for semantic segmentation.
We show our method achieves significantly better performance than the recent state-of-the-art baselines on the standard benchmark datasets.
- Score: 19.152041362805985
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider a class-incremental semantic segmentation (CISS) problem. While
some recently proposed algorithms utilized variants of knowledge distillation
(KD) techniques to tackle the problem, they only partially addressed the key
additional challenges in CISS that cause catastrophic forgetting; i.e.,
the semantic drift of the background class and the multi-label prediction issue. To
better address these challenges, we propose a new method, dubbed SSUL-M
(Semantic Segmentation with Unknown Label with Memory), by carefully combining
several techniques tailored for semantic segmentation. More specifically, we
make three main contributions: (1) modeling an unknown class within the background
class to help learning future classes (help plasticity), (2) freezing the backbone
network and past classifiers with binary cross-entropy loss and pseudo-labeling
to overcome catastrophic forgetting (help stability), and (3) utilizing a tiny
exemplar memory for the first time in CISS to improve both plasticity and
stability. As a result, we show our method achieves significantly better
performance than the recent state-of-the-art baselines on the standard
benchmark datasets. Furthermore, we justify our contributions with thorough and
extensive ablation analyses and discuss different natures of the CISS problem
compared to the standard class-incremental learning for classification.
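Contributions (1) and (2) above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the thresholds `hi`/`lo`, the helper names, and the array shapes are hypothetical, and the routing rule is a simplification of the paper's actual procedure.

```python
import numpy as np

def assign_pseudo_labels(old_probs, hi=0.7, lo=0.05):
    """Route each background pixel to an old-class pseudo-label, true
    background, or the unknown class, using the frozen previous model."""
    # old_probs: (C_old, H, W) per-class sigmoid scores of the old model
    max_p = old_probs.max(axis=0)
    arg = old_probs.argmax(axis=0) + 1            # old classes 1..C_old; 0 = background
    labels = np.full(max_p.shape, -1, dtype=int)  # -1 marks the unknown class
    labels[max_p >= hi] = arg[max_p >= hi]        # confident old-class pixels: pseudo-label
    labels[max_p <= lo] = 0                       # confidently background
    return labels                                 # everything in between stays unknown

def bce(prob, target, eps=1e-7):
    """Per-class binary cross-entropy, used in place of softmax cross-entropy."""
    prob = np.clip(prob, eps, 1.0 - eps)
    return float(-(target * np.log(prob) + (1 - target) * np.log(1 - prob)).mean())
```

With BCE, each classifier head is trained independently of the others, so freezing the past heads leaves their decision boundaries untouched while new heads learn the incoming classes.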
Related papers
- Inherit with Distillation and Evolve with Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory [23.730424035141155]
We present IDEC, which consists of a Dense Knowledge Distillation on all Aspects (DADA) and an Asymmetric Region-wise Contrastive Learning (ARCL) module.
We demonstrate the effectiveness of our method on multiple CISS tasks by state-of-the-art performance, including Pascal VOC 2012, ADE20K and ISPRS datasets.
arXiv Detail & Related papers (2023-09-27T05:38:31Z)
- Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants [166.916517335816]
In this paper, we offer a unified solution to the misalignment dilemma in the three tasks.
We propose neural collapse terminus that is a fixed structure with the maximal equiangular inter-class separation for the whole label space.
Our method holds the neural collapse optimality in an incremental fashion regardless of data imbalance or data scarcity.
arXiv Detail & Related papers (2023-08-03T13:09:59Z)
- Prototypical quadruplet for few-shot class incremental learning [24.814045065163135]
We propose a novel method that improves classification robustness by identifying a better embedding space using an improved contrasting loss.
Our approach retains previously acquired knowledge in the embedding space, even when trained with new classes.
We demonstrate the effectiveness of our method by showing that the embedding space remains intact after training the model with new classes and outperforms existing state-of-the-art algorithms in terms of accuracy across different sessions.
arXiv Detail & Related papers (2022-11-05T17:19:14Z)
- Attribution-aware Weight Transfer: A Warm-Start Initialization for Class-Incremental Semantic Segmentation [38.52441363934223]
In class-incremental semantic segmentation (CISS), deep learning architectures suffer from the critical problems of catastrophic forgetting and semantic background shift.
We propose a novel method which employs gradient-based attribution to identify the most relevant weights for new classes.
Our experiments demonstrate significant improvement in mIoU compared to the state-of-the-art CISS methods on the Pascal-VOC 2012, ADE20K and Cityscapes datasets.
arXiv Detail & Related papers (2022-10-13T17:32:12Z)
- Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation [34.460973847554364]
Class-incremental semantic segmentation (CISS) labels each pixel of an image with a corresponding object/stuff class continually.
It is crucial to learn novel classes incrementally without forgetting previously learned knowledge.
We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively.
arXiv Detail & Related papers (2022-10-12T06:15:51Z)
- PercentMatch: Percentile-based Dynamic Thresholding for Multi-Label Semi-Supervised Classification [64.39761523935613]
We propose a percentile-based threshold adjusting scheme to dynamically alter the score thresholds of positive and negative pseudo-labels for each class during the training.
We achieve strong performance on Pascal VOC2007 and MS-COCO datasets when compared to recent SSL methods.
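The percentile scheme can be sketched roughly as follows. This is a simplified illustration, not PercentMatch's exact rule; the percentile values and the function name are hypothetical. Thresholds are set at fixed percentiles of the current score distribution, so they adapt per class as training progresses.

```python
import numpy as np

def percentile_thresholds(scores, pos_pct=90, neg_pct=10):
    """Derive per-class positive/negative pseudo-label thresholds from
    fixed percentiles of the current score distribution."""
    # scores: (N, C) predicted class probabilities on unlabeled samples
    pos_t = np.percentile(scores, pos_pct, axis=0)   # per-class positive cut
    neg_t = np.percentile(scores, neg_pct, axis=0)   # per-class negative cut
    pos_mask = scores >= pos_t    # samples pseudo-labeled positive for each class
    neg_mask = scores <= neg_t    # samples pseudo-labeled negative
    return pos_t, neg_t, pos_mask, neg_mask
```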
arXiv Detail & Related papers (2022-08-30T01:27:48Z)
- Novel Class Discovery in Semantic Segmentation [104.30729847367104]
We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS).
It aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes.
In NCDSS, we need to distinguish the objects and background, and to handle the existence of multiple classes within an image.
We propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels.
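A minimal sketch of the entropy-based split that EUMS builds on (an assumed form; the keep ratio and function name are illustrative, not the paper's exact procedure): rank pseudo-labeled samples by prediction entropy and treat the low-entropy fraction as clean.

```python
import numpy as np

def entropy_split(probs, keep_ratio=0.5):
    """Rank samples by prediction entropy; keep the low-entropy
    fraction as the 'clean' pseudo-labeled set."""
    # probs: (N, C) softmax outputs; lower entropy = more confident
    ent = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    order = np.argsort(ent)                  # most confident first
    k = int(len(ent) * keep_ratio)
    return order[:k], order[k:]              # (clean indices, noisy indices)
```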
arXiv Detail & Related papers (2021-12-03T13:31:59Z)
- Self-Supervised Class Incremental Learning [51.62542103481908]
Existing Class Incremental Learning (CIL) methods are based on a supervised classification framework sensitive to data labels.
When updated on new class data, they suffer from catastrophic forgetting: the model cannot clearly distinguish old-class data from the new.
In this paper, we explore the performance of Self-Supervised representation learning in Class Incremental Learning (SSCIL) for the first time.
arXiv Detail & Related papers (2021-11-18T06:58:19Z)
- Flip Learning: Erase to Segment [65.84901344260277]
Weakly-supervised segmentation (WSS) can help reduce time-consuming and cumbersome manual annotation.
We propose a novel and general WSS framework called Flip Learning, which only needs the box annotation.
Our proposed approach achieves competitive performance and shows great potential to narrow the gap between fully-supervised and weakly-supervised learning.
arXiv Detail & Related papers (2021-08-02T09:56:10Z)
- Continual Semantic Segmentation via Repulsion-Attraction of Sparse and Disentangled Latent Representations [18.655840060559168]
This paper focuses on class incremental continual learning in semantic segmentation.
New categories are made available over time while previous training data is not retained.
The proposed continual learning scheme shapes the latent space to reduce forgetting whilst improving the recognition of novel classes.
arXiv Detail & Related papers (2021-03-10T21:02:05Z)
- Prior Guided Feature Enrichment Network for Few-Shot Segmentation [64.91560451900125]
State-of-the-art semantic segmentation methods require sufficient labeled data to achieve good results.
Few-shot segmentation is proposed to tackle this problem by learning a model that quickly adapts to new classes with a few labeled support samples.
These frameworks still face the challenge of reduced generalization on unseen classes due to inappropriate use of high-level semantic information.
arXiv Detail & Related papers (2020-08-04T10:41:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.