Attribution-aware Weight Transfer: A Warm-Start Initialization for
Class-Incremental Semantic Segmentation
- URL: http://arxiv.org/abs/2210.07207v1
- Date: Thu, 13 Oct 2022 17:32:12 GMT
- Title: Attribution-aware Weight Transfer: A Warm-Start Initialization for
Class-Incremental Semantic Segmentation
- Authors: Dipam Goswami, René Schuster, Joost van de Weijer, Didier Stricker
- Abstract summary: In class-incremental semantic segmentation (CISS), deep learning architectures suffer from the critical problems of catastrophic forgetting and semantic background shift.
We propose a novel method which employs gradient-based attribution to identify the most relevant weights for new classes.
Our experiments demonstrate significant improvement in mIoU compared to the state-of-the-art CISS methods on the Pascal-VOC 2012, ADE20K and Cityscapes datasets.
- Score: 38.52441363934223
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In class-incremental semantic segmentation (CISS), deep learning
architectures suffer from the critical problems of catastrophic forgetting and
semantic background shift. Although recent works focused on these issues,
existing classifier initialization methods do not address the background shift
problem and assign the same initialization weights to both background and new
foreground class classifiers. We propose to address the background shift with a
novel classifier initialization method which employs gradient-based attribution
to identify the most relevant weights for new classes from the classifier's
weights for the previous background and transfers these weights to the new
classifier. This warm-start weight initialization provides a general solution
applicable to several CISS methods. Furthermore, it accelerates learning of new
classes while mitigating forgetting. Our experiments demonstrate significant
improvement in mIoU compared to the state-of-the-art CISS methods on the
Pascal-VOC 2012, ADE20K and Cityscapes datasets.
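The abstract describes identifying the most relevant background-classifier weights for each new class via gradient-based attribution and transferring them as a warm-start initialization. A minimal NumPy sketch of that idea, assuming a linear (1x1 convolution) classifier head, where the contribution of a feature channel to the background score serves as a simple attribution proxy. The function name, the top-k selection, and the zero-fill of non-transferred channels are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def attribution_warm_start(w_bg, feats_new, top_k):
    """Illustrative sketch of attribution-aware weight transfer.

    w_bg:      (C,) background-classifier weights (one output channel)
    feats_new: (N, C) features of pixels labelled with a new class
    top_k:     number of most-relevant channels to transfer
    """
    # For a linear classifier, channel c contributes w_bg[c] * f[c] to the
    # background score on a pixel; averaging its magnitude over new-class
    # pixels gives a simple gradient-based attribution proxy.
    attribution = np.abs(w_bg * feats_new.mean(axis=0))
    # Indices of the top_k most relevant background-weight channels.
    relevant = np.argsort(attribution)[-top_k:]
    # Warm start: copy only the most relevant background weights into the
    # new classifier; leave the remaining channels at zero.
    w_new = np.zeros_like(w_bg)
    w_new[relevant] = w_bg[relevant]
    return w_new, relevant
```

In this toy form, channels that drove the old background prediction on new-class pixels seed the new classifier, so the new class starts from weights already attuned to its appearance instead of a random or uniform initialization.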
Related papers
- Early Preparation Pays Off: New Classifier Pre-tuning for Class Incremental Semantic Segmentation [13.62129805799111]
Class incremental semantic segmentation aims to preserve old knowledge while learning new tasks.
It is impeded by catastrophic forgetting and background shift issues.
We propose a new classifier pre-tuning (NeST) method applied before the formal training process.
arXiv Detail & Related papers (2024-07-19T09:19:29Z)
- PASS++: A Dual Bias Reduction Framework for Non-Exemplar Class-Incremental Learning [49.240408681098906]
Class-incremental learning (CIL) aims to recognize new classes incrementally while maintaining the discriminability of old classes.
Most existing CIL methods are exemplar-based, i.e., storing a part of old data for retraining.
We present a simple and novel dual bias reduction framework that employs self-supervised transformation (SST) in input space and prototype augmentation (protoAug) in deep feature space.
arXiv Detail & Related papers (2024-07-19T05:03:16Z)
- Mitigating Background Shift in Class-Incremental Semantic Segmentation [18.604420743751643]
Class-Incremental Semantic Segmentation (CISS) aims to learn new classes without forgetting the old ones.
We propose a background-class separation framework for CISS.
arXiv Detail & Related papers (2024-07-16T15:44:37Z)
- Mining Unseen Classes via Regional Objectness: A Simple Baseline for Incremental Segmentation [57.80416375466496]
Incremental or continual learning has been extensively studied for image classification tasks to alleviate catastrophic forgetting.
We propose a simple yet effective method in this paper, named Mining unseen Classes via Regional Objectness (MicroSeg).
Our MicroSeg is based on the assumption that background regions with strong objectness possibly belong to concepts seen in historical or future stages.
In this way, the distribution characteristics of old concepts in the feature space can be better perceived, relieving the catastrophic forgetting caused by the background shift.
arXiv Detail & Related papers (2022-11-13T10:06:17Z)
- Pre-Train Your Loss: Easy Bayesian Transfer Learning with Informative Priors [59.93972277761501]
We show that we can learn highly informative posteriors from the source task, through supervised or self-supervised approaches.
This simple modular approach enables significant performance gains and more data-efficient learning on a variety of downstream classification and segmentation tasks.
arXiv Detail & Related papers (2022-05-20T16:19:30Z)
- Modeling the Background for Incremental and Weakly-Supervised Semantic Segmentation [39.025848280224785]
We introduce a novel incremental class learning approach for semantic segmentation.
Since each training step provides annotation only for a subset of all possible classes, pixels of the background class exhibit a semantic shift.
We demonstrate the effectiveness of our approach with an extensive evaluation on the Pascal-VOC, ADE20K, and Cityscapes datasets.
arXiv Detail & Related papers (2022-01-31T16:33:21Z)
- Novel Class Discovery in Semantic Segmentation [104.30729847367104]
We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS).
It aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes.
In NCDSS, we need to distinguish the objects and background, and to handle the existence of multiple classes within an image.
We propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels.
arXiv Detail & Related papers (2021-12-03T13:31:59Z)
- SSUL: Semantic Segmentation with Unknown Label for Exemplar-based Class-Incremental Learning [19.152041362805985]
We consider a class-incremental semantic segmentation (CISS) problem.
We propose a new method, dubbed SSUL-M (Semantic Segmentation with Unknown Label with Memory), by carefully combining several techniques tailored for semantic segmentation.
We show our method achieves significantly better performance than the recent state-of-the-art baselines on the standard benchmark datasets.
arXiv Detail & Related papers (2021-06-22T06:40:26Z)
- Anti-aliasing Semantic Reconstruction for Few-Shot Semantic Segmentation [66.85202434812942]
We reformulate few-shot segmentation as a semantic reconstruction problem.
We convert base class features into a series of basis vectors which span a class-level semantic space for novel class reconstruction.
Our proposed approach, referred to as anti-aliasing semantic reconstruction (ASR), provides a systematic yet interpretable solution for few-shot learning problems.
arXiv Detail & Related papers (2021-06-01T02:17:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.