PLOP: Learning without Forgetting for Continual Semantic Segmentation
- URL: http://arxiv.org/abs/2011.11390v3
- Date: Thu, 11 Mar 2021 09:43:29 GMT
- Title: PLOP: Learning without Forgetting for Continual Semantic Segmentation
- Authors: Arthur Douillard and Yifu Chen and Arnaud Dapogny and Matthieu Cord
- Abstract summary: Continual learning for semantic segmentation (CSS) is an emerging trend that consists of updating an old model by sequentially adding new classes.
In this paper, we propose Local POD, a multi-scale pooling distillation scheme that preserves long- and short-range spatial relationships at feature level.
We also design an entropy-based pseudo-labelling of the background w.r.t. classes predicted by the old model to deal with background shift and avoid catastrophic forgetting of the old classes.
- Score: 44.49799311137856
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning approaches are now ubiquitously used to tackle computer
vision tasks such as semantic segmentation, requiring large datasets and
substantial computational power. Continual learning for semantic segmentation
(CSS) is an emerging trend that consists of updating an old model by
sequentially adding new classes. However, continual learning methods are
usually prone to catastrophic forgetting. This issue is further aggravated in
CSS where, at each step, old classes from previous iterations are collapsed
into the background. In this paper, we propose Local POD, a multi-scale pooling
distillation scheme that preserves long- and short-range spatial relationships
at feature level. Furthermore, we design an entropy-based pseudo-labelling of
the background w.r.t. classes predicted by the old model to deal with
background shift and avoid catastrophic forgetting of the old classes. Our
approach, called PLOP, significantly outperforms state-of-the-art methods in
existing CSS scenarios, as well as in newly proposed challenging benchmarks.
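A minimal PyTorch-style sketch of a Local-POD-like loss, reconstructed from the abstract's description: pooled width/height statistics are matched between the old and new models over windows at several scales. The mean pooling, MSE comparison, and scale schedule (1, 2, 4) are assumptions for illustration, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def pod_embedding(feat: torch.Tensor) -> torch.Tensor:
    """Concatenate width- and height-pooled activations of a (B, C, H, W) map."""
    width_pool = feat.mean(dim=3)   # (B, C, H): each row summarizes the full width
    height_pool = feat.mean(dim=2)  # (B, C, W): each column summarizes the full height
    return torch.cat([width_pool.flatten(1), height_pool.flatten(1)], dim=1)

def local_pod_loss(feat_new: torch.Tensor, feat_old: torch.Tensor,
                   scales=(1, 2, 4)) -> torch.Tensor:
    """Match pooled statistics over an s x s grid of windows per scale.

    Scale 1 covers the whole map (long-range relationships); finer grids
    constrain small local windows (short-range relationships). Assumes H and W
    are divisible by every scale.
    """
    _, _, H, W = feat_new.shape
    loss = feat_new.new_zeros(())
    for s in scales:
        h, w = H // s, W // s
        for i in range(s):
            for j in range(s):
                win = (..., slice(i * h, (i + 1) * h), slice(j * w, (j + 1) * w))
                loss = loss + F.mse_loss(pod_embedding(feat_new[win]),
                                         pod_embedding(feat_old[win]))
    return loss / sum(s * s for s in scales)
```

In practice this loss would be evaluated at several intermediate layers of the frozen old model and the new model, and added to the segmentation objective. The entropy-based pseudo-labelling can be sketched similarly; the single global threshold `tau` below is a simplifying assumption (the paper derives class-conditional criteria rather than one fixed value).

```python
import math
import torch

def pseudo_label_background(target, old_logits, bg_index=0, ignore_index=255, tau=0.5):
    """Relabel background pixels with the old model's confident predictions.

    target:     (B, H, W) current-step labels, old classes collapsed into bg_index.
    old_logits: (B, K_old, H, W) logits from the frozen old model.
    """
    probs = old_logits.softmax(dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)  # (B, H, W)
    entropy = entropy / math.log(old_logits.shape[1])            # normalize to [0, 1]
    old_pred = probs.argmax(dim=1)

    pseudo = target.clone()
    bg = target == bg_index
    confident = entropy < tau
    pseudo[bg & confident] = old_pred[bg & confident]
    pseudo[bg & ~confident] = ignore_index  # uncertain pixels are excluded from the loss
    return pseudo
```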
Related papers
- PASS++: A Dual Bias Reduction Framework for Non-Exemplar Class-Incremental Learning [49.240408681098906]
Class-incremental learning (CIL) aims to recognize new classes incrementally while maintaining the discriminability of old classes.
Most existing CIL methods are exemplar-based, i.e., they store part of the old data for retraining.
We present a simple and novel dual bias reduction framework that employs self-supervised transformation (SST) in input space and prototype augmentation (protoAug) in deep feature space.
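A rough sketch of the protoAug idea, under assumptions (the summary gives no implementation details; the Gaussian noise model and sample count are illustrative): old classes are represented only by stored class-mean prototypes, and pseudo-features sampled around them keep the classifier discriminative without storing exemplars.

```python
import torch
import torch.nn.functional as F

def proto_aug_loss(classifier, prototypes, radius, n_samples=64):
    """Classification loss on synthesized old-class features.

    classifier: module mapping (N, D) features to (N, num_classes) logits.
    prototypes: (num_old_classes, D) stored mean feature per old class.
    radius:     Gaussian noise scale, e.g. estimated from old feature variance.
    """
    labels = torch.randint(0, prototypes.shape[0], (n_samples,))
    noise = torch.randn(n_samples, prototypes.shape[1]) * radius
    fake_feats = prototypes[labels] + noise  # old-class stand-ins, no stored images
    return F.cross_entropy(classifier(fake_feats), labels)
```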
arXiv Detail & Related papers (2024-07-19T05:03:16Z)
- BACS: Background Aware Continual Semantic Segmentation [15.821935479975343]
In autonomous driving, new classes must be incorporated as the operating environment of the deployed agent becomes more complex.
For enhanced annotation efficiency, ideally, only pixels belonging to new classes would be annotated.
This paper proposes a Backward Background Shift Detector (BACS) to detect previously observed classes.
arXiv Detail & Related papers (2024-04-19T19:25:26Z)
- Tendency-driven Mutual Exclusivity for Weakly Supervised Incremental Semantic Segmentation [56.1776710527814]
Weakly Incremental Learning for Semantic Segmentation (WILSS) leverages a pre-trained segmentation model to segment new classes using cost-effective and readily available image-level labels.
A prevailing way to solve WILSS is the generation of seed areas for each new class, serving as a form of pixel-level supervision.
We propose an innovative, tendency-driven relationship of mutual exclusivity, meticulously tailored to govern the behavior of the seed areas.
arXiv Detail & Related papers (2024-04-18T08:23:24Z)
- ECLIPSE: Efficient Continual Learning in Panoptic Segmentation with Visual Prompt Tuning [54.68180752416519]
Panoptic segmentation is a computer vision task that unifies semantic and instance segmentation.
We introduce a novel and efficient method for continual panoptic segmentation based on Visual Prompt Tuning, dubbed ECLIPSE.
Our approach involves freezing the base model parameters and fine-tuning only a small set of prompt embeddings, addressing catastrophic forgetting while retaining plasticity.
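A generic sketch of the underlying visual-prompt-tuning idea (not ECLIPSE's actual architecture; `backbone` is assumed to be a transformer that consumes a token sequence, and the prompt count and init scale are illustrative):

```python
import torch
import torch.nn as nn

class PromptedBackbone(nn.Module):
    """Frozen transformer backbone plus a small trainable prompt set per step."""

    def __init__(self, backbone: nn.Module):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False        # frozen weights cannot be forgotten
        self.prompts = nn.ParameterList()  # all plasticity lives here

    def add_step(self, n_prompts: int = 10, embed_dim: int = 768):
        """Register a new trainable prompt set at each continual step."""
        self.prompts.append(nn.Parameter(torch.randn(n_prompts, embed_dim) * 0.02))

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, N, D) patch embeddings; prepend every step's prompts
        B = tokens.shape[0]
        prompts = torch.cat(list(self.prompts), dim=0).unsqueeze(0).expand(B, -1, -1)
        return self.backbone(torch.cat([prompts, tokens], dim=1))
```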
arXiv Detail & Related papers (2024-03-29T11:31:12Z)
- Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning [65.57123249246358]
We propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL.
We train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces.
Our prototype complement strategy synthesizes old classes' new features without using any old-class instances.
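The adapter-per-task idea can be sketched with a standard bottleneck adapter (a common design, assumed here for illustration rather than taken from EASE; the dimensions are placeholders):

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Lightweight residual adapter trained for one task over a frozen PTM."""

    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(torch.relu(self.down(x)))  # task-specific residual subspace

# one adapter per incremental task; the pre-trained model itself stays frozen
adapters = nn.ModuleList(BottleneckAdapter(dim=768) for _ in range(3))
```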
arXiv Detail & Related papers (2024-03-18T17:58:13Z)
- Modeling the Background for Incremental and Weakly-Supervised Semantic Segmentation [39.025848280224785]
We introduce a novel incremental class learning approach for semantic segmentation.
Since each training step provides annotation only for a subset of all possible classes, pixels of the background class exhibit a semantic shift.
We demonstrate the effectiveness of our approach with an extensive evaluation on the Pascal-VOC, ADE20K, and Cityscapes datasets.
arXiv Detail & Related papers (2022-01-31T16:33:21Z)
- Tackling Catastrophic Forgetting and Background Shift in Continual Semantic Segmentation [35.2461834832935]
Continual learning for semantic segmentation (CSS) is an emerging trend that consists of updating an old model by sequentially adding new classes.
In this paper, we propose Local POD, a multi-scale pooling distillation scheme that preserves long- and short-range spatial relationships.
We also introduce a novel rehearsal method that is particularly suited for segmentation.
arXiv Detail & Related papers (2021-06-29T11:57:21Z)
- Continual Semantic Segmentation via Repulsion-Attraction of Sparse and Disentangled Latent Representations [18.655840060559168]
This paper focuses on class incremental continual learning in semantic segmentation.
New categories are made available over time while previous training data is not retained.
The proposed continual learning scheme shapes the latent space to reduce forgetting whilst improving the recognition of novel classes.
arXiv Detail & Related papers (2021-03-10T21:02:05Z)
- Context Decoupling Augmentation for Weakly Supervised Semantic Segmentation [53.49821324597837]
Weakly supervised semantic segmentation is a challenging problem that has been extensively studied in recent years.
We present a Context Decoupling Augmentation (CDA) method to change the inherent context in which objects appear.
To validate the effectiveness of the proposed method, extensive experiments on the PASCAL VOC 2012 dataset with several alternative network architectures demonstrate that CDA can boost various popular WSSS methods to a new state of the art by a large margin.
arXiv Detail & Related papers (2021-03-02T15:05:09Z)
- Modeling the Background for Incremental Learning in Semantic Segmentation [39.025848280224785]
Deep architectures are vulnerable to catastrophic forgetting.
This paper addresses this problem in the context of semantic segmentation.
We propose a new distillation-based framework which explicitly accounts for this shift.
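The shift-aware distillation can be reconstructed roughly as follows (a sketch of the idea with `bg_index = 0` assumed, not the authors' code): the new model's background probability absorbs the classes the old model never saw, so old and new outputs are compared on a consistent label space.

```python
import torch

def background_aware_distillation(new_logits, old_logits, bg_index=0):
    """new_logits: (B, K_new, H, W); old_logits: (B, K_old, H, W); K_new >= K_old."""
    k_old = old_logits.shape[1]
    log_p = new_logits.log_softmax(dim=1)
    # background "as the old model sees it" = true background + later classes
    unseen = torch.cat([log_p[:, bg_index:bg_index + 1], log_p[:, k_old:]], dim=1)
    log_bg = torch.logsumexp(unseen, dim=1, keepdim=True)          # (B, 1, H, W)
    log_q = torch.cat([log_bg, log_p[:, 1:k_old]], dim=1)          # old label space
    return -(old_logits.softmax(dim=1) * log_q).sum(dim=1).mean()  # CE distillation
```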
arXiv Detail & Related papers (2020-02-03T13:30:38Z)