Fairness Continual Learning Approach to Semantic Scene Understanding in
Open-World Environments
- URL: http://arxiv.org/abs/2305.15700v4
- Date: Sun, 1 Oct 2023 19:03:09 GMT
- Title: Fairness Continual Learning Approach to Semantic Scene Understanding in
Open-World Environments
- Authors: Thanh-Dat Truong, Hoang-Quan Nguyen, Bhiksha Raj, Khoa Luu
- Abstract summary: We present a novel Fairness Continual Learning approach to the semantic segmentation problem.
Under the fairness objective, a new fairness continual learning framework is proposed based on class distributions.
A novel Prototypical Contrastive Clustering loss is proposed to address the significant challenges in continual learning.
- Score: 33.78036038343624
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Continual semantic segmentation aims to learn new classes while
maintaining the information from the previous classes. Although prior studies
have shown impressive progress in recent years, the fairness concern in
continual semantic segmentation needs to be better addressed. Meanwhile,
fairness is one of the most vital factors in deploying deep learning models,
especially in human-related or safety applications. In this paper, we present
a novel Fairness Continual Learning approach to the semantic segmentation
problem. In particular, under the fairness objective, a new fairness continual
learning framework is proposed based on class distributions. Then, a novel
Prototypical Contrastive Clustering loss is proposed to address the
significant challenges in continual learning, i.e., catastrophic forgetting
and background shift. The proposed loss is also shown to be a novel,
generalized learning paradigm of the knowledge distillation commonly used in
continual learning. Moreover, the proposed Conditional Structural Consistency
loss further regularizes the structural constraints of the predicted
segmentation maps. Our approach achieves state-of-the-art performance on three
standard scene understanding benchmarks, i.e., ADE20K, Cityscapes, and Pascal
VOC, while promoting the fairness of the segmentation model.
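As a rough illustration of two ingredients named in the abstract, the PyTorch sketch below combines (i) a prototype-based contrastive clustering term that pulls each pixel embedding toward its class prototype and away from the others, and (ii) an inverse class-frequency weighting standing in for a fairness objective over class distributions. The function names, the weighting scheme, and the temperature value are illustrative assumptions, not the authors' exact losses.

```python
import torch
import torch.nn.functional as F

def prototypical_contrastive_loss(features, labels, prototypes, temperature=0.1):
    # Normalize embeddings and prototypes, score each pixel against every class
    # prototype; cross-entropy pulls a pixel toward its own class prototype and
    # pushes it away from the others (InfoNCE-style clustering).
    features = F.normalize(features, dim=1)               # (N, D) pixel embeddings
    prototypes = F.normalize(prototypes, dim=1)           # (C, D) one prototype per class
    logits = features @ prototypes.t() / temperature      # (N, C) similarities
    return F.cross_entropy(logits, labels, reduction="none")  # (N,) per-pixel loss

def fairness_weights(labels, num_classes, eps=1.0):
    # Inverse-frequency weights so rare (e.g. newly introduced) classes are not
    # drowned out by frequent ones -- a simple stand-in for a fairness objective
    # over the class distribution (an assumption, not the paper's exact scheme).
    counts = torch.bincount(labels, minlength=num_classes).float() + eps
    weights = counts.sum() / (num_classes * counts)       # mean weight is ~1
    return weights[labels]                                # (N,) per-pixel weights

# Usage sketch on random tensors standing in for a segmentation head's output.
N, D, C = 4096, 256, 21
features = torch.randn(N, D, requires_grad=True)          # pixel embeddings
labels = torch.randint(0, C, (N,))                        # ground-truth class per pixel
prototypes = torch.randn(C, D)                            # class prototypes

per_pixel = prototypical_contrastive_loss(features, labels, prototypes)
loss = (fairness_weights(labels, C) * per_pixel).mean()
loss.backward()
```

In a continual-learning setting the prototypes would presumably be maintained across incremental steps (e.g. as running means of class features); the sketch omits that bookkeeping and only shows how the per-pixel terms could be combined.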
Related papers
- Tendency-driven Mutual Exclusivity for Weakly Supervised Incremental Semantic Segmentation [56.1776710527814]
Weakly Incremental Learning for Semantic Segmentation (WILSS) leverages a pre-trained segmentation model to segment new classes using cost-effective and readily available image-level labels.
A prevailing way to solve WILSS is the generation of seed areas for each new class, serving as a form of pixel-level supervision.
We propose an innovative, tendency-driven relationship of mutual exclusivity, meticulously tailored to govern the behavior of the seed areas.
arXiv Detail & Related papers (2024-04-18T08:23:24Z)
- FALCON: Fairness Learning via Contrastive Attention Approach to Continual Semantic Scene Understanding [28.880226459932146]
This paper presents a novel Fairness Learning via Contrastive Attention Approach to continual learning in semantic scene understanding.
We first introduce a new Fairness Contrastive Clustering loss to address the problems of catastrophic forgetting and fairness.
Then, we propose an attention-based visual grammar approach to effectively model the background shift problem and unknown classes.
arXiv Detail & Related papers (2023-11-27T16:07:39Z)
- ICICLE: Interpretable Class Incremental Continual Learning [35.105786309067895]
Interpretable Class-InCremental LEarning (ICICLE) is an exemplar-free method that adopts a prototypical part-based approach.
Our experimental results demonstrate that ICICLE reduces interpretability concept drift and outperforms existing exemplar-free methods in common class-incremental learning settings.
arXiv Detail & Related papers (2023-03-14T11:31:45Z)
- Activating the Discriminability of Novel Classes for Few-shot Segmentation [48.542627940781095]
We propose to activate the discriminability of novel classes explicitly in both the feature encoding stage and the prediction stage for segmentation.
In the prediction stage for segmentation, we learn a Self-Refined Online Foreground-Background classifier (SROFB), which is able to refine itself using the high-confidence pixels of the query image.
arXiv Detail & Related papers (2022-12-02T12:22:36Z)
- Mining Unseen Classes via Regional Objectness: A Simple Baseline for Incremental Segmentation [57.80416375466496]
Incremental or continual learning has been extensively studied for image classification tasks to alleviate catastrophic forgetting.
We propose a simple yet effective method in this paper, named Mining unseen Classes via Regional Objectness (MicroSeg).
Our MicroSeg is based on the assumption that background regions with strong objectness possibly belong to those concepts in the historical or future stages.
In this way, the distribution characteristics of old concepts in the feature space can be better perceived, relieving the catastrophic forgetting caused by the background shift.
arXiv Detail & Related papers (2022-11-13T10:06:17Z)
- Learning What Not to Segment: A New Perspective on Few-Shot Segmentation [63.910211095033596]
Recently, few-shot segmentation (FSS) has been extensively developed.
This paper proposes a fresh and straightforward insight to alleviate the problem.
In light of the unique nature of the proposed approach, we also extend it to a more realistic but challenging setting.
arXiv Detail & Related papers (2022-03-15T03:08:27Z)
- Continual Attentive Fusion for Incremental Learning in Semantic Segmentation [43.98082955427662]
Deep architectures trained with gradient-based techniques suffer from catastrophic forgetting.
We introduce a novel attentive feature distillation approach to mitigate catastrophic forgetting.
We also introduce a novel strategy to account for the background class in the distillation loss, thus preventing biased predictions.
arXiv Detail & Related papers (2022-02-01T14:38:53Z)
- Flip Learning: Erase to Segment [65.84901344260277]
Weakly-supervised segmentation (WSS) can help reduce time-consuming and cumbersome manual annotation.
We propose a novel and general WSS framework called Flip Learning, which only needs box annotations.
Our proposed approach achieves competitive performance and shows great potential to narrow the gap between fully-supervised and weakly-supervised learning.
arXiv Detail & Related papers (2021-08-02T09:56:10Z)
- Continual Semantic Segmentation via Repulsion-Attraction of Sparse and Disentangled Latent Representations [18.655840060559168]
This paper focuses on class incremental continual learning in semantic segmentation.
New categories are made available over time while previous training data is not retained.
The proposed continual learning scheme shapes the latent space to reduce forgetting whilst improving the recognition of novel classes.
arXiv Detail & Related papers (2021-03-10T21:02:05Z)
- Deep Clustering by Semantic Contrastive Learning [67.28140787010447]
We introduce a novel variant called Semantic Contrastive Learning (SCL).
It explores the characteristics of both conventional contrastive learning and deep clustering.
It can amplify the strengths of contrastive learning and deep clustering in a unified approach.
arXiv Detail & Related papers (2021-03-03T20:20:48Z)