CoMBO: Conflict Mitigation via Branched Optimization for Class Incremental Segmentation
- URL: http://arxiv.org/abs/2504.04156v1
- Date: Sat, 05 Apr 2025 12:34:51 GMT
- Title: CoMBO: Conflict Mitigation via Branched Optimization for Class Incremental Segmentation
- Authors: Kai Fang, Anqi Zhang, Guangyu Gao, Jianbo Jiao, Chi Harold Liu, Yunchao Wei,
- Abstract summary: Effective Class Incremental Segmentation (CIS) requires simultaneously mitigating catastrophic forgetting and ensuring sufficient plasticity to integrate new classes. We introduce a novel approach, Conflict Mitigation via Branched Optimization (CoMBO). Within this approach, we present the Query Conflict Reduction module, designed to explicitly refine queries for new classes.
- Score: 75.35841972192684
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Effective Class Incremental Segmentation (CIS) requires simultaneously mitigating catastrophic forgetting and ensuring sufficient plasticity to integrate new classes. This inherent conflict often leads to a back-and-forth trade-off, turning the objective into finding a balance between the performance on previous (old) and incremental (new) classes. To address this conflict, we introduce a novel approach, Conflict Mitigation via Branched Optimization (CoMBO). Within this approach, we present the Query Conflict Reduction module, designed to explicitly refine queries for new classes through lightweight, class-specific adapters. This module provides an additional branch for the acquisition of new classes while preserving the original queries for distillation. Moreover, we develop two strategies that further mitigate the conflict following the branched structure, i.e., Half-Learning Half-Distillation (HDHL) over classification probabilities and Importance-Based Knowledge Distillation (IKD) over query features. HDHL selectively learns the classification probabilities of queries that match the ground truth of new classes, while aligning unmatched ones to the corresponding old probabilities, thus retaining old knowledge while absorbing new classes through learning from negative samples. Meanwhile, IKD assesses the importance of queries based on their matching degree to old classes, prioritizing the distillation of important features and allowing less critical features to evolve. Extensive experiments in Class Incremental Panoptic and Semantic Segmentation settings demonstrate the superior performance of CoMBO. Project page: https://guangyu-ryan.github.io/CoMBO.
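As a concrete illustration, the HDHL rule above can be sketched in a few lines: queries matched to new-class ground truth contribute a cross-entropy (learning) term, while unmatched queries are distilled toward the old model's probabilities via KL divergence. This is a minimal, list-based sketch under assumed inputs, not the authors' implementation; the function name and data layout are hypothetical.

```python
import math

def hdhl_loss(probs_new, probs_old, matched_gt):
    """Half-Learning Half-Distillation over classification probabilities.

    probs_new : per-query class probabilities from the current model
    probs_old : corresponding probabilities from the frozen old model
    matched_gt: per-query ground-truth class index for queries matched
                to a new class, or None for unmatched queries
    """
    total = 0.0
    for p_new, p_old, gt in zip(probs_new, probs_old, matched_gt):
        if gt is not None:
            # learning half: cross-entropy against the new-class label
            total += -math.log(p_new[gt])
        else:
            # distillation half: KL divergence toward the old model,
            # so unmatched queries retain the old knowledge
            total += sum(q * math.log(q / p)
                         for q, p in zip(p_old, p_new) if q > 0)
    return total / len(probs_new)
```

With identical old and new distributions the distillation half is exactly zero, so only queries matched to new-class ground truth drive the update.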
Related papers
- Self-Classification Enhancement and Correction for Weakly Supervised Object Detection [113.51483527300496]
Weakly supervised object detection (WSOD) has attracted much attention due to its low labeling cost. In this work, we introduce a novel WSOD framework to ameliorate two key issues. First, we propose a self-classification enhancement module that integrates intra-class binary classification (ICBC) to bridge the gap between the two distinct MCC tasks. Second, we propose a self-classification correction algorithm for inference, which combines the results of both MCC tasks to effectively reduce mis-classified predictions.
arXiv Detail & Related papers (2025-05-22T06:45:58Z) - Adaptive Weighted Parameter Fusion with CLIP for Class-Incremental Learning [12.67816343247008]
Class-incremental learning enables the model to incrementally absorb knowledge from new classes. When the model is optimized on new classes, the knowledge of previous classes is inevitably erased, leading to catastrophic forgetting.
arXiv Detail & Related papers (2025-03-25T09:51:04Z) - I2CANSAY: Inter-Class Analogical Augmentation and Intra-Class Significance Analysis for Non-Exemplar Online Task-Free Continual Learning [42.608860809847236]
Online task-free continual learning (OTFCL) is a more challenging variant of continual learning.
Existing methods rely on a memory buffer composed of old samples to prevent forgetting.
We propose a novel framework called I2CANSAY that gets rid of the dependence on memory buffers and efficiently learns the knowledge of new data from one-shot samples.
arXiv Detail & Related papers (2024-04-21T08:28:52Z) - Tendency-driven Mutual Exclusivity for Weakly Supervised Incremental Semantic Segmentation [56.1776710527814]
Weakly Incremental Learning for Semantic Segmentation (WILSS) leverages a pre-trained segmentation model to segment new classes using cost-effective and readily available image-level labels.
A prevailing way to solve WILSS is the generation of seed areas for each new class, serving as a form of pixel-level supervision.
We propose an innovative, tendency-driven relationship of mutual exclusivity, meticulously tailored to govern the behavior of the seed areas.
arXiv Detail & Related papers (2024-04-18T08:23:24Z) - Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning [65.57123249246358]
We propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL.
We train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces.
Our prototype complement strategy synthesizes old classes' new features without using any old class instance.
arXiv Detail & Related papers (2024-03-18T17:58:13Z) - Balanced Classification: A Unified Framework for Long-Tailed Object Detection [74.94216414011326]
Conventional detectors suffer from performance degradation when dealing with long-tailed data due to a classification bias towards the majority head categories.
We introduce a unified framework called BAlanced CLassification (BACL), which enables adaptive rectification of inequalities caused by disparities in category distribution.
BACL consistently achieves performance improvements across various datasets with different backbones and architectures.
arXiv Detail & Related papers (2023-08-04T09:11:07Z) - Federated Incremental Semantic Segmentation [42.66387280141536]
Federated learning-based semantic segmentation (FSS) has drawn widespread attention via decentralized training on local clients.
Most FSS models assume categories are fixed in advance, thus heavily undergoing forgetting on old categories in practical applications.
We propose a Forgetting-Balanced Learning model to address heterogeneous forgetting on old classes from both intra-client and inter-client aspects.
arXiv Detail & Related papers (2023-04-10T14:34:23Z) - Conflict-Based Cross-View Consistency for Semi-Supervised Semantic Segmentation [34.97083511196799]
Semi-supervised semantic segmentation (SSS) has recently gained increasing research interest.
Current methods often suffer from the confirmation bias from the pseudo-labelling process.
We propose a new conflict-based cross-view consistency (CCVC) method based on a two-branch co-training framework.
arXiv Detail & Related papers (2023-03-02T14:02:16Z) - Learning "O" Helps for Learning More: Handling the Concealed Entity Problem for Class-incremental NER [23.625741716498037]
"Unlabeled Entity Problem" leads to severe confusion between "O" and entities.
We propose an entity-aware contrastive learning method that adaptively detects entity clusters in "O"
We introduce a more realistic and challenging benchmark for class-incremental NER.
arXiv Detail & Related papers (2022-10-10T13:26:45Z) - Incremental Few-Shot Learning via Implanting and Compressing [13.122771115838523]
Incremental Few-Shot Learning requires a model to continually learn novel classes from only a few examples.
We propose a two-step learning strategy referred to as Implanting and Compressing.
Specifically, in the Implanting step, we propose to mimic the data distribution of novel classes with the assistance of the data-abundant base set.
In the Compressing step, we adapt the feature extractor to precisely represent each novel class, enhancing intra-class compactness.
arXiv Detail & Related papers (2022-03-19T11:04:43Z) - Learning What Not to Segment: A New Perspective on Few-Shot Segmentation [63.910211095033596]
Recently few-shot segmentation (FSS) has been extensively developed.
This paper proposes a fresh and straightforward insight to alleviate the problem.
In light of the unique nature of the proposed approach, we also extend it to a more realistic but challenging setting.
arXiv Detail & Related papers (2022-03-15T03:08:27Z) - Few-Shot Object Detection via Association and DIscrimination [83.8472428718097]
Few-shot object detection via Association and DIscrimination builds up a discriminative feature space for each novel class with two integral steps.
Experiments on Pascal VOC and MS-COCO datasets demonstrate FADI achieves new SOTA performance, significantly improving the baseline in any shot/split by +18.7.
arXiv Detail & Related papers (2021-11-23T05:04:06Z) - Revisiting Deep Local Descriptor for Improved Few-Shot Classification [56.74552164206737]
We show how one can improve the quality of embeddings by leveraging Dense Classification and Attentive Pooling.
We suggest pooling feature maps by applying attentive pooling instead of the widely used global average pooling (GAP) to prepare embeddings for few-shot classification.
arXiv Detail & Related papers (2021-03-30T00:48:28Z)
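For intuition, the attentive pooling idea above can be sketched as follows: score each spatial feature against a learned attention vector, softmax the scores, and return the weighted sum of features rather than the uniform GAP average. This is a simplified stand-alone sketch with a hypothetical function and hand-set weights, not the paper's code.

```python
import math

def attentive_pool(feature_map, attn_vec):
    """Pool a list of C-dim spatial features with learned attention
    weights, instead of a uniform global-average-pooling (GAP) mean."""
    # score each spatial location by its similarity to the attention vector
    scores = [sum(f * a for f, a in zip(feat, attn_vec)) for feat in feature_map]
    # numerically stable softmax over spatial locations
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # weighted sum of features per channel
    dim = len(feature_map[0])
    return [sum(w * feat[d] for w, feat in zip(weights, feature_map))
            for d in range(dim)]
```

With a zero attention vector all weights are uniform and the result reduces exactly to GAP, which makes the relationship between the two pooling schemes easy to see.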
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.