Evolving Knowledge Mining for Class Incremental Segmentation
- URL: http://arxiv.org/abs/2306.02027v2
- Date: Tue, 28 Nov 2023 06:34:14 GMT
- Title: Evolving Knowledge Mining for Class Incremental Segmentation
- Authors: Zhihe Lu, Shuicheng Yan, Xinchao Wang
- Abstract summary: Class Incremental Semantic Segmentation (CISS) has recently attracted increasing attention due to its great significance in real-world applications.
We propose a novel method, Evolving kNowleDge minING (ENDING), employing a frozen backbone.
We evaluate our method on two widely used benchmarks and consistently demonstrate new state-of-the-art performance.
- Score: 113.59611699693092
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Class Incremental Semantic Segmentation (CISS) has recently attracted
increasing attention due to its great significance in real-world applications.
Although existing CISS methods demonstrate remarkable performance, they either
leverage only high-level knowledge (features) while neglecting the rich and
diverse knowledge in low-level features, leading to poor preservation of old
knowledge and weak exploration of new knowledge, or they use multi-level
features for knowledge distillation by retraining a heavy backbone, which is
computationally intensive. In this paper, we investigate, for the first time,
efficient multi-grained knowledge reuse for CISS, and propose a novel method, Evolving
kNowleDge minING (ENDING), employing a frozen backbone. ENDING incorporates two
key modules: evolving fusion and semantic enhancement, for dynamic and
comprehensive exploration of multi-grained knowledge. Evolving fusion is
tailored to extract knowledge from each individual low-level feature using a
personalized lightweight network, which is generated by a meta-net that takes
the high-level feature as input. This design allows knowledge mining and fusion
to evolve as new classes arrive incrementally. In contrast,
semantic enhancement is specifically crafted to aggregate prototype-based
semantics from multi-level features, contributing to an enhanced
representation. We evaluate our method on two widely used benchmarks and
consistently demonstrate new state-of-the-art performance. The code is
available at https://github.com/zhiheLu/ENDING_ISS.
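To make the two modules concrete, here is a minimal PyTorch sketch of how we read them from the abstract; the module names, shapes, and the meta-net design below are our illustrative assumptions, not the authors' implementation (the official code is in the linked repository).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvolvingFusion(nn.Module):
    """Sketch: a meta-net consumes the high-level feature and generates the
    weights of a personalized 1x1 conv that mines knowledge from one
    low-level feature map. (Assumed design, not the official code.)"""

    def __init__(self, low_ch: int, high_ch: int, out_ch: int):
        super().__init__()
        self.low_ch, self.out_ch = low_ch, out_ch
        # Meta-net: pooled high-level feature -> weights of a 1x1 conv.
        self.meta_net = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(high_ch, out_ch * low_ch),
        )

    def forward(self, low_feat, high_feat):
        b, _, h, w = low_feat.shape
        # One personalized kernel per sample in the batch.
        kernels = self.meta_net(high_feat).view(b * self.out_ch, self.low_ch, 1, 1)
        # Grouped conv applies each sample's own kernel (batch as groups).
        x = low_feat.reshape(1, b * self.low_ch, h, w)
        return F.conv2d(x, kernels, groups=b).view(b, self.out_ch, h, w)

class SemanticEnhancement(nn.Module):
    """Sketch: aggregate prototype-based semantics into the feature map via
    soft assignment to learnable class prototypes."""

    def __init__(self, ch: int, num_classes: int):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(num_classes, ch))

    def forward(self, feat):
        b, c, h, w = feat.shape
        flat = feat.flatten(2).transpose(1, 2)               # (B, HW, C)
        assign = F.softmax(flat @ self.prototypes.t(), -1)   # (B, HW, K)
        enhanced = (assign @ self.prototypes).transpose(1, 2)
        return feat + enhanced.view(b, c, h, w)
```

In this reading, only the meta-net, the prototypes, and the decoder would be trained at each incremental step, which is what would let the backbone stay frozen.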
Related papers
- Knowledge Adaptation Network for Few-Shot Class-Incremental Learning [23.90555521006653]
Few-shot class-incremental learning aims to incrementally recognize new classes using a few samples.
One effective way to address this challenge is to construct prototypical evolution classifiers.
However, because representations for new classes are weak and biased, we argue that such a strategy is suboptimal.
arXiv Detail & Related papers (2024-09-18T07:51:38Z)
- FecTek: Enhancing Term Weight in Lexicon-Based Retrieval with Feature Context and Term-level Knowledge [54.61068946420894]
We introduce an innovative method built on FEature Context and TErm-level Knowledge modules.
The Feature Context Module (FCM) is introduced to effectively enrich the feature context representations of term weights.
We also develop a Term-level Knowledge Guidance Module (TKGM) that utilizes term-level knowledge to guide the modeling of term weights.
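As a hedged sketch of how such modules could interact (the layer choices and the use of IDF as the term-level prior are our assumptions, not FecTek's actual design):

```python
import torch
import torch.nn as nn

class TermWeighting(nn.Module):
    """Sketch: an FCM-style context encoder refines token embeddings, and a
    TKGM-style gate injects a term-level prior (e.g., IDF) into the weight."""

    def __init__(self, dim: int, nhead: int = 4):
        super().__init__()
        self.fcm = nn.TransformerEncoderLayer(d_model=dim, nhead=nhead,
                                              batch_first=True)
        self.weight_head = nn.Linear(dim, 1)
        self.tkgm_gate = nn.Linear(1, 1)  # maps the term-level prior to a gate

    def forward(self, tok_emb, term_prior):
        # tok_emb: (B, L, dim); term_prior: (B, L, 1), e.g., normalized IDF.
        ctx = self.fcm(tok_emb)                        # feature context (FCM)
        raw = torch.relu(self.weight_head(ctx))        # raw term weights
        gate = torch.sigmoid(self.tkgm_gate(term_prior))
        return raw * gate                              # knowledge-guided weights
```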
arXiv Detail & Related papers (2024-04-18T12:58:36Z)
- Learning Prompt with Distribution-Based Feature Replay for Few-Shot Class-Incremental Learning [56.29097276129473]
We propose a simple yet effective framework, named Learning Prompt with Distribution-based Feature Replay (LP-DiF).
To prevent the learnable prompt from forgetting old knowledge in a new session, we propose a pseudo-feature replay approach.
When progressing to a new session, pseudo-features are sampled from old-class distributions and combined with training images of the current session to optimize the prompt.
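A minimal sketch of that replay loop, assuming each old class is summarized by a diagonal Gaussian over features (our reading of the summary, not LP-DiF's exact recipe):

```python
import torch

class PseudoFeatureReplay:
    """Keep a diagonal Gaussian per old class and sample pseudo-features
    from it in later sessions. (Assumed formulation.)"""

    def __init__(self):
        self.stats = {}  # class_id -> (mean, std), each of shape (D,)

    def register_class(self, class_id: int, feats: torch.Tensor):
        # feats: (N, D) features of one class from the session it was learned in.
        self.stats[class_id] = (feats.mean(0), feats.std(0) + 1e-6)

    def sample(self, class_id: int, n: int) -> torch.Tensor:
        mean, std = self.stats[class_id]
        return mean + std * torch.randn(n, mean.numel())

# In a new session, mix sampled old-class pseudo-features with real features
# of the current session's images when optimizing the prompt, e.g.:
#   loss = loss_new(cur_feats, cur_labels) + loss_old(replay.sample(c, 32), c)
```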
arXiv Detail & Related papers (2024-01-03T07:59:17Z) - MultIOD: Rehearsal-free Multihead Incremental Object Detector [17.236182938227163]
We propose MultIOD, a class-incremental object detector based on CenterNet.
We employ transfer learning between classes learned initially and those learned incrementally to tackle catastrophic forgetting.
Results show that our method outperforms state-of-the-art methods on two Pascal VOC datasets.
arXiv Detail & Related papers (2023-09-11T09:32:45Z) - Leveraging Old Knowledge to Continually Learn New Classes in Medical
Images [16.730335437094592]
We focus on how old knowledge can be leveraged to learn new classes without catastrophic forgetting.
Our solution achieves superior performance over state-of-the-art baselines in terms of class accuracy and forgetting.
arXiv Detail & Related papers (2023-03-24T02:10:53Z) - SHELS: Exclusive Feature Sets for Novelty Detection and Continual
Learning Without Class Boundaries [22.04165296584446]
We introduce a Sparse High-level-Exclusive, Low-level-Shared feature representation (SHELS).
SHELS encourages learning exclusive sets of high-level features and essential, shared low-level features.
We show that using SHELS for novelty detection results in statistically significant improvements over state-of-the-art OOD detection approaches.
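One speculative way to encode the "exclusive high-level features" objective as a regularizer (not the paper's actual losses; it also assumes every class appears in the batch):

```python
import torch
import torch.nn.functional as F

def exclusivity_penalty(high_feats: torch.Tensor, labels: torch.Tensor,
                        num_classes: int) -> torch.Tensor:
    """Penalize overlap between per-class mean activation patterns so that
    classes come to rely on disjoint sets of high-level units."""
    usage = torch.stack([
        high_feats[labels == c].abs().mean(0) for c in range(num_classes)
    ])                                          # (C, D) per-class usage
    usage = F.normalize(usage, dim=1)
    overlap = usage @ usage.t()                 # pairwise cosine overlap
    off_diag = overlap - torch.diag(torch.diag(overlap))
    return off_diag.sum() / (num_classes * (num_classes - 1))
```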
arXiv Detail & Related papers (2022-06-28T03:09:55Z) - Energy-based Latent Aligner for Incremental Learning [83.0135278697976]
Deep learning models tend to forget their earlier knowledge while incrementally learning new tasks.
This behavior emerges because the parameter updates optimized for the new tasks may not align well with the updates suitable for older tasks.
We propose ELI: Energy-based Latent Aligner for Incremental Learning.
arXiv Detail & Related papers (2022-03-28T17:57:25Z) - Preserving Earlier Knowledge in Continual Learning with the Help of All
Previous Feature Extractors [63.21036904487014]
Continually learning new knowledge over time is a desirable capability for intelligent systems that must recognize more and more classes of objects.
We propose a simple yet effective fusion mechanism that incorporates all previously learned feature extractors into the intelligent model.
Experiments on multiple classification tasks show that the proposed approach can effectively reduce the forgetting of old knowledge, achieving state-of-the-art continual learning performance.
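A minimal sketch of such a fusion mechanism, assuming frozen copies of every earlier extractor and a single learned fusion classifier (our illustrative reading, not the paper's exact design):

```python
import torch
import torch.nn as nn

class AllExtractorFusion(nn.Module):
    """Concatenate features from all previously learned (frozen) extractors
    plus the current one, then classify on the fused representation."""

    def __init__(self, extractors, feat_dim: int, num_classes: int):
        super().__init__()
        self.extractors = nn.ModuleList(extractors)
        for ext in self.extractors[:-1]:        # freeze all but the newest
            for p in ext.parameters():
                p.requires_grad_(False)
        self.classifier = nn.Linear(feat_dim * len(extractors), num_classes)

    def forward(self, x):
        feats = [ext(x) for ext in self.extractors]
        return self.classifier(torch.cat(feats, dim=1))
```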
arXiv Detail & Related papers (2021-04-28T07:49:24Z)
- Towards a Universal Continuous Knowledge Base [49.95342223987143]
We propose a method for building a continuous knowledge base that can store knowledge imported from multiple neural networks.
We import the knowledge from multiple models into the knowledge base, from which the fused knowledge is exported back to a single model.
Experiments on text classification show promising results.
arXiv Detail & Related papers (2020-12-25T12:27:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.