Federated Incremental Semantic Segmentation
- URL: http://arxiv.org/abs/2304.04620v1
- Date: Mon, 10 Apr 2023 14:34:23 GMT
- Title: Federated Incremental Semantic Segmentation
- Authors: Jiahua Dong, Duzhen Zhang, Yang Cong, Wei Cong, Henghui Ding, Dengxin Dai
- Abstract summary: Federated learning-based semantic segmentation (FSS) has drawn widespread attention via decentralized training on local clients.
Most FSS models assume categories are fixed in advance and therefore suffer heavy forgetting of old categories in practical applications.
We propose a Forgetting-Balanced Learning model to address heterogeneous forgetting on old classes from both intra-client and inter-client aspects.
- Score: 42.66387280141536
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning-based semantic segmentation (FSS) has drawn widespread
attention via decentralized training on local clients. However, most FSS models
assume categories are fixed in advance and therefore suffer severe forgetting
of old categories in practical applications, where local clients receive new
categories incrementally while having no memory to store old classes.
Moreover, new clients collecting novel classes may join the global training
of FSS, which further exacerbates catastrophic forgetting. To surmount the
above challenges, we propose a Forgetting-Balanced Learning (FBL) model to
address heterogeneous forgetting on old classes from both intra-client and
inter-client aspects. Specifically, under the guidance of pseudo labels
generated via adaptive class-balanced pseudo labeling, we develop a
forgetting-balanced semantic compensation loss and a forgetting-balanced
relation consistency loss to rectify intra-client heterogeneous forgetting of
old categories with background shift. Together, these losses perform balanced
gradient propagation and relation-consistency distillation within local
clients. Moreover, to tackle heterogeneous forgetting from the inter-client
aspect, we propose a task transition
monitor. It can identify new classes under privacy protection and store the
latest old global model for relation distillation. Qualitative experiments
reveal a large improvement of our model over comparison methods. The code is
available at https://github.com/JiahuaDong/FISS.
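As a concrete picture of the pseudo-labeling step, here is a minimal PyTorch-style sketch of adaptive class-balanced pseudo labeling, assuming a frozen old global model with a softmax segmentation head; the per-class threshold rule below is an illustrative stand-in, not the paper's exact formulation.

```python
import torch

def class_balanced_pseudo_labels(old_model, images, num_old_classes,
                                 base_tau=0.7):
    """Pseudo-label pixels with the frozen old global model, using a
    per-class confidence threshold so rare old classes are not drowned
    out by frequent ones (illustrative rule, not the paper's exact one).
    """
    with torch.no_grad():
        probs = torch.softmax(old_model(images), dim=1)  # (B, C, H, W)
    conf, labels = probs.max(dim=1)                      # (B, H, W)

    # Hypothetical balancing: scale the base threshold by each class's
    # mean confidence so under-confident (often rare) classes still
    # receive pseudo labels.
    taus = torch.full((num_old_classes,), base_tau, device=conf.device)
    for c in range(num_old_classes):
        mask = labels == c
        if mask.any():
            taus[c] = base_tau * conf[mask].mean()

    keep = conf >= taus[labels]                # pixel-wise threshold
    ignore = torch.full_like(labels, 255)      # 255 = ignore index
    return torch.where(keep, labels, ignore)
```

Pseudo labels kept this way can then supervise the forgetting-balanced semantic compensation loss on pixels the new task annotates only as background.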
Related papers
- Federated Incremental Named Entity Recognition [38.49410747627772]
Federated Named Entity Recognition (FNER) boosts model training within each local client by aggregating the model updates of decentralized local clients, without sharing their private data.
Existing FNER methods assume that entity types and local clients are fixed in advance, which makes them ineffective in practical applications.
We propose a Local-Global Forgetting Defense (LGFD) model to overcome these challenges.
arXiv Detail & Related papers (2024-11-18T14:53:53Z)
- Tendency-driven Mutual Exclusivity for Weakly Supervised Incremental Semantic Segmentation [56.1776710527814]
Weakly Incremental Learning for Semantic Segmentation (WILSS) leverages a pre-trained segmentation model to segment new classes using cost-effective and readily available image-level labels.
A prevailing way to solve WILSS is the generation of seed areas for each new class, serving as a form of pixel-level supervision.
We propose a tendency-driven mutual-exclusivity relationship tailored to govern the behavior of the seed areas.
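A minimal sketch of how seed areas can be derived from image-level labels, assuming normalized CAM-style class activation maps; the fixed threshold and the winner-take-all assignment are simplified stand-ins for the paper's tendency-driven mutual-exclusivity rule.

```python
import torch

def seed_areas_from_cams(cams, image_labels, tau=0.5):
    """Turn class activation maps into pixel-level seed supervision.

    cams:         (B, C, H, W) normalized activation maps, one per class
    image_labels: (B, C) multi-hot image-level labels
    Each pixel is assigned to at most one new class (a crude form of
    mutual exclusivity); everything else stays 'ignore' (255).
    """
    # Suppress classes absent from the image-level label.
    cams = cams * image_labels[:, :, None, None]
    score, cls = cams.max(dim=1)           # best class per pixel
    ignore = torch.full_like(cls, 255)
    return torch.where(score > tau, cls, ignore)
```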
arXiv Detail & Related papers (2024-04-18T08:23:24Z)
- Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose a new algorithm, FedCSD, which performs class-prototype similarity distillation in a federated framework to align the local and global models.
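A minimal sketch of class-prototype similarity distillation under these assumptions: soft targets come from the similarity between local features and aggregated global class prototypes; the cosine similarity and temperature are illustrative choices, not necessarily FedCSD's exact recipe.

```python
import torch
import torch.nn.functional as F

def csd_loss(local_logits, feats, global_protos, T=2.0):
    """Distill local logits toward soft targets built from feature
    similarity to global class prototypes, pulling the local model
    toward the global model's class layout (sketch only).

    local_logits:  (N, C) logits of the local model
    feats:         (N, D) penultimate features
    global_protos: (C, D) aggregated global class prototypes
    """
    sim = F.normalize(feats, dim=1) @ F.normalize(global_protos, dim=1).T
    target = F.softmax(sim / T, dim=1)               # (N, C) soft targets
    log_pred = F.log_softmax(local_logits / T, dim=1)
    return F.kl_div(log_pred, target, reduction="batchmean") * T * T
```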
arXiv Detail & Related papers (2023-08-20T04:41:01Z)
- Towards Instance-adaptive Inference for Federated Learning [80.38701896056828]
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating local training updates.
In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework.
Our experiments show that our FedIns outperforms state-of-the-art FL algorithms, e.g., a 6.64% improvement against the top-performing method with less than 15% communication cost on Tiny-ImageNet.
arXiv Detail & Related papers (2023-08-11T09:58:47Z)
- FedABC: Targeting Fair Competition in Personalized Federated Learning [76.9646903596757]
Federated learning aims to collaboratively train models without accessing clients' local private data.
We propose a novel and generic personalized federated learning (PFL) framework, Federated Averaging via Binary Classification, dubbed FedABC.
In particular, we adopt the "one-vs-all" training strategy in each client to alleviate the unfair competition between classes.
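A minimal sketch of the one-vs-all idea: each class gets its own binary classifier trained with BCE, so classes a client rarely sees do not compete inside a shared softmax. Loss weighting and negative handling in FedABC may differ from this plain version.

```python
import torch
import torch.nn.functional as F

def one_vs_all_loss(logits, labels, num_classes):
    """One-vs-all objective (sketch): treat each class score as an
    independent binary classifier instead of entries in one softmax.

    logits: (N, C) raw scores, labels: (N,) integer targets
    """
    targets = F.one_hot(labels, num_classes).float()
    return F.binary_cross_entropy_with_logits(logits, targets)
```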
arXiv Detail & Related papers (2023-02-15T03:42:59Z)
- No One Left Behind: Real-World Federated Class-Incremental Learning [111.77681016996202]
The Local-Global Anti-forgetting (LGA) model addresses local and global catastrophic forgetting.
We develop a category-balanced gradient-adaptive compensation loss and a category gradient-induced semantic distillation loss.
It augments perturbed prototype images of new categories collected from local clients via self-supervised prototype augmentation.
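A minimal sketch of the prototype-augmentation step as summarized above: prototype images of new categories are replicated with random perturbations so the global side can rehearse them without raw client data. The Gaussian-noise-plus-flip perturbation is a simple stand-in for LGA's actual self-supervised scheme.

```python
import torch

def augment_prototype_images(proto_images, proto_labels, n_aug=4,
                             noise_std=0.1):
    """Replicate each prototype image with random perturbations
    (hypothetical perturbation choices, for illustration only).

    proto_images: (N, 3, H, W) prototype images of new categories
    proto_labels: (N,)         their class labels
    """
    images, labels = [], []
    for _ in range(n_aug):
        x = proto_images + noise_std * torch.randn_like(proto_images)
        if torch.rand(1).item() < 0.5:
            x = torch.flip(x, dims=[-1])   # random horizontal flip
        images.append(x)
        labels.append(proto_labels.clone())
    return torch.cat(images), torch.cat(labels)
```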
arXiv Detail & Related papers (2023-02-02T06:41:02Z)
- FedFA: Federated Learning with Feature Anchors to Align Features and Classifiers for Heterogeneous Data [8.677832361022809]
Federated learning allows multiple clients to collaboratively train a model without exchanging their data.
Common solutions involve an auxiliary loss to regularize weight divergence or feature inconsistency during local training.
We propose a novel framework named Federated Learning with Feature Anchors (FedFA).
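A minimal sketch of the feature-anchor idea: a shared per-class anchor pulls every client's features toward the same layout, so classifiers stay aligned across heterogeneous data. How FedFA constructs and updates the anchors is not reproduced here.

```python
import torch
import torch.nn.functional as F

def feature_anchor_loss(feats, labels, anchors):
    """Pull local features toward shared class anchors (sketch).

    feats:   (N, D) local features
    labels:  (N,)   integer class labels
    anchors: (C, D) shared class anchors broadcast by the server
    """
    return F.mse_loss(feats, anchors[labels])
```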
arXiv Detail & Related papers (2022-11-17T02:27:44Z)
- Federated Class-Incremental Learning [32.676616274338734]
Federated learning (FL) has attracted growing attention via data-private collaborative training on decentralized clients.
Most existing methods unrealistically assume object classes of the overall framework are fixed over time.
We develop a novel Global-Local Forgetting Compensation (GLFC) model to learn a global class incremental model.
arXiv Detail & Related papers (2022-03-22T05:58:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.