Federated Class-Incremental Learning
- URL: http://arxiv.org/abs/2203.11473v1
- Date: Tue, 22 Mar 2022 05:58:44 GMT
- Title: Federated Class-Incremental Learning
- Authors: Jiahua Dong, Lixu Wang, Zhen Fang, Gan Sun, Shichao Xu, Xiao Wang, Qi Zhu
- Abstract summary: Federated learning (FL) has attracted growing attention via data-private collaborative training on decentralized clients.
Most existing methods unrealistically assume that the object classes of the overall framework are fixed over time.
We develop a novel Global-Local Forgetting Compensation (GLFC) model to learn a global class-incremental model.
- Score: 32.676616274338734
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Federated learning (FL) has attracted growing attention via data-private
collaborative training on decentralized clients. However, most existing methods
unrealistically assume that the object classes of the overall framework are fixed
over time. This makes the global model suffer from significant catastrophic forgetting
on old classes in real-world scenarios, where local clients often collect new
classes continuously and have very limited storage memory to store old classes.
Moreover, new clients with unseen new classes may participate in the FL
training, further aggravating the catastrophic forgetting of the global model.
To address these challenges, we develop a novel Global-Local Forgetting
Compensation (GLFC) model to learn a global class-incremental model that
alleviates catastrophic forgetting from both local and global
perspectives. Specifically, to address local forgetting caused by class
imbalance at the local clients, we design a class-aware gradient compensation
loss and a class-semantic relation distillation loss to balance the forgetting
of old classes and distill consistent inter-class relations across tasks. To
tackle the global forgetting caused by the non-i.i.d. class imbalance across
clients, we propose a proxy server that selects the best old global model to
assist the local relation distillation. Moreover, a prototype gradient-based
communication mechanism is developed to protect privacy. Our model outperforms
state-of-the-art methods by 4.4%-15.1% in terms of average accuracy on
representative benchmark datasets.
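To make the two local objectives concrete, here is a minimal PyTorch-style sketch, not the authors' implementation: an inverse-frequency reweighted cross-entropy stands in for the class-aware gradient compensation loss, and a temperature-softened KL term against a stored old model stands in for the class-semantic relation distillation loss. The function name, the reweighting scheme, and the hyperparameters T and lam are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def glfc_style_local_loss(logits, old_logits, targets, class_counts,
                          T=2.0, lam=1.0):
    # Class-aware compensation (illustrative): weight the cross-entropy
    # inversely to local class frequency so gradients on rare old classes
    # are not drowned out by the newly collected classes.
    weights = class_counts.sum() / (len(class_counts) *
                                    class_counts.clamp(min=1).float())
    ce = F.cross_entropy(logits, targets, weight=weights)

    # Relation distillation (illustrative): match the temperature-softened
    # predictions of the frozen old model to keep inter-class relations
    # consistent across tasks.
    kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                  F.softmax(old_logits / T, dim=1),
                  reduction="batchmean") * (T * T)
    return ce + lam * kd
```

Here `class_counts` would be the per-class sample counts observed at the client for the current task, and `old_logits` would come from a frozen snapshot of the model taken before the new classes arrived; both names are hypothetical.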
Related papers
- Recovering Global Data Distribution Locally in Federated Learning [7.885010255812708]
Federated Learning (FL) is a distributed machine learning paradigm that enables collaboration among multiple clients.
A major challenge in FL is label imbalance, where a client may exclusively possess certain classes while having numerous minority classes and missing classes.
We propose a novel approach, ReGL, to address this challenge; its key idea is to Recover the Global data distribution Locally.
arXiv Detail & Related papers (2024-09-21T08:35:04Z)
- Regularizing and Aggregating Clients with Class Distribution for Personalized Federated Learning [0.8287206589886879]
Class-wise Federated Averaging (cwFedAVG) performs federated averaging class-wise, creating multiple global models, one per class, on the server.
Each local model integrates these global models weighted by its estimated local class distribution, derived from the L2-norms of deep network weights.
We also design a Weight Distribution Regularizer (WDR) to further improve the accuracy of the estimated local class distribution.
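A rough sketch of this mechanism, assuming the local class distribution is estimated from the L2-norms of the final classifier layer's per-class weight rows; the function names and the blending rule are illustrative, not the paper's exact procedure:

```python
import torch

def estimate_class_distribution(classifier_weight):
    # One row of the final linear layer per class: larger row norms tend to
    # reflect more local samples of that class.
    norms = classifier_weight.norm(dim=1)        # shape [num_classes]
    return norms / norms.sum()                   # normalize to a distribution

def mix_global_models(per_class_globals, class_dist):
    # Blend the per-class global models (a list of state_dicts) into one
    # local initialization, weighting each by the estimated class share.
    mixed = {k: torch.zeros_like(v, dtype=torch.float)
             for k, v in per_class_globals[0].items()}
    for state, w in zip(per_class_globals, class_dist.tolist()):
        for k, v in state.items():
            mixed[k] += w * v.float()
    return mixed
```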
arXiv Detail & Related papers (2024-06-12T01:32:24Z)
- Tunable Soft Prompts are Messengers in Federated Learning [55.924749085481544]
Federated learning (FL) enables multiple participants to collaboratively train machine learning models using decentralized data sources.
The lack of model privacy protection in FL has become a challenge that cannot be neglected.
We propose a novel FL training approach that accomplishes information exchange among participants via tunable soft prompts.
arXiv Detail & Related papers (2023-11-12T11:01:10Z)
- Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose a new algorithm, FedCSD, which performs class-prototype similarity distillation in a federated framework to align the local and global models.
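A hedged sketch of what such a distillation term could look like, comparing local and global features against shared class prototypes; this is inspired by the summary above, not FedCSD's exact loss:

```python
import torch
import torch.nn.functional as F

def prototype_similarity_distillation(local_feats, global_feats, prototypes,
                                      T=2.0):
    # Cosine similarity of each sample's features to every class prototype.
    sim_local = F.normalize(local_feats, dim=1) @ F.normalize(prototypes, dim=1).T
    sim_global = F.normalize(global_feats, dim=1) @ F.normalize(prototypes, dim=1).T
    # Pull the local similarity distribution toward the global model's,
    # aligning the two models at the level of inter-class relations.
    return F.kl_div(F.log_softmax(sim_local / T, dim=1),
                    F.softmax(sim_global / T, dim=1),
                    reduction="batchmean") * (T * T)
```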
arXiv Detail & Related papers (2023-08-20T04:41:01Z)
- No One Left Behind: Real-World Federated Class-Incremental Learning [111.77681016996202]
The Local-Global Anti-forgetting (LGA) model addresses both local and global catastrophic forgetting.
We develop a category-balanced gradient-adaptive compensation loss and a category gradient-induced semantic distillation loss.
It augments perturbed prototype images of new categories collected from local clients via self-supervised prototype augmentation.
arXiv Detail & Related papers (2023-02-02T06:41:02Z)
- Integrating Local Real Data with Global Gradient Prototypes for Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
Real-world data samples usually follow a long-tailed distribution, and FL on decentralized, long-tailed data yields a poorly behaved global model.
In this work, we integrate the local real data with the global gradient prototypes to form the local balanced datasets.
arXiv Detail & Related papers (2023-01-25T03:18:10Z)
- Tackling Data Heterogeneity in Federated Learning with Class Prototypes [44.746340839025194]
We propose FedNH, a novel method that improves the local models' performance for both personalization and generalization.
We show that imposing uniformity on class prototypes helps combat prototype collapse, while infusing class semantics improves local models.
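One way to impose such uniformity, sketched under the assumption that class prototypes live on the unit hypersphere and classification is by scaled cosine similarity; FedNH's actual construction may enforce stronger separation:

```python
import torch
import torch.nn.functional as F

def init_uniform_prototypes(num_classes, dim, seed=0):
    # Gaussian draws projected onto the unit sphere are uniformly
    # distributed on it; keeping these prototypes fixed prevents them from
    # collapsing toward each other under heterogeneous client data.
    g = torch.Generator().manual_seed(seed)
    return F.normalize(torch.randn(num_classes, dim, generator=g), dim=1)

def prototype_logits(features, prototypes, scale=20.0):
    # Classify by scaled cosine similarity to the shared prototypes.
    return scale * F.normalize(features, dim=1) @ prototypes.T
```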
arXiv Detail & Related papers (2022-12-06T05:15:38Z)
- A Bayesian Federated Learning Framework with Online Laplace Approximation [144.7345013348257]
Federated learning allows multiple clients to collaboratively learn a globally shared model.
We propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server sides.
We achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
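The server-side step can be sketched as a product of diagonal Gaussian posteriors, assuming each client reports a posterior mean and a positive diagonal precision (e.g., a diagonal Fisher estimate plus a prior term); this illustrates the idea, not the paper's exact algorithm:

```python
import torch

def aggregate_laplace_posteriors(means, precisions):
    # Product of diagonal Gaussians: precisions add, and the mean is the
    # precision-weighted average of the client means.
    total_precision = sum(precisions)
    global_mean = sum(p * m for p, m in zip(precisions, means)) / total_precision
    return global_mean, total_precision

# Hypothetical usage with two clients and a 3-parameter model:
means = [torch.tensor([0.1, 0.2, 0.3]), torch.tensor([0.3, 0.0, 0.3])]
precs = [torch.tensor([2.0, 1.0, 1.0]), torch.tensor([1.0, 1.0, 3.0])]
mu, prec = aggregate_laplace_posteriors(means, precs)
```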
arXiv Detail & Related papers (2021-02-03T08:36:58Z)
- Think Locally, Act Globally: Federated Learning with Local and Global Representations [92.68484710504666]
Federated learning is a method of training models on private data distributed over multiple devices.
We propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices.
We also evaluate on the task of personalized mood prediction from real-world mobile data where privacy is key.
arXiv Detail & Related papers (2020-01-06T12:40:21Z)