No One Left Behind: Real-World Federated Class-Incremental Learning
- URL: http://arxiv.org/abs/2302.00903v3
- Date: Thu, 16 Nov 2023 03:53:45 GMT
- Title: No One Left Behind: Real-World Federated Class-Incremental Learning
- Authors: Jiahua Dong, Hongliu Li, Yang Cong, Gan Sun, Yulun Zhang, Luc Van Gool
- Abstract summary: The Local-Global Anti-forgetting (LGA) model addresses local and global catastrophic forgetting.
We develop a category-balanced gradient-adaptive compensation loss and a category gradient-induced semantic distillation loss.
A proxy server augments perturbed prototype images of new categories collected from local clients via self-supervised prototype augmentation.
- Score: 111.77681016996202
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is a popular collaborative training framework that aggregates the model parameters of decentralized local clients. However, most FL methods unreasonably assume that the data categories of the FL framework are known and fixed in advance. Moreover, new local clients that collect novel categories unseen by other clients may join FL training irregularly. These issues cause the global model to suffer catastrophic forgetting on old categories when local clients receive new categories consecutively under limited memory for storing old categories. To tackle the above issues, we propose a novel Local-Global Anti-forgetting (LGA) model. It ensures that no local clients are left behind as they learn new classes continually, by addressing both local and global catastrophic forgetting. Specifically, to surmount local forgetting caused by class imbalance at local clients, we develop a category-balanced gradient-adaptive compensation loss and a category gradient-induced semantic distillation loss. They balance the heterogeneous forgetting speeds of hard-to-forget and easy-to-forget old categories, while ensuring consistent class relations within different tasks. Moreover, a proxy server is designed to tackle global forgetting caused by Non-IID class imbalance between different clients. It augments perturbed prototype images of new categories collected from local clients via self-supervised prototype augmentation, thus improving the robustness of choosing the best old global model for the local-side semantic distillation loss. Experiments on representative datasets verify the superior performance of our model against comparison methods. The code is available at https://github.com/JiahuaDong/LGA.
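For intuition, the sketch below shows one way the category-balanced, gradient-adaptive idea could be realised as a reweighted cross-entropy in PyTorch. The function `gradient_adaptive_ce` and its per-category normalisation are assumptions made for illustration only and are not taken from the LGA implementation.

```python
import torch
import torch.nn.functional as F

def gradient_adaptive_ce(logits, targets, num_classes):
    """Cross-entropy reweighted per category by the mean gradient magnitude
    of that category in the batch, so hard-to-forget and easy-to-forget
    categories are pulled toward a common forgetting speed.
    Illustrative only; not the exact LGA compensation loss."""
    probs = F.softmax(logits, dim=1)                            # (B, C)
    one_hot = F.one_hot(targets, num_classes).float()           # (B, C)
    # |p_y - 1| is the softmax-gradient magnitude w.r.t. the target logit.
    grad_mag = (probs - one_hot).gather(1, targets.unsqueeze(1)).abs().squeeze(1)
    # Mean gradient magnitude per category present in the batch.
    class_grad = torch.zeros(num_classes).scatter_add_(0, targets, grad_mag)
    class_cnt = torch.zeros(num_classes).scatter_add_(0, targets, torch.ones_like(grad_mag))
    class_scale = class_grad / class_cnt.clamp(min=1.0)
    # Down-weight categories whose gradients are already large, up-weight the rest.
    weights = 1.0 / class_scale[targets].clamp(min=1e-6)
    ce = F.cross_entropy(logits, targets, reduction="none")
    return (weights * ce).mean()

# Toy usage.
logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = gradient_adaptive_ce(logits, targets, num_classes=10)
```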
Related papers
- Regularizing and Aggregating Clients with Class Distribution for Personalized Federated Learning [0.8287206589886879]
Class-wise Federated Averaging (cwFedAVG) performs federated averaging class-wise, creating multiple global models, one per class, on the server.
Each local model integrates these global models weighted by its estimated local class distribution, derived from the L2-norms of deep network weights.
We also design a Weight Distribution Regularizer (WDR) to further enhance the accuracy of the estimated local class distribution.
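A minimal sketch of the two steps described above, under the assumption that the local class distribution is read off the L2-norms of the classifier's per-class weight rows; the helper names are hypothetical and not taken from the cwFedAVG code.

```python
import numpy as np

def estimate_class_distribution(classifier_weights):
    """Estimate a client's class distribution from the L2-norms of its
    final-layer weight rows (one row per class); the normalisation is a
    guess at what the summary above describes."""
    norms = np.linalg.norm(classifier_weights, axis=1)
    return norms / norms.sum()

def integrate_global_models(per_class_globals, class_dist):
    """Blend the per-class global models into one personalized local model,
    weighted by the estimated local class distribution."""
    return sum(p * g for p, g in zip(class_dist, per_class_globals))

# Toy example: 3 classes, each "model" flattened to a vector of 5 parameters.
rng = np.random.default_rng(0)
local_classifier = rng.normal(size=(3, 5))
per_class_globals = [rng.normal(size=5) for _ in range(3)]
dist = estimate_class_distribution(local_classifier)
personalized = integrate_global_models(per_class_globals, dist)
```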
arXiv Detail & Related papers (2024-06-12T01:32:24Z)
- Federated Skewed Label Learning with Logits Fusion [23.062650578266837]
Federated learning (FL) aims to collaboratively train a shared model across multiple clients without transmitting their local data.
We propose FedBalance, which corrects the optimization bias among local models by calibrating their logits.
Our method gains 13% higher average accuracy compared with state-of-the-art methods.
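For intuition only, the snippet below shows a generic logit-calibration step for skewed local labels (logit adjustment with the local class prior); it is a common baseline technique and is not claimed to be the FedBalance logits-fusion procedure.

```python
import torch
import torch.nn.functional as F

def calibrated_ce(logits, targets, local_class_counts, tau=1.0):
    """Generic logit adjustment under skewed local labels: add the log of
    the local class prior to the logits before cross-entropy so rare classes
    are not drowned out. Illustrative of logit calibration in general, not
    the specific FedBalance procedure."""
    prior = local_class_counts.float() / local_class_counts.sum()
    adjusted = logits + tau * torch.log(prior + 1e-12)
    return F.cross_entropy(adjusted, targets)

# Toy usage with a heavily skewed local label distribution.
counts = torch.tensor([500, 30, 5])
logits = torch.randn(4, 3)
targets = torch.randint(0, 3, (4,))
loss = calibrated_ce(logits, targets, counts)
```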
arXiv Detail & Related papers (2023-11-14T14:37:33Z)
- Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose a new algorithm, FedCSD, which performs class prototype similarity distillation in a federated framework to align the local and global models.
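A schematic sketch of what a class-prototype similarity distillation term could look like, assuming feature vectors from the local and global models and one prototype per class; the loss below is a plausible reading of the summary, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def prototype_similarity_distillation(local_feats, global_feats, prototypes, T=2.0):
    """Match the local model's feature-to-class-prototype similarity
    distribution to the global model's (softened KL divergence). A schematic
    reading of the one-line summary above, not FedCSD's exact loss."""
    protos = F.normalize(prototypes, dim=1)
    sim_local = F.normalize(local_feats, dim=1) @ protos.T
    sim_global = F.normalize(global_feats, dim=1) @ protos.T
    log_p_local = F.log_softmax(sim_local / T, dim=1)
    p_global = F.softmax(sim_global / T, dim=1)
    return F.kl_div(log_p_local, p_global, reduction="batchmean") * (T * T)

# Toy usage: 8 samples, 64-d features, 10 class prototypes.
loss = prototype_similarity_distillation(
    torch.randn(8, 64), torch.randn(8, 64), torch.randn(10, 64))
```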
arXiv Detail & Related papers (2023-08-20T04:41:01Z)
- Towards Instance-adaptive Inference for Federated Learning [80.38701896056828]
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating local training.
In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework.
Our experiments show that FedIns outperforms state-of-the-art FL algorithms, e.g., a 6.64% improvement over the top-performing method with less than 15% of the communication cost on Tiny-ImageNet.
arXiv Detail & Related papers (2023-08-11T09:58:47Z)
- Federated Incremental Semantic Segmentation [42.66387280141536]
Federated learning-based semantic segmentation (FSS) has drawn widespread attention via decentralized training on local clients.
Most FSS models assume categories are fixed in advance, and thus suffer heavy forgetting of old categories in practical applications.
We propose a Forgetting-Balanced Learning model to address heterogeneous forgetting on old classes from both intra-client and inter-client aspects.
arXiv Detail & Related papers (2023-04-10T14:34:23Z)
- Integrating Local Real Data with Global Gradient Prototypes for Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
The data samples usually follow a long-tailed distribution in the real world, and FL on the decentralized and long-tailed data yields a poorly-behaved global model.
In this work, we integrate the local real data with the global gradient prototypes to form the local balanced datasets.
arXiv Detail & Related papers (2023-01-25T03:18:10Z)
- Federated Class-Incremental Learning [32.676616274338734]
Federated learning (FL) has attracted growing attention via data-private collaborative training on decentralized clients.
Most existing methods unrealistically assume that the object classes of the overall framework are fixed over time.
We develop a novel Global-Local Forgetting Compensation (GLFC) model to learn a global class incremental model.
arXiv Detail & Related papers (2022-03-22T05:58:44Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- A Bayesian Federated Learning Framework with Online Laplace Approximation [144.7345013348257]
Federated learning allows multiple clients to collaboratively learn a globally shared model.
We propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server side.
We achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
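As a rough sketch of the underlying operation, the snippet below fuses diagonal Gaussian client posteriors by a product of Gaussians, which is the textbook form such a Laplace-approximation aggregation could take; it is not claimed to match the paper's exact client- and server-side updates.

```python
import numpy as np

def aggregate_gaussian_posteriors(means, variances):
    """Fuse diagonal Gaussian client posteriors N(mean_k, var_k) by a
    product of Gaussians: global precision = sum of client precisions,
    global mean = precision-weighted average of client means. A textbook
    sketch of Laplace-style posterior aggregation, not the paper's exact
    server update."""
    precisions = [1.0 / v for v in variances]
    global_precision = sum(precisions)
    global_mean = sum(p * m for p, m in zip(precisions, means)) / global_precision
    return global_mean, 1.0 / global_precision

# Toy example: two clients, four parameters each.
m1, v1 = np.array([0.1, 0.2, 0.0, -0.3]), np.array([0.5, 0.1, 0.2, 0.4])
m2, v2 = np.array([0.0, 0.4, 0.1, -0.1]), np.array([0.3, 0.2, 0.1, 0.6])
mu_g, var_g = aggregate_gaussian_posteriors([m1, m2], [v1, v2])
```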
arXiv Detail & Related papers (2021-02-03T08:36:58Z)