Addressing Spatial-Temporal Data Heterogeneity in Federated Continual Learning via Tail Anchor
- URL: http://arxiv.org/abs/2412.18355v1
- Date: Tue, 24 Dec 2024 11:35:40 GMT
- Title: Addressing Spatial-Temporal Data Heterogeneity in Federated Continual Learning via Tail Anchor
- Authors: Hao Yu, Xin Yang, Le Zhang, Hanlin Gu, Tianrui Li, Lixin Fan, Qiang Yang
- Abstract summary: Federated continual learning (FCL) allows each client to continually update its knowledge from task streams. We propose Federated Tail Anchor (FedTA) to mix a trainable Tail Anchor with the frozen output features to adjust their position in the feature space. FedTA not only outperforms existing FCL methods but also effectively preserves the relative positions of features, remaining unaffected by spatial and temporal changes.
- Score: 24.689188066180463
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Federated continual learning (FCL) allows each client to continually update its knowledge from task streams, enhancing the applicability of federated learning in real-world scenarios. However, FCL needs to address not only spatial data heterogeneity between clients but also temporal data heterogeneity between tasks. In this paper, empirical experiments demonstrate that such input-level heterogeneity significantly affects the model's internal parameters and outputs, leading to severe spatial-temporal catastrophic forgetting of local and previous knowledge. To this end, we propose Federated Tail Anchor (FedTA) to mix trainable Tail Anchor with the frozen output features to adjust their position in the feature space, thereby overcoming parameter-forgetting and output-forgetting. Moreover, three novel components are also included in FedTA: Input Enhancement for improving the performance of pre-trained models on downstream tasks; Selective Input Knowledge Fusion for fusion of heterogeneous local knowledge on the server side; and Best Global Prototype Selection for finding the best anchor point for each class in the feature space. Extensive experiments demonstrate that FedTA not only outperforms existing FCL methods but also effectively preserves the relative positions of features, remaining unaffected by spatial and temporal changes.
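The core mechanism here, mixing a trainable tail anchor with frozen output features, is compact enough to sketch. Below is a minimal illustration in PyTorch, assuming one trainable anchor per class and a fixed convex mixing weight; the module name `TailAnchor`, the weight `lam`, and the per-class anchor layout are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class TailAnchor(nn.Module):
    """Hypothetical tail-anchor module: mixes a trainable per-class anchor
    with frozen backbone features to reposition them in feature space."""

    def __init__(self, feat_dim: int, num_classes: int, lam: float = 0.5):
        super().__init__()
        # One trainable anchor vector per class (an assumption).
        self.anchors = nn.Parameter(torch.zeros(num_classes, feat_dim))
        self.lam = lam  # assumed fixed mixing weight

    def forward(self, frozen_feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # frozen_feats: (B, D) output features of a frozen pre-trained backbone.
        # Only the anchors receive gradients; the backbone stays untouched.
        return (1.0 - self.lam) * frozen_feats.detach() + self.lam * self.anchors[labels]

# Usage sketch: train only the anchors (and a classifier head) on local tasks.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256)).requires_grad_(False)
tail = TailAnchor(feat_dim=256, num_classes=10)
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
mixed = tail(backbone(x), y)  # (8, 256) repositioned features
```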
Related papers
- FedPall: Prototype-based Adversarial and Collaborative Learning for Federated Learning with Feature Drift [29.2377620193847]
Federated learning (FL) enables collaborative training of a global model on a centralized server with data from multiple parties.
We propose FedPall, an FL framework that uses prototype-based adversarial learning to unify feature spaces and collaborative learning to reinforce class information within the features.
Evaluation results on three representative feature-drifted datasets demonstrate FedPall's consistently superior classification performance under feature drift in the FL scenario.
arXiv Detail & Related papers (2025-07-07T08:58:39Z)
- STSA: Federated Class-Incremental Learning via Spatial-Temporal Statistics Aggregation [64.48462746540156]
Federated Class-Incremental Learning (FCIL) enables class-incremental learning from distributed data.
We propose a novel approach that aggregates feature statistics both spatially (across clients) and temporally (across stages).
We show that our method outperforms state-of-the-art FCIL methods in terms of performance, flexibility, and both communication and computation efficiency.
arXiv Detail & Related papers (2025-06-02T05:14:57Z)
- FedSKC: Federated Learning with Non-IID Data via Structural Knowledge Collaboration [43.25824181502647]
The key idea of FedSKC is to extract and transfer domain preferences from inter-client data distributions.
FedSKC comprises three components: contrastive learning, global discrepancy aggregation, and global period review.
arXiv Detail & Related papers (2025-05-25T05:24:49Z)
- FedLF: Adaptive Logit Adjustment and Feature Optimization in Federated Long-Tailed Learning [5.23984567704876]
Federated learning offers a privacy-preserving paradigm for distributed machine learning.
Traditional approaches fail to address class-wise bias under globally long-tailed data.
The new method, FedLF, introduces three modifications in the local training phase: adaptive logit adjustment (sketched below), continuous class-centred optimization, and feature decorrelation.
arXiv Detail & Related papers (2024-09-18T16:25:29Z)
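FedLF's adaptive logit adjustment is not spelled out in the summary above. The sketch below shows the generic logit-adjustment technique such methods build on: offsetting the logits by the log of the local class prior so head classes are discounted during training. The temperature `tau` and the count-based prior are illustrative assumptions, not FedLF's exact rule.

```python
import torch
import torch.nn.functional as F

def logit_adjusted_loss(logits: torch.Tensor, labels: torch.Tensor,
                        class_counts: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Generic logit adjustment for long-tailed data (a sketch, not FedLF's rule).

    Adding tau * log(prior) to the logits discounts head classes during training,
    counteracting class-wise bias from a long-tailed label distribution."""
    prior = class_counts.float() / class_counts.sum()
    adjusted = logits + tau * torch.log(prior + 1e-12)  # broadcast over the batch
    return F.cross_entropy(adjusted, labels)

# Usage sketch with a skewed local label distribution.
counts = torch.tensor([500, 100, 10])        # long-tailed class counts
logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
loss = logit_adjusted_loss(logits, labels, counts)
```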
- Tackling Feature-Classifier Mismatch in Federated Learning via Prompt-Driven Feature Transformation [12.19025665853089]
In traditional Federated Learning approaches, the global model underperforms when faced with data heterogeneity.
We propose a new PFL framework called FedPFT to address the mismatch problem while enhancing the quality of the feature extractor.
Our experiments demonstrate that FedPFT outperforms state-of-the-art methods by up to 7.08%.
arXiv Detail & Related papers (2024-07-23T02:52:52Z)
- Federated Learning under Partially Class-Disjoint Data via Manifold Reshaping [64.58402571292723]
We propose a manifold reshaping approach called FedMR to calibrate the feature space of local training.
We conduct extensive experiments on a range of datasets to demonstrate that our FedMR achieves much higher accuracy and better communication efficiency.
arXiv Detail & Related papers (2024-05-29T10:56:13Z)
- Decoupled Federated Learning on Long-Tailed and Non-IID Data with Feature Statistics [20.781607752797445]
We propose a two-stage Decoupled Federated learning framework using Feature Statistics (DFL-FS).
In the first stage, the server estimates each client's class coverage distribution by clustering masked local feature statistics (sketched below).
In the second stage, DFL-FS employs federated feature regeneration based on global feature statistics to enhance the model's adaptability to long-tailed data distributions.
arXiv Detail & Related papers (2024-03-13T09:24:59Z)
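A minimal sketch of the first stage described above, under the assumption of a simple protocol: clients upload zero-filled per-class feature means together with a class-presence mask, and the server clusters the masked statistics with k-means to group clients by class coverage. The statistics used and the clustering procedure in DFL-FS itself may differ.

```python
import numpy as np
from sklearn.cluster import KMeans

def estimate_class_coverage(per_class_means: np.ndarray, masks: np.ndarray,
                            n_groups: int = 3, seed: int = 0) -> np.ndarray:
    """Sketch of stage one: cluster masked local feature statistics to estimate
    which classes each client covers (assumed protocol; details may differ).

    per_class_means: (n_clients, n_classes, feat_dim) local per-class feature
                     means, zero-filled where a class is absent.
    masks:           (n_clients, n_classes) 1.0 if the client holds that class.
    """
    n_clients = per_class_means.shape[0]
    # Zero out absent classes and flatten to one statistics vector per client.
    stats = (per_class_means * masks[:, :, None]).reshape(n_clients, -1)
    km = KMeans(n_clusters=n_groups, random_state=seed, n_init=10)
    return km.fit_predict(stats)  # same group => similar class coverage

# Usage sketch: 12 clients, 5 classes, 16-dimensional features.
rng = np.random.default_rng(0)
means = rng.normal(size=(12, 5, 16))
masks = (rng.random((12, 5)) > 0.5).astype(float)
print(estimate_class_coverage(means, masks))
```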
- FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under diverse sources of heterogeneity.
It achieves substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z)
- FedFed: Feature Distillation against Data Heterogeneity in Federated Learning [88.36513907827552]
Federated learning (FL) typically faces data heterogeneity, i.e., distribution shifting among clients.
We propose a novel approach called Federated Feature distillation (FedFed).
FedFed partitions data into performance-sensitive features (i.e., greatly contributing to model performance) and performance-robust features (i.e., contributing little to model performance).
Comprehensive experiments demonstrate the efficacy of FedFed in promoting model performance.
arXiv Detail & Related papers (2023-10-08T09:00:59Z)
- Neural Collapse Inspired Federated Learning with Non-iid Data [31.576588815816095]
Non-independent and identically distributed (non-iid) characteristics cause significant differences in local updates and affect the performance of the central server.
Inspired by the phenomenon of neural collapse, we force each client to be optimized toward an optimal global structure for classification.
Our method improves performance with faster convergence on datasets of different sizes.
arXiv Detail & Related papers (2023-03-27T05:29:53Z)
- FedFA: Federated Learning with Feature Anchors to Align Features and Classifiers for Heterogeneous Data [8.677832361022809]
Federated learning allows multiple clients to collaboratively train a model without exchanging their data.
Common solutions involve an auxiliary loss to regularize weight divergence or feature inconsistency during local training.
We propose a novel framework named Federated learning with Feature Anchors (FedFA).
arXiv Detail & Related papers (2022-11-17T02:27:44Z)
- FedFM: Anchor-based Feature Matching for Data Heterogeneity in Federated Learning [91.74206675452888]
We propose a novel method, FedFM, which guides each client's features to match shared category-wise anchors (see the sketch below).
To achieve higher efficiency and flexibility, we propose a FedFM variant, called FedFM-Lite, in which clients communicate with the server using fewer synchronization rounds and lower communication bandwidth costs.
arXiv Detail & Related papers (2022-10-14T08:11:34Z)
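FedFA and FedFM above share the idea of pulling each client's features toward shared category-wise anchors. The following is a minimal sketch of that anchor-matching loss, using a squared-distance pull as an assumed form; the papers' exact losses and anchor-update rules may differ.

```python
import torch

def anchor_matching_loss(features: torch.Tensor, labels: torch.Tensor,
                         global_anchors: torch.Tensor) -> torch.Tensor:
    """Assumed form of category-wise anchor matching: pull each local feature
    toward the server-shared anchor of its class so that clients' feature
    spaces stay aligned despite heterogeneous local data."""
    return ((features - global_anchors[labels]) ** 2).sum(dim=1).mean()

# Usage sketch: the auxiliary loss is added to the usual local objective.
anchors = torch.randn(10, 256)   # shared category-wise anchors from the server
feats = torch.randn(8, 256)
labels = torch.randint(0, 10, (8,))
aux = anchor_matching_loss(feats, labels, anchors)
```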
- Rethinking Data Heterogeneity in Federated Learning: Introducing a New Notion and Standard Benchmarks [65.34113135080105]
We show that data heterogeneity in current setups is not necessarily a problem and can in fact be beneficial for the FL participants.
Our observations are intuitive.
Our code is available at https://github.com/MMorafah/FL-SC-NIID.
arXiv Detail & Related papers (2022-09-30T17:15:19Z)
- FedDC: Federated Learning with Non-IID Data via Local Drift Decoupling and Correction [48.85303253333453]
Federated learning (FL) allows multiple clients to collectively train a high-performance global model without sharing their private data.
We propose a novel federated learning algorithm with local drift decoupling and correction (FedDC).
FedDC introduces only lightweight modifications in the local training phase: each client uses an auxiliary local drift variable to track the gap between its local model parameters and the global model parameters (sketched below).
Experimental results and analysis demonstrate that FedDC yields faster convergence and better performance on various image classification tasks.
arXiv Detail & Related papers (2022-03-22T14:06:26Z)
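Finally, the drift-tracking idea in FedDC, as summarized above, admits a short sketch. The version below assumes a quadratic penalty pulling the corrected parameters (local plus drift) toward the global weights, with the coefficient `lam` and the drift update order chosen for illustration rather than taken from the authors' exact algorithm.

```python
import torch

def drift_corrected_step(local_params, global_params, drift, grads,
                         lr: float = 0.1, lam: float = 0.01):
    """One hypothetical local update with drift correction (assumed form).

    The auxiliary variable `drift` tracks the gap between local and global
    parameters; the lam-term pulls (w + h) back toward the global weights."""
    updated = []
    for w, w_g, h, g in zip(local_params, global_params, drift, grads):
        updated.append(w - lr * (g + lam * (w + h - w_g)))
    return updated

def update_drift(drift, local_params, global_params):
    # After local training, accumulate the new local-global gap into the drift.
    return [h + (w - w_g) for h, w, w_g in zip(drift, local_params, global_params)]

# Usage sketch with a single-tensor "model".
w_g = [torch.zeros(4)]                      # global parameters
w, h = [torch.zeros(4)], [torch.zeros(4)]   # local parameters and drift
g = [torch.ones(4)]                         # pretend task gradient
w = drift_corrected_step(w, w_g, h, g)
h = update_drift(h, w, w_g)
```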