Curriculum Guided Personalized Subgraph Federated Learning
- URL: http://arxiv.org/abs/2509.00402v1
- Date: Sat, 30 Aug 2025 08:01:36 GMT
- Title: Curriculum Guided Personalized Subgraph Federated Learning
- Authors: Minku Kang, Hogun Park
- Abstract summary: Subgraph Federated Learning (FL) aims to train Graph Neural Networks (GNNs) across distributed private subgraphs. Weighted model aggregation personalizes each local GNN by assigning larger weights to parameters from clients with similar subgraph characteristics. We propose a novel personalized subgraph FL framework called Curriculum guided personalized sUbgraph Federated Learning (CUFL).
- Score: 8.721619913104899
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Subgraph Federated Learning (FL) aims to train Graph Neural Networks (GNNs) across distributed private subgraphs, but it suffers from severe data heterogeneity. To mitigate data heterogeneity, weighted model aggregation personalizes each local GNN by assigning larger weights to parameters from clients with similar subgraph characteristics inferred from their current model states. However, the sparse and biased subgraphs often trigger rapid overfitting, causing the estimated client similarity matrix to stagnate or even collapse. As a result, aggregation loses effectiveness as clients reinforce their own biases instead of exploiting diverse knowledge otherwise available. To this end, we propose a novel personalized subgraph FL framework called Curriculum guided personalized sUbgraph Federated Learning (CUFL). On the client side, CUFL adopts Curriculum Learning (CL) that adaptively selects edges for training according to their reconstruction scores, exposing each GNN first to easier, generic cross-client substructures and only later to harder, client-specific ones. This paced exposure prevents early overfitting to biased patterns and enables gradual personalization. By regulating personalization, the curriculum also reshapes server aggregation from exchanging generic knowledge to propagating client-specific knowledge. Further, CUFL improves weighted aggregation by estimating client similarity using fine-grained structural indicators reconstructed on a random reference graph. Extensive experiments on six benchmark datasets confirm that CUFL achieves superior performance compared to relevant baselines. Code is available at https://github.com/Kang-Min-Ku/CUFL.git.
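The abstract's client-side curriculum (exposing each GNN to easy, generic edges first and harder, client-specific ones later, paced by reconstruction scores) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the pacing schedule, the function names `pacing_fraction` and `select_edges`, and the convention that lower reconstruction scores mean easier edges are all assumptions for this example.

```python
# Hedged sketch of curriculum-paced edge selection, assuming each edge has a
# precomputed reconstruction score where a LOWER score marks an easier,
# more generic (cross-client) edge. The linear pacing schedule is illustrative.

def pacing_fraction(round_t, warmup_frac=0.3, total_rounds=100):
    """Fraction of edges exposed at round t: starts at warmup_frac and
    grows linearly to 1.0 by the final round."""
    frac = warmup_frac + (1.0 - warmup_frac) * round_t / total_rounds
    return min(1.0, frac)

def select_edges(edges, scores, round_t, total_rounds=100):
    """Keep only the easiest (lowest-score) fraction of edges this round."""
    k = max(1, int(pacing_fraction(round_t, total_rounds=total_rounds) * len(edges)))
    order = sorted(range(len(edges)), key=lambda i: scores[i])
    return [edges[i] for i in order[:k]]

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]
scores = [0.1, 0.9, 0.4, 0.7, 0.2]   # hypothetical reconstruction scores

early = select_edges(edges, scores, round_t=0)    # only the easiest edges
late = select_edges(edges, scores, round_t=100)   # full edge set
```

In this framing, early rounds train each local GNN only on generic substructures shared across clients, which is what keeps the estimated client similarity matrix from collapsing before personalization begins.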
Related papers
- Federated Learning with Profile Mapping under Distribution Shifts and Drifts [13.33580160559305]
Federated Learning (FL) enables decentralized model training across clients without sharing raw data. Existing methods often fail to address distribution shift across clients and distribution drift over time. We introduce Feroma, a novel FL framework that explicitly handles both distribution shift and drift without relying on client or cluster identity.
arXiv Detail & Related papers (2026-02-07T19:31:50Z) - Personalized Subgraph Federated Learning with Sheaf Collaboration [22.825083541211168]
FedSheafHN is a novel framework built on a sheaf collaboration mechanism to unify enhanced client descriptors with efficient personalized model generation. Specifically, FedSheafHN embeds each client's local subgraph into a server-constructed collaboration graph. It generates customized client models via a server-optimized hypernetwork.
arXiv Detail & Related papers (2025-08-19T08:52:50Z) - Personalized Subgraph Federated Learning with Differentiable Auxiliary Projections [14.636973991912113]
We introduce Federated learning with Auxiliary projections (FedAux). FedAux is a personalized subgraph FL framework that learns to align, compare, and aggregate heterogeneously distributed local models without sharing raw data or node embeddings. Empirical evaluations across diverse graph benchmarks demonstrate that FedAux substantially outperforms existing baselines in both accuracy and personalization performance.
arXiv Detail & Related papers (2025-05-29T09:17:49Z) - FedHERO: A Federated Learning Approach for Node Classification Task on Heterophilic Graphs [55.51300642911766]
Federated Graph Learning (FGL) empowers clients to collaboratively train Graph Neural Networks (GNNs) in a distributed manner. FGL methods usually require that the graph data owned by all clients is homophilic to ensure similar neighbor distribution patterns of nodes. We propose FedHERO, an FGL framework designed to harness and share insights from heterophilic graphs effectively.
arXiv Detail & Related papers (2025-04-29T22:23:35Z) - SAFL: Structure-Aware Personalized Federated Learning via Client-Specific Clustering and SCSI-Guided Model Pruning [11.0839911783691]
Federated Learning (FL) enables clients to collaboratively train machine learning models without sharing local data, preserving privacy in diverse environments. This paper introduces SAFL (Structure-Aware Federated Learning), a novel framework that enhances personalized federated learning through client-specific clustering and Similar Client Structure Information (SCSI)-guided model pruning.
arXiv Detail & Related papers (2025-01-30T08:59:44Z) - Federated Face Forgery Detection Learning with Personalized Representation [63.90408023506508]
Deep generator technology can produce high-quality fake videos that are indistinguishable, posing a serious social threat.
Traditional forgery detection methods train directly on centralized data.
The paper proposes a novel federated face forgery detection learning with personalized representation.
arXiv Detail & Related papers (2024-06-17T02:20:30Z) - Efficient Distribution Similarity Identification in Clustered Federated Learning via Principal Angles Between Client Data Subspaces [59.33965805898736]
Clustered federated learning has been shown to produce promising results by grouping clients into clusters.
Existing FL algorithms essentially try to group together clients with similar data distributions.
Prior FL algorithms attempt to infer these similarities indirectly during training.
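The principal-angles idea named in the title above can be sketched as follows: given orthonormal bases for two clients' data subspaces, the singular values of their cross-product matrix are the cosines of the principal angles. The data and function name here are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch: measuring how similar two subspaces are via principal angles.
# Small angles (near 0) mean nearly identical subspaces; angles near pi/2
# mean nearly orthogonal ones.
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians) between the column spaces of A and B."""
    Qa, _ = np.linalg.qr(A)  # orthonormal basis for span(A)
    Qb, _ = np.linalg.qr(B)  # orthonormal basis for span(B)
    sigma = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    sigma = np.clip(sigma, -1.0, 1.0)  # guard against rounding outside [-1, 1]
    return np.arccos(sigma)

# Synthetic check: span(Y) is orthogonal to span(X), so the angle is pi/2.
X = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
Y = np.array([[0.0], [0.0], [1.0]])
angles = principal_angles(X, Y)
```

In a clustered-FL setting such angles could serve as a distribution-similarity signal for grouping clients without comparing raw data directly.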
arXiv Detail & Related papers (2022-09-21T17:37:54Z) - Personalized Subgraph Federated Learning [56.52903162729729]
We introduce a new subgraph FL problem, personalized subgraph FL, which focuses on the joint improvement of the interrelated local GNNs.
We propose a novel framework, FEDerated Personalized sUBgraph learning (FED-PUB), to tackle it.
We validate our FED-PUB for its subgraph FL performance on six datasets, considering both non-overlapping and overlapping subgraphs.
arXiv Detail & Related papers (2022-06-21T09:02:53Z) - Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z) - Tackling the Local Bias in Federated Graph Learning [48.887310972708036]
In Federated graph learning (FGL), a global graph is distributed across different clients, where each client holds a subgraph.
Existing FGL methods fail to effectively utilize cross-client edges, losing structural information during the training.
We propose a novel FGL framework to make the local models similar to the model trained in a centralized setting.
arXiv Detail & Related papers (2021-10-22T08:22:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.