Toward Personalized Federated Node Classification in One-shot Communication
- URL: http://arxiv.org/abs/2411.11304v2
- Date: Sun, 24 Nov 2024 04:14:19 GMT
- Title: Toward Personalized Federated Node Classification in One-shot Communication
- Authors: Guochen Yan, Xunkai Li, Luyuan Xie, Wentao Zhang, Qingni Shen, Yuejian Fang, Zhonghai Wu
- Abstract summary: We propose a one-shot personalized Federated Graph Learning method for node classification.
Our method estimates and aggregates class-wise feature distribution statistics to construct a global pseudo-graph on the server.
Our method significantly outperforms state-of-the-art baselines across various settings.
- Score: 27.325478113745206
- License:
- Abstract: Federated Graph Learning (FGL) has emerged as a promising paradigm for breaking data silos in distributed private graph data management. In practical scenarios involving complex and heterogeneous distributed graph data, personalized Federated Graph Learning (pFGL) aims to enhance model utility by training personalized models tailored to individual client needs, rather than relying on a universal global model. However, existing pFGL methods often require numerous communication rounds under heterogeneous client graphs, leading to significant security concerns and communication overhead. While One-shot Federated Learning (OFL) addresses these issues by enabling collaboration in a single round, existing OFL methods are designed for image-based tasks and are ineffective for graph data, leaving a critical gap in the field. Additionally, personalized models often suffer from bias, failing to generalize effectively to minority data. To address these challenges, we propose the first one-shot personalized federated graph learning method for node classification, compatible with the Secure Aggregation protocol for privacy preservation. Specifically, for effective graph learning in a single communication round, our method estimates and aggregates class-wise feature distribution statistics to construct a global pseudo-graph on the server, facilitating the training of a global graph model. Moreover, to mitigate bias, we introduce a two-stage personalized training approach that adaptively balances local personal information and global insights from the pseudo-graph, improving both personalization and generalization. Extensive experiments conducted on 8 multi-scale graph datasets demonstrate that our method significantly outperforms state-of-the-art baselines across various settings.
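The abstract's core mechanism, aggregating additive class-wise feature statistics in one round and sampling a server-side pseudo-graph from them, can be illustrated with a rough sketch. This is not the authors' implementation: the function names (`local_class_stats`, `build_pseudo_graph`), the Gaussian sampling of pseudo node features, the k-NN adjacency, and parameters such as `nodes_per_class` and `k` are all illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): one-shot aggregation of class-wise
# feature statistics into a server-side pseudo-graph, assuming each client can
# compute node feature vectors locally (e.g., from a local encoder).
import numpy as np

def local_class_stats(features, labels, num_classes):
    """Client side: per-class count, feature sum, and squared sum.
    These are plain sums, so they can be combined by sum-based Secure Aggregation."""
    d = features.shape[1]
    counts = np.zeros(num_classes)
    sums = np.zeros((num_classes, d))
    sq_sums = np.zeros((num_classes, d))
    for c in range(num_classes):
        mask = labels == c
        counts[c] = mask.sum()
        sums[c] = features[mask].sum(axis=0)
        sq_sums[c] = (features[mask] ** 2).sum(axis=0)
    return counts, sums, sq_sums

def build_pseudo_graph(client_stats, nodes_per_class=50, k=5, rng=None):
    """Server side: aggregate the summed statistics, recover per-class mean and
    variance, sample pseudo node features, and connect them with a k-NN adjacency."""
    rng = rng or np.random.default_rng(0)
    counts = sum(s[0] for s in client_stats)
    sums = sum(s[1] for s in client_stats)
    sq_sums = sum(s[2] for s in client_stats)
    mean = sums / np.maximum(counts[:, None], 1)
    var = np.maximum(sq_sums / np.maximum(counts[:, None], 1) - mean ** 2, 1e-6)

    feats, labels = [], []
    for c in range(len(counts)):
        feats.append(rng.normal(mean[c], np.sqrt(var[c]),
                                size=(nodes_per_class, mean.shape[1])))
        labels.append(np.full(nodes_per_class, c))
    x = np.vstack(feats)
    y = np.concatenate(labels)

    # Simple k-NN adjacency over the pseudo features to give the pseudo-graph structure.
    dist = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    adj = np.zeros_like(dist)
    for i, neigh in enumerate(np.argsort(dist, axis=1)[:, :k]):
        adj[i, neigh] = 1.0
    adj = np.maximum(adj, adj.T)  # symmetrize
    return x, y, adj
```

Because the uploaded quantities are additive, any sum-based Secure Aggregation protocol can combine them so the server only sees the aggregate statistics, matching the one-shot, privacy-preserving setting the abstract describes. The two-stage personalized training (balancing each client's local graph against the pseudo-graph-trained global model) is not covered by this sketch.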
Related papers
- Personalized federated learning based on feature fusion [2.943623084019036]
Federated learning enables distributed clients to collaborate on training while storing their data locally to protect client privacy.
We propose a personalized federated learning approach called pFedPM.
In our process, we replace traditional gradient uploading with feature uploading, which helps reduce communication costs and allows for heterogeneous client models.
arXiv Detail & Related papers (2024-06-24T12:16:51Z)
- FedSheafHN: Personalized Federated Learning on Graph-structured Data [22.825083541211168]
We propose a model called FedSheafHN, which embeds each client's local subgraph into a server-constructed collaboration graph.
Our model improves the integration and interpretation of complex client characteristics.
It also achieves fast model convergence and generalizes effectively to new clients.
arXiv Detail & Related papers (2024-05-25T04:51:41Z)
- APGL4SR: A Generic Framework with Adaptive and Personalized Global Collaborative Information in Sequential Recommendation [86.29366168836141]
We propose a graph-driven framework, named Adaptive and Personalized Graph Learning for Sequential Recommendation (APGL4SR).
APGL4SR incorporates adaptive and personalized global collaborative information into sequential recommendation systems.
As a generic framework, APGL4SR can outperform other baselines by significant margins.
arXiv Detail & Related papers (2023-11-06T01:33:24Z)
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting of universal and generalizable structure learning for GNNs.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z)
- Distributed Learning over Networks with Graph-Attention-Based Personalization [49.90052709285814]
We propose a graph-based personalized algorithm (GATTA) for distributed deep learning.
In particular, the personalized model in each agent is composed of a global part and a node-specific part.
By treating each agent as one node in a graph and the node-specific parameters as its features, the benefits of the graph attention mechanism can be inherited.
arXiv Detail & Related papers (2023-05-22T13:48:30Z)
- Graph Learning Across Data Silos [12.343382413705394]
We consider the problem of inferring graph topology from smooth graph signals in a novel but practical scenario.
Data are located on distributed clients and are prohibited from leaving them due to factors such as privacy concerns.
We propose an auto-weighted multiple graph learning model to jointly learn a personalized graph for each local client and a single consensus graph for all clients.
arXiv Detail & Related papers (2023-01-17T02:14:57Z)
- FedEgo: Privacy-preserving Personalized Federated Graph Learning with Ego-graphs [22.649780281947837]
In some practical scenarios, graph data are stored separately in multiple distributed parties, which may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
arXiv Detail & Related papers (2022-08-29T15:47:36Z)
- FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning [87.08902493524556]
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape from original data.
arXiv Detail & Related papers (2022-07-20T04:55:18Z)
- Privatized Graph Federated Learning [57.14673504239551]
We introduce graph federated learning, which consists of multiple units connected by a graph.
We show how graph homomorphic perturbations can be used to ensure the algorithm is differentially private.
arXiv Detail & Related papers (2022-03-14T13:48:23Z)
- Tackling the Local Bias in Federated Graph Learning [48.887310972708036]
In federated graph learning (FGL), a global graph is distributed across different clients, where each client holds a subgraph.
Existing FGL methods fail to effectively utilize cross-client edges, losing structural information during training.
We propose a novel FGL framework to make the local models similar to the model trained in a centralized setting.
arXiv Detail & Related papers (2021-10-22T08:22:36Z)
- FedGL: Federated Graph Learning Framework with Global Self-Supervision [22.124339267195822]
FedGL is capable of obtaining a high-quality global graph model while protecting data privacy.
The global self-supervision enables the information of each client to flow and share in a privacy-preserving manner.
arXiv Detail & Related papers (2021-05-07T11:27:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.