A Vertical Federated Learning Framework for Graph Convolutional Network
- URL: http://arxiv.org/abs/2106.11593v1
- Date: Tue, 22 Jun 2021 07:57:46 GMT
- Title: A Vertical Federated Learning Framework for Graph Convolutional Network
- Authors: Xiang Ni, Xiaolong Xu, Lingjuan Lyu, Changhua Meng, Weiqiang Wang
- Abstract summary: FedVGCN is a federated learning paradigm for the privacy-preserving node classification task under a vertically partitioned data setting.
In each iteration of the training process, the two parties transfer intermediate results to each other under homomorphic encryption.
We conduct experiments on benchmark data, and the results demonstrate the effectiveness of FedVGCN in the case of GraphSage.
- Score: 12.684113617570643
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Recently, Graph Neural Network (GNN) has achieved remarkable success in
various real-world problems on graph data. However, in most industries, data
exists in the form of isolated islands, and data privacy and security are
also important concerns. In this paper, we propose FedVGCN, a federated GCN
learning paradigm for the privacy-preserving node classification task under a
vertically partitioned data setting, which can be generalized to existing GCN
models. Specifically, we split the computation graph data into two parts. For
each iteration of the training process, the two parties transfer intermediate
results to each other under homomorphic encryption. We conduct experiments on
benchmark data, and the results demonstrate the effectiveness of FedVGCN in
the case of GraphSage.
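To make the split-training idea concrete, below is a minimal, illustrative sketch of one forward step of a single GCN layer vertically partitioned between two parties. The party names, the `he_encrypt`/`he_decrypt` stubs, and the shared normalized adjacency are assumptions introduced for illustration only; the paper's actual protocol exchanges intermediate results as ciphertexts under homomorphic encryption and splits the computation graph itself, which this sketch does not implement.

```python
import numpy as np

# Hypothetical two-party vertical split: both parties hold features for the
# same node set, but disjoint feature columns. Party names and the
# he_encrypt / he_decrypt stubs are illustrative assumptions, not the
# paper's implementation; FedVGCN exchanges these intermediate results as
# ciphertexts under homomorphic encryption.

rng = np.random.default_rng(0)
n_nodes, d_a, d_b, d_hidden = 8, 4, 3, 5

# Toy symmetric adjacency with self-loops, then the symmetric normalization
# A_hat = D^{-1/2} (A + I) D^{-1/2} used by a standard GCN layer.
A = (rng.random((n_nodes, n_nodes)) < 0.3).astype(float)
A = np.maximum(A, A.T) + np.eye(n_nodes)
d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = d_inv_sqrt @ A @ d_inv_sqrt

X_a = rng.standard_normal((n_nodes, d_a))          # Party A's private feature columns
X_b = rng.standard_normal((n_nodes, d_b))          # Party B's private feature columns
W_a = 0.1 * rng.standard_normal((d_a, d_hidden))   # each party keeps its own weights
W_b = 0.1 * rng.standard_normal((d_b, d_hidden))

def he_encrypt(x):
    # Placeholder for an additively homomorphic scheme (e.g., Paillier).
    return x

def he_decrypt(x):
    return x

# Column-wise split of one GCN layer:
#   A_hat @ [X_a | X_b] @ [[W_a], [W_b]] = A_hat @ X_a @ W_a + A_hat @ X_b @ W_b,
# so each party computes only the partial term over features it already owns,
# and only the (encrypted) intermediate results cross the party boundary.
Z_a = A_hat @ X_a @ W_a        # computed locally by Party A
Z_b = A_hat @ X_b @ W_b        # computed locally by Party B
Z = he_decrypt(he_encrypt(Z_a) + he_encrypt(Z_b))
H = np.maximum(Z, 0.0)         # ReLU; the label-holding party continues from here
print(H.shape)                 # (8, 5)
```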
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D²PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure, but also on a global graph that encodes global semantic similarities.
arXiv Detail & Related papers (2023-05-29T04:51:09Z)
- Federated Learning over Coupled Graphs [39.86903030911785]
Federated Learning (FL) has been proposed to solve the data isolation issue, mainly for Euclidean data.
We propose a novel FL framework for graph data, FedCog, to efficiently handle coupled graphs, which are a kind of distributed graph data.
arXiv Detail & Related papers (2023-01-26T13:43:26Z)
- FedEgo: Privacy-preserving Personalized Federated Graph Learning with Ego-graphs [22.649780281947837]
In some practical scenarios, graph data are stored separately in multiple distributed parties, which may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
arXiv Detail & Related papers (2022-08-29T15:47:36Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- FedGraph: Federated Graph Learning with Intelligent Sampling [7.798227884125872]
Federated learning has attracted much research attention due to its privacy protection in distributed machine learning.
Existing work on federated learning mainly focuses on Convolutional Neural Networks (CNNs), which cannot efficiently handle the graph data that are popular in many applications.
In this paper, we propose FedGraph for federated graph learning among multiple computing clients, each of which holds a subgraph.
arXiv Detail & Related papers (2021-11-02T04:58:03Z)
- Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning [64.98816284854067]
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
arXiv Detail & Related papers (2020-09-15T13:59:28Z)
- Simple and Deep Graph Convolutional Networks [63.76221532439285]
Graph convolutional networks (GCNs) are a powerful deep learning approach for graph-structured data.
Despite their success, most of the current GCN models are shallow due to the over-smoothing problem.
We propose GCNII, an extension of the vanilla GCN model with two simple yet effective techniques.
arXiv Detail & Related papers (2020-07-04T16:18:06Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Vertically Federated Graph Neural Network for Privacy-Preserving Node Classification [39.53937689989282]
VFGNN is a learning paradigm for the privacy-preserving node classification task under a vertically partitioned data setting.
We leave the private-data-related computations on the data holders and delegate the rest of the computation to a semi-honest server.
We conduct experiments on three benchmarks and the results demonstrate the effectiveness of VFGNN.
arXiv Detail & Related papers (2020-05-25T03:12:18Z)
- An Uncoupled Training Architecture for Large Graph Learning [20.784230322205232]
We present Node2Grids, a flexible uncoupled training framework for embedding graph data into grid-like data.
By ranking each node's influence by its degree, Node2Grids selects the most influential first-order as well as second-order neighbors along with the central node's fused information.
For further improving the efficiency of downstream tasks, a simple CNN-based neural network is employed to capture the significant information from the mapped grid-like data.
arXiv Detail & Related papers (2020-03-21T11:49:16Z)