Vertically Federated Graph Neural Network for Privacy-Preserving Node
Classification
- URL: http://arxiv.org/abs/2005.11903v3
- Date: Mon, 25 Apr 2022 03:04:30 GMT
- Title: Vertically Federated Graph Neural Network for Privacy-Preserving Node
Classification
- Authors: Chaochao Chen, Jun Zhou, Longfei Zheng, Huiwen Wu, Lingjuan Lyu, Jia
Wu, Bingzhe Wu, Ziqi Liu, Li Wang, Xiaolin Zheng
- Abstract summary: VFGNN is a learning paradigm for the privacy-preserving node classification task under the vertically partitioned data setting.
We leave the private data related computations on data holders, and delegate the rest of computations to a semi-honest server.
We conduct experiments on three benchmarks and the results demonstrate the effectiveness of VFGNN.
- Score: 39.53937689989282
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, Graph Neural Network (GNN) has achieved remarkable progress in
various real-world tasks on graph data, which consists of node features and
adjacency information between different nodes. High-performance GNN models
typically depend on both rich features and complete edge information in a graph.
However, such information could possibly be isolated by different data holders
in practice, which is the so-called data isolation problem. To solve this
problem, in this paper, we propose VFGNN, a federated GNN learning paradigm for
the privacy-preserving node classification task under the vertically partitioned
data setting, which can be generalized to existing GNN models. Specifically, we
split the computation graph into two parts. We leave the private data (i.e.,
features, edges, and labels) related computations on data holders, and delegate
the rest of computations to a semi-honest server. We also propose to apply
differential privacy to prevent potential information leakage from the server.
We conduct experiments on three benchmarks and the results demonstrate the
effectiveness of VFGNN.
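The abstract describes a split computation graph: each data holder computes over its private features locally, perturbs the result with differential-privacy noise, and a semi-honest server handles the remaining computation. A minimal sketch of that flow is below; it is illustrative only (the function names, the single linear layer, the mean aggregation, and the Laplace mechanism parameters are assumptions, not the authors' actual protocol).

```python
# Illustrative sketch (not the authors' code): two data holders compute local
# embeddings from their private feature slices, add Laplace noise for
# differential privacy, and a semi-honest server combines the noisy results.
import random
import math

def laplace_noise(scale):
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def local_embedding(features, weights):
    # Each data holder's private computation: a single linear layer here.
    return [sum(f * w for f, w in zip(features, row)) for row in weights]

def perturb(embedding, epsilon, sensitivity=1.0):
    # Laplace mechanism: add noise with scale sensitivity / epsilon
    # before the embedding leaves the data holder.
    scale = sensitivity / epsilon
    return [v + laplace_noise(scale) for v in embedding]

def server_combine(*noisy_embeddings):
    # The semi-honest server sees only noisy embeddings; it aggregates
    # them by element-wise mean and would run the rest of the model.
    dim = len(noisy_embeddings[0])
    return [sum(e[i] for e in noisy_embeddings) / len(noisy_embeddings)
            for i in range(dim)]

# Two holders, each owning a different feature slice of the same node.
random.seed(0)
h1 = perturb(local_embedding([0.2, 0.5], [[1.0, -1.0], [0.5, 0.5]]), epsilon=1.0)
h2 = perturb(local_embedding([0.1, 0.9], [[2.0, 0.0], [0.0, 2.0]]), epsilon=1.0)
combined = server_combine(h1, h2)
print(len(combined))
```

The key design point mirrored here is that raw features, edges, and labels never leave the data holders; only perturbed intermediate values reach the server.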
Related papers
- GNN-MultiFix: Addressing the pitfalls for GNNs for multi-label node classification [1.857645719601748]
Graph neural networks (GNNs) have emerged as powerful models for learning representations of graph data.
We show that even the most expressive GNN may fail to learn in the absence of node attributes and without using explicit label information as input.
We propose a straightforward approach, referred to as GNN-MultiFix, that integrates the feature, label, and positional information of a node.
arXiv Detail & Related papers (2024-11-21T12:59:39Z)
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- SplitGNN: Splitting GNN for Node Classification with Heterogeneous Attention [29.307331758493323]
We propose a split learning-based graph neural network (SplitGNN) for graph computation.
Our SplitGNN allows the isolated heterogeneous neighborhood to be collaboratively utilized.
We demonstrate the effectiveness of our SplitGNN on node classification tasks for two standard public datasets and the real-world dataset.
arXiv Detail & Related papers (2023-01-27T12:08:44Z)
- Hierarchical Model Selection for Graph Neural Networks [0.0]
We propose a hierarchical model selection framework (HMSF) that selects an appropriate graph neural network (GNN) model by analyzing the indicators of each graph data.
In the experiment, we show that the model selected by our HMSF achieves high performance on node classification for various types of graph data.
arXiv Detail & Related papers (2022-12-01T22:31:21Z)
- A Robust Stacking Framework for Training Deep Graph Models with Multifaceted Node Features [61.92791503017341]
Graph Neural Networks (GNNs) with numerical node features and graph structure as inputs have demonstrated superior performance on various supervised learning tasks with graph data.
However, the models that perform best on such data in standard supervised learning settings with IID (non-graph) data are not easily incorporated into a GNN.
Here we propose a robust stacking framework that fuses graph-aware propagation with arbitrary models intended for IID data.
arXiv Detail & Related papers (2022-06-16T22:46:33Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which shows that our model can effectively improve the performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- A Vertical Federated Learning Framework for Graph Convolutional Network [12.684113617570643]
FedVGCN is a learning paradigm for the privacy-preserving node classification task under the vertically partitioned data setting.
In each training iteration, the two parties transfer intermediate results to each other under homomorphic encryption.
We conduct experiments on benchmark data and the results demonstrate the effectiveness of FedVGCN in the case of GraphSage.
arXiv Detail & Related papers (2021-06-22T07:57:46Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning techniques.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Locally Private Graph Neural Networks [12.473486843211573]
We study the problem of node data privacy, where graph nodes have potentially sensitive data that is kept private.
We develop a privacy-preserving, architecture-agnostic GNN learning algorithm with formal privacy guarantees.
Experiments conducted over real-world datasets demonstrate that our method can maintain a satisfying level of accuracy with low privacy loss.
arXiv Detail & Related papers (2020-06-09T22:36:06Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.