A Privacy-Preserving Subgraph-Level Federated Graph Neural Network via
Differential Privacy
- URL: http://arxiv.org/abs/2206.03492v1
- Date: Tue, 7 Jun 2022 08:14:45 GMT
- Title: A Privacy-Preserving Subgraph-Level Federated Graph Neural Network via
Differential Privacy
- Authors: Yeqing Qiu, Chenyu Huang, Jianzong Wang, Zhangcheng Huang, Jing Xiao
- Abstract summary: We propose DP-FedRec, a DP-based federated GNN that addresses the non-independent and identically distributed (non-IID) data problem.
DP is applied not only to the weights but also to the edges of the intersection graph from PSI, fully protecting clients' privacy.
The evaluation demonstrates that DP-FedRec achieves better performance with the graph extension and that DP introduces only a small computational overhead.
- Score: 23.05377582226823
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The federated graph neural network (GNN) has recently attracted a lot of
attention due to its wide real-world applications that comply with privacy
regulations. Among privacy-preserving technologies, differential privacy (DP)
is the most promising thanks to its effectiveness and light computational
overhead. However, DP-based federated GNNs have not been well investigated,
especially in the subgraph-level setting, such as recommendation systems. The
biggest challenge is to simultaneously guarantee privacy and handle
non-independent and identically distributed (non-IID) data in federated GNNs.
In this paper, we propose DP-FedRec, a DP-based federated GNN, to fill this
gap. Private Set Intersection (PSI) is leveraged to extend each client's local
graph and thus solve the non-IID problem. Most importantly, DP is applied not
only to the weights but also to the edges of the intersection graph from PSI,
fully protecting clients' privacy. The evaluation demonstrates that DP-FedRec
achieves better performance with the graph extension and that DP introduces
only a small computational overhead.
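The abstract's two-pronged protection, noising the model weights and perturbing the edges of the PSI intersection graph, can be sketched as below. This is a simplified illustration using the standard Gaussian mechanism and edge randomized response, not the authors' actual implementation; all function names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mechanism(weights, sensitivity, epsilon, delta):
    """Add calibrated Gaussian noise to model weights (classic analytic form)."""
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return weights + rng.normal(0.0, sigma, size=weights.shape)

def perturb_edges(edges, all_pairs, epsilon):
    """Randomized response over candidate pairs of the intersection graph:
    each pair's edge/non-edge status is flipped with probability 1/(1+e^eps)."""
    p_flip = 1.0 / (1.0 + np.exp(epsilon))
    out = set()
    for pair in all_pairs:
        is_edge = pair in edges
        if rng.random() < p_flip:
            is_edge = not is_edge  # flip with small probability for edge-level DP
        if is_edge:
            out.add(pair)
    return out
```

A client would noise its weight update before sending it to the server and perturb the intersection edges before extending its local graph; the real system must also account for the total privacy budget across rounds.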
Related papers
- How Private are DP-SGD Implementations? [61.19794019914523]
We show that there can be a substantial gap between the privacy analyses under the two types of batch sampling.
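The DP-SGD step whose privacy analysis this paper examines follows a fixed recipe: clip each per-example gradient, aggregate, and add Gaussian noise. A minimal numerical sketch (illustrative only; production implementations also handle the batch-sampling scheme, e.g. Poisson versus shuffled batches, whose accounting the paper compares):

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier):
    """One DP-SGD update: clip each per-example gradient to clip_norm in L2
    norm, sum the clipped gradients, add Gaussian noise scaled to clip_norm,
    and average over the batch."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)
```

Clipping bounds each example's influence (the sensitivity), which is what makes the Gaussian noise calibration valid; the batch-sampling distribution then determines how much privacy amplification the analysis may claim.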
arXiv Detail & Related papers (2024-03-26T13:02:43Z)
- Blink: Link Local Differential Privacy in Graph Neural Networks via Bayesian Estimation [79.64626707978418]
We propose using link local differential privacy over decentralized nodes to train graph neural networks.
Our approach spends the privacy budget separately on links and degrees of the graph for the server to better denoise the graph topology.
Our approach outperforms existing methods in terms of accuracy under varying privacy budgets.
arXiv Detail & Related papers (2023-09-06T17:53:31Z)
- Differentially Private Decoupled Graph Convolutions for Multigranular Topology Protection [38.96828804683783]
GNNs can inadvertently expose sensitive user information and interactions through their model predictions.
Applying standard DP approaches to GNNs directly is not advisable due to two main reasons.
We propose a new framework termed Graph Differential Privacy (GDP), specifically tailored to graph learning.
arXiv Detail & Related papers (2023-07-12T19:29:06Z)
- ProGAP: Progressive Graph Neural Networks with Differential Privacy Guarantees [8.79398901328539]
Graph Neural Networks (GNNs) have become a popular tool for learning on graphs, but their widespread use raises privacy concerns.
We propose a new differentially private GNN called ProGAP that uses a progressive training scheme to improve such accuracy-privacy trade-offs.
arXiv Detail & Related papers (2023-04-18T12:08:41Z)
- DPAR: Decoupled Graph Neural Networks with Node-Level Differential Privacy [30.15971370844865]
We aim to achieve node-level differential privacy (DP) for training GNNs so that a node and its edges are protected.
We propose a Decoupled GNN with Differentially Private Approximate Personalized PageRank (DPAR) for training GNNs with an enhanced privacy-utility tradeoff.
arXiv Detail & Related papers (2022-10-10T05:34:25Z)
- GAP: Differentially Private Graph Neural Networks with Aggregation Perturbation [19.247325210343035]
Graph Neural Networks (GNNs) are powerful models designed for graph data that learn node representation.
Recent studies have shown that GNNs can raise significant privacy concerns when graph data contain sensitive information.
We propose GAP, a novel differentially private GNN that safeguards privacy of nodes and edges.
arXiv Detail & Related papers (2022-03-02T08:58:07Z)
- Degree-Preserving Randomized Response for Graph Neural Networks under Local Differential Privacy [8.12606646175019]
We propose a novel LDP algorithm called the DPRR (Degree-Preserving Randomized Response) to provide LDP for edges in GNNs.
Our DPRR preserves each user's degree hence a graph structure while providing edge LDP.
We focus on graph classification as a task of GNNs and evaluate the DPRR using three social graph datasets.
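A simplified sketch of the idea behind degree-preserving randomized response (illustrative only, not the paper's exact DPRR algorithm): each user perturbs their adjacency bit-vector with Warner's randomized response for edge LDP, then subsamples the noisy edges back to their original degree.

```python
import numpy as np

rng = np.random.default_rng(0)

def edge_rr(adj_row, epsilon):
    """Warner randomized response on one user's adjacency bit-vector:
    each bit is reported truthfully with probability e^eps / (1 + e^eps)."""
    p_true = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    flip = rng.random(adj_row.shape) >= p_true
    return np.where(flip, 1 - adj_row, adj_row)

def degree_preserving_sample(noisy_row, target_degree):
    """Keep only target_degree of the reported edges, chosen uniformly at
    random, so the user's original degree is (approximately) preserved."""
    ones = np.flatnonzero(noisy_row)
    keep = rng.choice(ones, size=min(target_degree, ones.size), replace=False)
    out = np.zeros_like(noisy_row)
    out[keep] = 1
    return out
```

The subsampling step is what distinguishes this from plain randomized response: without it, a small epsilon turns a sparse adjacency vector into a dense one, destroying the graph structure that GNN message passing relies on.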
arXiv Detail & Related papers (2022-02-21T13:35:03Z)
- Privacy Amplification via Shuffling for Linear Contextual Bandits [51.94904361874446]
We study the contextual linear bandit problem with differential privacy (DP).
We show that it is possible to achieve a privacy/utility trade-off between JDP and LDP by leveraging the shuffle model of privacy while preserving local privacy.
arXiv Detail & Related papers (2021-12-11T15:23:28Z)
- Differentially Private Federated Bayesian Optimization with Distributed Exploration [48.9049546219643]
We introduce differential privacy (DP) into the training of deep neural networks through a general framework for adding DP to iterative algorithms.
We show that DP-FTS-DE achieves high utility (competitive performance) with a strong privacy guarantee.
We also use real-world experiments to show that DP-FTS-DE induces a trade-off between privacy and utility.
arXiv Detail & Related papers (2021-10-27T04:11:06Z)
- NeuralDP Differentially private neural networks by design [61.675604648670095]
We propose NeuralDP, a technique for privatising activations of some layer within a neural network.
We experimentally demonstrate on two datasets that our method offers substantially improved privacy-utility trade-offs compared to DP-SGD.
arXiv Detail & Related papers (2021-07-30T12:40:19Z)
- Smoothed Differential Privacy [55.415581832037084]
Differential privacy (DP) is a widely-accepted and widely-applied notion of privacy based on worst-case analysis.
In this paper, we propose a natural extension of DP following the worst average-case idea behind the celebrated smoothed analysis.
We prove that any discrete mechanism with sampling procedures is more private than what DP predicts, while many continuous mechanisms with sampling procedures are still non-private under smoothed DP.
arXiv Detail & Related papers (2021-07-04T06:55:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.