Towards Private Learning on Decentralized Graphs with Local Differential
Privacy
- URL: http://arxiv.org/abs/2201.09398v1
- Date: Sun, 23 Jan 2022 23:20:56 GMT
- Title: Towards Private Learning on Decentralized Graphs with Local Differential
Privacy
- Authors: Wanyu Lin, Baochun Li and Cong Wang
- Abstract summary: Solitude is a new privacy-preserving learning framework based on graph neural networks (GNNs).
Our new framework can simultaneously protect node feature privacy and edge privacy, and can be seamlessly integrated with any GNN while retaining privacy-utility guarantees.
- Score: 45.47822758278652
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many real-world networks are inherently decentralized. For example, in social
networks, each user maintains a local view of a social graph, such as a list of
friends and her profile. It is typical to collect these local views of social
graphs and conduct graph learning tasks. However, learning over graphs can
raise privacy concerns as these local views often contain sensitive
information.
In this paper, we seek to ensure private graph learning on a decentralized
network graph. Towards this objective, we propose Solitude, a new
privacy-preserving learning framework based on graph neural networks (GNNs),
with formal privacy guarantees based on edge local differential privacy. The
crux of Solitude is a set of new, carefully designed mechanisms that calibrate
the noise introduced into the decentralized graph collected from the users. The
calibration exploits intrinsic properties shared by many real-world graphs,
such as sparsity. Unlike existing work on locally private GNNs, our new
framework can simultaneously protect node feature privacy and edge privacy,
and can be seamlessly integrated with any GNN while retaining privacy-utility
guarantees. Extensive experiments on benchmark datasets show that Solitude
can retain the generalization capability of the learned GNN while preserving
the users' data privacy under given privacy budgets.
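The abstract does not spell out the perturbation mechanism, but edge local differential privacy is commonly instantiated with randomized response over each user's adjacency bits before the local views are collected. The sketch below is a minimal illustration of that standard baseline (the function name and structure are my own, not taken from the paper); a calibration step such as Solitude's would then denoise the graph assembled from these perturbed reports.

```python
import numpy as np

def randomized_response_bits(adj_row, epsilon, rng=None):
    """Perturb one user's adjacency-bit vector under epsilon edge-LDP.

    Each bit (edge / non-edge) is kept with probability
    p = e^eps / (e^eps + 1) and flipped otherwise -- the standard
    randomized-response mechanism for local differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    # Decide independently for each bit whether to flip it.
    flips = rng.random(adj_row.shape) >= p_keep
    return np.where(flips, 1 - adj_row, adj_row)
```

A small epsilon flips bits close to half the time (strong privacy, noisy graph), while a large epsilon leaves the adjacency vector nearly untouched.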
Related papers
- Privacy-Preserving Graph Embedding based on Local Differential Privacy [26.164722283887333]
We introduce a novel privacy-preserving graph embedding framework, named PrivGE, to protect node data privacy.
Specifically, we propose an LDP mechanism to obfuscate node data and utilize personalized PageRank as the proximity measure to learn node representations.
Experiments on several real-world graph datasets demonstrate that PrivGE achieves an optimal balance between privacy and utility.
arXiv Detail & Related papers (2023-10-17T08:06:08Z)
- Blink: Link Local Differential Privacy in Graph Neural Networks via Bayesian Estimation [79.64626707978418]
We propose using link local differential privacy over decentralized nodes to train graph neural networks.
Our approach spends the privacy budget separately on links and degrees of the graph for the server to better denoise the graph topology.
Our approach outperforms existing methods in terms of accuracy under varying privacy budgets.
arXiv Detail & Related papers (2023-09-06T17:53:31Z)
- A Survey on Privacy in Graph Neural Networks: Attacks, Preservation, and Applications [76.88662943995641]
Graph Neural Networks (GNNs) have gained significant attention owing to their ability to handle graph-structured data.
To address the associated privacy concerns, researchers have started to develop privacy-preserving GNNs.
Despite this progress, there is a lack of a comprehensive overview of the attacks and the techniques for preserving privacy in the graph domain.
arXiv Detail & Related papers (2023-08-31T00:31:08Z)
- Independent Distribution Regularization for Private Graph Embedding [55.24441467292359]
Graph embeddings are susceptible to attribute inference attacks, which allow attackers to infer private node attributes from the learned graph embeddings.
To address these concerns, privacy-preserving graph embedding methods have emerged.
We propose a novel approach called Private Variational Graph AutoEncoders (PVGAE) with the aid of independent distribution penalty as a regularization term.
arXiv Detail & Related papers (2023-08-16T13:32:43Z)
- Decentralized Graph Neural Network for Privacy-Preserving Recommendation [21.37022040905403]
This paper proposes DGREC, a novel decentralized GNN for privacy-preserving recommendations.
It includes three stages, i.e., graph construction, local gradient calculation, and global gradient passing.
We conduct extensive experiments on three public datasets to validate the consistent superiority of our framework.
arXiv Detail & Related papers (2023-08-15T23:56:44Z)
- Differentially Private Graph Neural Network with Importance-Grained Noise Adaption [6.319864669924721]
Graph Neural Networks (GNNs) with differential privacy have been proposed to preserve graph privacy when nodes represent personal and sensitive information.
We study the problem of importance-grained privacy, where nodes contain personal data that need to be kept private but are critical for training a GNN.
We propose NAP-GNN, a node-grained privacy-preserving GNN algorithm with privacy guarantees based on adaptive differential privacy to safeguard node information.
arXiv Detail & Related papers (2023-08-09T13:18:41Z)
- GAP: Differentially Private Graph Neural Networks with Aggregation Perturbation [19.247325210343035]
Graph Neural Networks (GNNs) are powerful models designed for graph data that learn node representation.
Recent studies have shown that GNNs can raise significant privacy concerns when graph data contain sensitive information.
We propose GAP, a novel differentially private GNN that safeguards privacy of nodes and edges.
arXiv Detail & Related papers (2022-03-02T08:58:07Z)
- Federated Social Recommendation with Graph Neural Network [69.36135187771929]
We propose alleviating this by fusing social information with user-item interactions, known as the social recommendation problem.
We devise a novel framework, Federated Social recommendation with Graph neural network (FeSoG).
arXiv Detail & Related papers (2021-11-21T09:41:39Z)
- Locally Private Graph Neural Networks [12.473486843211573]
We study the problem of node data privacy, where graph nodes have potentially sensitive data that is kept private.
We develop a privacy-preserving, architecture-agnostic GNN learning algorithm with formal privacy guarantees.
Experiments conducted over real-world datasets demonstrate that our method can maintain a satisfying level of accuracy with low privacy loss.
arXiv Detail & Related papers (2020-06-09T22:36:06Z)
- PGLP: Customizable and Rigorous Location Privacy through Policy Graph [68.3736286350014]
We propose a new location privacy notion called PGLP, which provides a rich interface to release private locations with customizable and rigorous privacy guarantee.
Specifically, we formalize a user's location privacy requirements using a location policy graph, which is expressive and customizable.
We design a private location trace release framework that pipelines the detection of location exposure, policy graph repair, and private trajectory release with customizable and rigorous location privacy.
arXiv Detail & Related papers (2020-05-04T04:25:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.