Privacy-Preserving Representation Learning on Graphs: A Mutual
Information Perspective
- URL: http://arxiv.org/abs/2107.01475v1
- Date: Sat, 3 Jul 2021 18:09:44 GMT
- Title: Privacy-Preserving Representation Learning on Graphs: A Mutual
Information Perspective
- Authors: Binghui Wang, Jiayi Guo, Ang Li, Yiran Chen, Hai Li
- Abstract summary: Existing representation learning methods on graphs could leak serious private information.
We propose a privacy-preserving representation learning framework on graphs from the mutual information perspective.
- Score: 44.53121844947585
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning with graphs has attracted significant attention recently. Existing
representation learning methods on graphs have achieved state-of-the-art
performance on various graph-related tasks such as node classification, link
prediction, etc. However, we observe that these methods could leak serious
private information. For instance, one can accurately infer the links (or node
identity) in a graph from a node classifier (or link predictor) trained on the
learnt node representations by existing methods. To address the issue, we
propose a privacy-preserving representation learning framework on graphs from
the \emph{mutual information} perspective. Specifically, our framework includes
a primary learning task and a privacy protection task, and we consider node
classification and link prediction as the two tasks of interest. Our goal is to
learn node representations such that they can be used to achieve high
performance for the primary learning task, while obtaining performance for the
privacy protection task close to random guessing. We formally formulate our
goal via mutual information objectives. However, it is intractable to compute
mutual information in practice. Then, we derive tractable variational bounds
for the mutual information terms, where each bound can be parameterized via a
neural network. Next, we train these parameterized neural networks to
approximate the true mutual information and learn privacy-preserving node
representations. We finally evaluate our framework on various graph datasets.
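To make the training recipe concrete, below is a minimal PyTorch sketch of the idea: the primary head's cross-entropy loss acts as a standard variational lower bound on the task mutual information, while an adversarial link predictor stands in for the privacy mutual-information term and the encoder is trained to push its predictions toward random guessing. The encoder, adversary, and loop names are illustrative, and the adversarial surrogate is an assumption, not the paper's exact variational bounds.

```python
# Sketch only: adversarial surrogate for the mutual-information objectives,
# not the authors' exact variational bounds.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Simple MLP encoder over node features; a GNN could be substituted."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim))
    def forward(self, x):
        return self.net(x)

class LinkAdversary(nn.Module):
    """Predicts whether an edge exists from a pair of node embeddings."""
    def __init__(self, hid_dim):
        super().__init__()
        self.net = nn.Linear(2 * hid_dim, 1)
    def forward(self, h, edge_pairs):
        z = torch.cat([h[edge_pairs[:, 0]], h[edge_pairs[:, 1]]], dim=-1)
        return self.net(z).squeeze(-1)

def train_step(enc, cls_head, adv, opt_main, opt_adv,
               x, y, edge_pairs, edge_labels, lam=1.0):
    # 1) Update the adversary on detached embeddings, so it tracks how much
    #    link information the current representations still leak.
    h = enc(x).detach()
    adv_loss = F.binary_cross_entropy_with_logits(adv(h, edge_pairs), edge_labels)
    opt_adv.zero_grad(); adv_loss.backward(); opt_adv.step()

    # 2) Update encoder + primary head: keep node classification accurate while
    #    driving the adversary's link predictions toward chance (0.5 targets).
    h = enc(x)
    task_loss = F.cross_entropy(cls_head(h), y)
    uniform = torch.full_like(edge_labels, 0.5)
    privacy_loss = F.binary_cross_entropy_with_logits(adv(h, edge_pairs), uniform)
    loss = task_loss + lam * privacy_loss
    opt_main.zero_grad(); loss.backward(); opt_main.step()
    return task_loss.item(), privacy_loss.item()
```

In this sketch, opt_main would optimize the encoder and classification head jointly and opt_adv only the adversary, so the two objectives are updated in alternation; the roles of node classification and link prediction can be swapped when link prediction is the primary task.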
Related papers
- Free Lunch for Privacy Preserving Distributed Graph Learning [1.8292714902548342]
We present a novel privacy-respecting framework for distributed graph learning and graph-based machine learning.
The framework aims to learn feature representations as well as distances without requiring access to the actual features, while preserving the original structural properties of the raw data.
arXiv Detail & Related papers (2023-05-18T10:41:21Z)
- Privacy-Preserved Neural Graph Similarity Learning [99.78599103903777]
We propose a novel Privacy-Preserving neural Graph Matching network model, named PPGM, for graph similarity learning.
To prevent reconstruction attacks, the proposed model does not communicate node-level representations between devices.
To alleviate attacks on graph properties, obfuscated features that mix information from both vectors are communicated instead.
arXiv Detail & Related papers (2022-10-21T04:38:25Z)
- Node Representation Learning in Graph via Node-to-Neighbourhood Mutual Information Maximization [27.701736055800314]
The key to learning informative node representations in graphs lies in how to gain contextual information from the neighbourhood.
We present a self-supervised node representation learning strategy via directly maximizing the mutual information between the hidden representations of nodes and their neighbourhood.
Our framework is optimized via a surrogate contrastive loss, where the positive selection underpins the quality and efficiency of representation learning.
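A generic InfoNCE-style loss illustrates this node-to-neighbourhood contrastive objective; the mean-pooled neighbourhood summary and dense adjacency input below are illustrative assumptions, not necessarily the paper's positive-selection or aggregation scheme.

```python
# Generic InfoNCE-style node-to-neighbourhood contrastive loss (a sketch).
# Each node's positive is a summary of its own neighbourhood; other nodes'
# neighbourhood summaries act as negatives.
import torch
import torch.nn.functional as F

def n2n_infonce(h, adj, temperature=0.2):
    """h: (N, d) node embeddings; adj: (N, N) dense 0/1 adjacency matrix."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
    nbr = (adj @ h) / deg                      # neighbourhood summaries, (N, d)
    z = F.normalize(h, dim=-1)
    s = F.normalize(nbr, dim=-1)
    logits = z @ s.t() / temperature           # (N, N) similarity matrix
    targets = torch.arange(h.size(0), device=h.device)
    return F.cross_entropy(logits, targets)    # diagonal entries are positives
```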
arXiv Detail & Related papers (2022-03-23T08:21:10Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We also provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Learnable Graph Matching: Incorporating Graph Partitioning with Deep Feature Learning for Multiple Object Tracking [58.30147362745852]
Data association across frames is at the core of the Multiple Object Tracking (MOT) task.
Existing methods mostly ignore the context information among tracklets and intra-frame detections.
We propose a novel learnable graph matching method to address these issues.
arXiv Detail & Related papers (2021-03-30T08:58:45Z)
- Edge-Featured Graph Attention Network [7.0629162428807115]
We present edge-featured graph attention networks (EGATs) to extend the use of graph neural networks to those tasks learning on graphs with both node and edge features.
By reforming the model structure and the learning process, the new models can accept node and edge features as inputs, incorporate the edge information into feature representations, and iterate both node and edge features in a parallel but mutual way.
arXiv Detail & Related papers (2021-01-19T15:08:12Z)
- Node2Seq: Towards Trainable Convolutions in Graph Neural Networks [59.378148590027735]
We propose a graph network layer, known as Node2Seq, to learn node embeddings with explicitly trainable weights for different neighboring nodes.
For a target node, our method sorts its neighboring nodes via an attention mechanism and then employs 1D convolutional neural networks (CNNs) to enable explicit weights for information aggregation.
In addition, we propose to incorporate non-local information for feature learning in an adaptive manner based on the attention scores.
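An illustrative sketch of such an attention-sort-then-1D-convolution aggregator is shown below; the module and argument names are hypothetical, and this is an approximation of the described idea rather than the authors' implementation.

```python
# Sketch of an attention-sort-then-1D-convolution neighbour aggregator
# (illustrative approximation, not the Node2Seq reference code).
import torch
import torch.nn as nn

class SortConvAggregator(nn.Module):
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.att = nn.Linear(2 * dim, 1)                 # scores target-neighbour pairs
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

    def forward(self, h_target, h_neighbours):
        """h_target: (d,); h_neighbours: (k, d) embeddings of k neighbours."""
        pairs = torch.cat([h_target.expand_as(h_neighbours), h_neighbours], dim=-1)
        scores = self.att(pairs).squeeze(-1)             # (k,) attention scores
        order = torch.argsort(scores, descending=True)   # sort neighbours by score
        seq = h_neighbours[order].t().unsqueeze(0)       # (1, d, k) sequence for Conv1d
        out = self.conv(seq).mean(dim=-1).squeeze(0)     # position-wise weights via conv
        return out                                       # aggregated message, (d,)
```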
arXiv Detail & Related papers (2021-01-06T03:05:37Z)
- Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode the data structure in a high-dimensional and non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z)
- Locally Private Graph Neural Networks [12.473486843211573]
We study the problem of node data privacy, where graph nodes have potentially sensitive data that is kept private.
We develop a privacy-preserving, architecture-agnostic GNN learning algorithm with formal privacy guarantees.
Experiments conducted over real-world datasets demonstrate that our method can maintain a satisfactory level of accuracy with low privacy loss.
arXiv Detail & Related papers (2020-06-09T22:36:06Z)