Privacy-Preserving Graph Machine Learning from Data to Computation: A
Survey
- URL: http://arxiv.org/abs/2307.04338v1
- Date: Mon, 10 Jul 2023 04:30:23 GMT
- Title: Privacy-Preserving Graph Machine Learning from Data to Computation: A
Survey
- Authors: Dongqi Fu, Wenxuan Bao, Ross Maciejewski, Hanghang Tong, Jingrui He
- Abstract summary: We focus on reviewing privacy-preserving techniques for graph machine learning.
We first review methods for generating privacy-preserving graph data.
Then we describe methods for transmitting privacy-preserved information.
- Score: 67.7834898542701
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In graph machine learning, data collection, sharing, and analysis often
involve multiple parties, each of which may require varying levels of data
security and privacy. To this end, preserving privacy is of great importance in
protecting sensitive information. In the era of big data, the relationships
among data entities have become unprecedentedly complex, and more applications
utilize advanced data structures (i.e., graphs) that can support network
structures and relevant attribute information. To date, many graph-based AI
models have been proposed (e.g., graph neural networks) for various domain
tasks, like computer vision and natural language processing. In this paper, we
focus on reviewing privacy-preserving techniques for graph machine learning. We
systematically review related work from the data aspect to the computational aspect.
We first review methods for generating privacy-preserving graph data. Then we
describe methods for transmitting privacy-preserved information (e.g., graph
model parameters) to realize the optimization-based computation when data
sharing among multiple parties is risky or impossible. In addition to
discussing relevant theoretical methodology and software tools, we also discuss
current challenges and highlight several possible future research opportunities
for privacy-preserving graph machine learning. Finally, we envision a unified
and comprehensive secure graph machine learning system.
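To make the computation-level direction above concrete, the following is a minimal sketch of FedAvg-style training in which each party keeps its graph private and shares only model parameters with a coordinator. The names (LocalParty, fed_avg, local_step) and the toy linear aggregation layer are illustrative assumptions, not the survey's or any cited paper's implementation.

```python
# Sketch: parties train locally on private graphs; only parameters move.
import numpy as np

def fed_avg(param_sets, weights):
    """Weighted average of per-party parameter dictionaries."""
    total = sum(weights)
    return {k: sum(w * p[k] for p, w in zip(param_sets, weights)) / total
            for k in param_sets[0]}

class LocalParty:
    """Holds a private graph; only parameter updates ever leave this object."""
    def __init__(self, num_nodes, dim, seed):
        rng = np.random.default_rng(seed)
        self.adj = (rng.random((num_nodes, num_nodes)) < 0.1).astype(float)
        self.feats = rng.normal(size=(num_nodes, dim))
        self.n = num_nodes

    def local_step(self, global_params, lr=0.1):
        # Toy local update: one gradient step of a linear layer W that
        # reconstructs node features from mean-aggregated neighbor features
        # (a stand-in for real GNN training on the private graph).
        W = global_params["W"]
        deg = np.maximum(self.adj.sum(axis=1, keepdims=True), 1.0)
        agg = (self.adj @ self.feats) / deg
        grad = agg.T @ (agg @ W - self.feats) / self.n
        return {"W": W - lr * grad}

# One communication round: raw graphs stay local, only parameters move.
dim = 16
global_params = {"W": np.zeros((dim, dim))}
parties = [LocalParty(num_nodes=50, dim=dim, seed=s) for s in range(3)]
updates = [p.local_step(global_params) for p in parties]
global_params = fed_avg(updates, weights=[p.n for p in parties])
print(global_params["W"].shape)  # (16, 16)
```

In a real deployment the shared updates would typically receive further protection, for example secure aggregation or differentially private noise, before leaving each party.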
Related papers
- Privately Learning from Graphs with Applications in Fine-tuning Large Language Models [16.972086279204174]
Relational data in sensitive domains such as finance and healthcare often contain private information.
Existing privacy-preserving methods, such as DP-SGD (a generic sketch follows this list), are not well-suited for relational learning.
We propose a privacy-preserving relational learning pipeline that decouples dependencies in sampled relations during training.
arXiv Detail & Related papers (2024-10-10T18:38:38Z) - On Responsible Machine Learning Datasets with Fairness, Privacy, and Regulatory Norms [56.119374302685934]
There have been severe concerns over the trustworthiness of AI technologies.
Machine and deep learning algorithms depend heavily on the data used during their development.
We propose a framework to evaluate the datasets through a responsible rubric.
arXiv Detail & Related papers (2023-10-24T14:01:53Z) - Data-centric Graph Learning: A Survey [37.849198493911736]
We propose a novel taxonomy based on the stages in the graph learning pipeline.
We analyze some potential problems embedded in graph data and discuss how to solve them in a data-centric manner.
arXiv Detail & Related papers (2023-10-08T03:17:22Z) - Towards Data-centric Graph Machine Learning: Review and Outlook [120.64417630324378]
We introduce a systematic framework, Data-centric Graph Machine Learning (DC-GML), that encompasses all stages of the graph data lifecycle.
A thorough taxonomy of each stage is presented to answer three critical graph-centric questions.
We pinpoint the future prospects of the DC-GML domain, providing insights to navigate its advancements and applications.
arXiv Detail & Related papers (2023-09-20T00:40:13Z) - Independent Distribution Regularization for Private Graph Embedding [55.24441467292359]
Graph embeddings are susceptible to attribute inference attacks, which allow attackers to infer private node attributes from the learned graph embeddings.
To address these concerns, privacy-preserving graph embedding methods have emerged.
We propose a novel approach called Private Variational Graph AutoEncoders (PVGAE) with the aid of independent distribution penalty as a regularization term.
arXiv Detail & Related papers (2023-08-16T13:32:43Z) - Free Lunch for Privacy Preserving Distributed Graph Learning [1.8292714902548342]
We present a novel privacy-respecting framework for distributed graph learning and graph-based machine learning.
This framework aims to learn features as well as distances without requiring actual features while preserving the original structural properties of the raw data.
arXiv Detail & Related papers (2023-05-18T10:41:21Z) - Privacy-preserving Graph Analytics: Secure Generation and Federated
Learning [72.90158604032194]
We focus on the privacy-preserving analysis of graph data, which provides the crucial capacity to represent rich attributes and relationships.
We discuss two directions, namely privacy-preserving graph generation and federated graph learning, which can jointly enable the collaboration among multiple parties each possessing private graph data.
arXiv Detail & Related papers (2022-06-30T18:26:57Z) - Data Augmentation for Deep Graph Learning: A Survey [66.04015540536027]
We first propose a taxonomy for graph data augmentation and then provide a structured review by categorizing the related work based on the augmented information modalities.
Focusing on the two challenging problems in DGL (i.e., optimal graph learning and low-resource graph learning), we also discuss and review the existing learning paradigms which are based on graph data augmentation.
arXiv Detail & Related papers (2022-02-16T18:30:33Z)
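For context on the DP-SGD comparison flagged in the first related paper above, here is a generic, minimal sketch of per-example gradient clipping and Gaussian noising for a toy linear model; the function and variable names are illustrative assumptions, not that paper's pipeline. The sketch assumes independent examples, which is exactly what sampled relations in graph data violate and what that paper's decoupling strategy targets.

```python
# Generic DP-SGD-style step: clip each per-example gradient, add noise,
# then average, so no single example dominates the update.
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD-style step for a toy linear regression model."""
    rng = rng if rng is not None else np.random.default_rng(0)
    clipped = []
    for xi, yi in zip(X, y):
        # Per-example gradient of the squared error (x @ w - y)^2.
        g = 2.0 * (xi @ w - yi) * xi
        # Clip to norm clip_norm so one example's influence is bounded.
        scale = min(1.0, clip_norm / max(np.linalg.norm(g), 1e-12))
        clipped.append(scale * g)
    # Gaussian noise calibrated to the clipping norm.
    noise = rng.normal(scale=noise_multiplier * clip_norm, size=w.shape)
    grad = (np.sum(clipped, axis=0) + noise) / len(X)
    return w - lr * grad

# Usage on synthetic, independent examples. With relational/graph data the
# notion of an independent per-example gradient breaks down, which is the
# mismatch the paper above addresses by decoupling sampled relations.
rng = np.random.default_rng(42)
X = rng.normal(size=(128, 8))
w_true = rng.normal(size=8)
y = X @ w_true + 0.1 * rng.normal(size=128)
w = np.zeros(8)
for _ in range(200):
    w = dp_sgd_step(w, X, y, rng=rng)
```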
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.