Federated Graph Neural Networks: Overview, Techniques and Challenges
- URL: http://arxiv.org/abs/2202.07256v1
- Date: Tue, 15 Feb 2022 09:05:35 GMT
- Title: Federated Graph Neural Networks: Overview, Techniques and Challenges
- Authors: Rui Liu and Han Yu
- Abstract summary: Graph neural networks (GNNs) have received significant research attention.
As societies become increasingly concerned with data privacy, GNNs face the need to adapt to this new normal.
This has led to the rapid development of federated graph neural networks (FedGNNs) research in recent years.
- Score: 16.62839758251491
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With its powerful capability to deal with graph data widely found in
practical applications, graph neural networks (GNNs) have received significant
research attention. However, as societies become increasingly concerned with
data privacy, GNNs face the need to adapt to this new normal. This has led to
the rapid development of federated graph neural networks (FedGNNs) research in
recent years. Although promising, this interdisciplinary field is highly
challenging for interested researchers to enter into. The lack of an insightful
survey on this topic only exacerbates this problem. In this paper, we bridge
this gap by offering a comprehensive survey of this emerging field. We propose
a unique 3-tiered taxonomy of the FedGNNs literature to provide a clear view
into how GNNs work in the context of Federated Learning (FL). It puts existing
works into perspective by analyzing how graph data manifest themselves in FL
settings, how GNN training is performed under different FL system architectures
and degrees of graph data overlap across data silos, and how GNN aggregation is
performed under various FL settings. Through discussions of the advantages and
limitations of existing works, we envision future research directions that can
help build more robust, dynamic, efficient, and interpretable FedGNNs.
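To make the surveyed setting concrete, below is a minimal sketch of horizontal federated GNN training, assuming a FedAvg-style setup in which each data silo holds its own graph, trains a one-layer GCN locally, and a central server averages the local weights. The code and all names in it (Client, gcn_forward, fedavg) are illustrative assumptions, not taken from the paper.

```python
# Minimal illustrative sketch of horizontal FedGNN training (assumed setup,
# not the paper's method): each client is a data silo with its own graph,
# trains a one-layer GCN locally, and a server performs FedAvg aggregation.
import numpy as np

rng = np.random.default_rng(0)

def gcn_forward(adj, feats, weight):
    """One GCN layer: row-normalized neighborhood mean followed by a linear map."""
    adj_hat = adj + np.eye(adj.shape[0])              # add self-loops
    deg = adj_hat.sum(axis=1, keepdims=True)
    return np.tanh((adj_hat / deg) @ feats @ weight)

class Client:
    def __init__(self, num_nodes, feat_dim, out_dim):
        # Each silo holds its own random graph, node features, and node labels.
        self.adj = (rng.random((num_nodes, num_nodes)) < 0.2).astype(float)
        self.adj = np.maximum(self.adj, self.adj.T)   # make it undirected
        self.feats = rng.normal(size=(num_nodes, feat_dim))
        self.labels = rng.integers(0, 2, size=(num_nodes, out_dim)).astype(float)

    def local_update(self, weight, lr=0.05, steps=5):
        """A few steps of finite-difference gradient descent on squared error
        (kept dependency-free; a real system would use autograd)."""
        w = weight.copy()
        for _ in range(steps):
            grad = np.zeros_like(w)
            base = np.mean((gcn_forward(self.adj, self.feats, w) - self.labels) ** 2)
            eps = 1e-4
            for idx in np.ndindex(w.shape):
                w_eps = w.copy()
                w_eps[idx] += eps
                loss = np.mean((gcn_forward(self.adj, self.feats, w_eps) - self.labels) ** 2)
                grad[idx] = (loss - base) / eps
            w -= lr * grad
        return w

def fedavg(weights):
    """Server-side aggregation: unweighted average of the client models."""
    return np.mean(np.stack(weights), axis=0)

feat_dim, out_dim = 4, 1
clients = [Client(num_nodes=10, feat_dim=feat_dim, out_dim=out_dim) for _ in range(3)]
global_w = rng.normal(scale=0.1, size=(feat_dim, out_dim))

for _ in range(5):                                    # federated training rounds
    local_ws = [c.local_update(global_w) for c in clients]
    global_w = fedavg(local_ws)

final_loss = np.mean([
    np.mean((gcn_forward(c.adj, c.feats, global_w) - c.labels) ** 2) for c in clients
])
print(f"mean client loss after federated rounds: {final_loss:.4f}")
```

A real FedGNN system varies along exactly the axes the survey's taxonomy organizes: how graph data manifest in the FL setting, the FL system architecture and the degree of graph data overlap across silos, and how aggregation is performed.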
Related papers
- When Graph Neural Network Meets Causality: Opportunities, Methodologies and An Outlook [23.45046265345568]
Graph Neural Networks (GNNs) have emerged as powerful representation learning tools for capturing complex dependencies within diverse graph-structured data.
However, GNNs have raised serious concerns regarding their trustworthiness, including susceptibility to distribution shift, biases towards certain populations, and lack of explainability.
Integrating causal learning techniques into GNNs has sparked numerous ground-breaking studies, since many of these trustworthiness issues can be alleviated in this way.
arXiv Detail & Related papers (2023-12-19T13:26:14Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- Over-Squashing in Graph Neural Networks: A Comprehensive survey [0.0]
This survey delves into the challenge of over-squashing in Graph Neural Networks (GNNs).
It comprehensively explores the causes, consequences, and mitigation strategies for over-squashing.
Various methodologies are reviewed, including graph rewiring, novel normalization, spectral analysis, and curvature-based strategies.
The survey also discusses the interplay between over-squashing and other GNN limitations, such as over-smoothing.
arXiv Detail & Related papers (2023-08-29T18:46:15Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
The operations in its search space can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Graph-level Neural Networks: Current Progress and Future Directions [61.08696673768116]
Graph-level Neural Networks (GLNNs, deep learning-based graph-level learning methods) have attracted attention due to their superiority in modeling high-dimensional data.
We propose a systematic taxonomy covering GLNNs upon deep neural networks, graph neural networks, and graph pooling.
arXiv Detail & Related papers (2022-05-31T06:16:55Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs (a brief construction sketch follows this entry).
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
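As an aside on the two graph construction methods mentioned in the entry above, the following illustrative sketch (assumed code, not from that paper) contrasts a K-nearest-neighbor graph, which keeps message passing local, with a fully-connected graph, in which every pair of nodes can interact.

```python
# Illustrative sketch (assumed, not from the cited paper) of the two graph
# construction schemes: a K-nearest-neighbor (KNN) graph that keeps only
# local interactions, versus a fully-connected (FC) graph linking all pairs.
import numpy as np

def knn_adjacency(points, k):
    """Connect each point to its k nearest neighbors (Euclidean distance)."""
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)                 # exclude self-distances
    adj = np.zeros_like(dists)
    nearest = np.argsort(dists, axis=1)[:, :k]
    rows = np.repeat(np.arange(len(points)), k)
    adj[rows, nearest.ravel()] = 1.0
    return np.maximum(adj, adj.T)                   # symmetrize

def fc_adjacency(points):
    """Fully-connected graph: every pair of distinct points is linked."""
    n = len(points)
    return np.ones((n, n)) - np.eye(n)

pts = np.random.default_rng(0).normal(size=(8, 3))  # e.g., 8 atoms in 3-D space
print(knn_adjacency(pts, k=3).sum(), fc_adjacency(pts).sum())  # edge counts differ
```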
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks [68.64678614325193]
Graph Neural Network (GNN) research is rapidly growing thanks to the capacity of GNNs to learn representations from graph-structured data.
Centralizing a massive amount of real-world graph data for GNN training is prohibitive due to user-side privacy concerns.
We introduce FedGraphNN, an open research federated learning system and a benchmark to facilitate GNN-based FL research.
arXiv Detail & Related papers (2021-04-14T22:11:35Z)
- Computing Graph Neural Networks: A Survey from Algorithms to Accelerators [2.491032752533246]
Graph Neural Networks (GNNs) have exploded onto the machine learning scene in recent years owing to their capability to model and learn from graph-structured data.
This paper makes two main contributions: a review of the field of GNNs from the perspective of computing, and an in-depth analysis of current software and hardware acceleration schemes.
arXiv Detail & Related papers (2020-09-30T22:29:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.