Graph Neural Networks Designed for Different Graph Types: A Survey
- URL: http://arxiv.org/abs/2204.03080v5
- Date: Wed, 26 Apr 2023 11:27:50 GMT
- Title: Graph Neural Networks Designed for Different Graph Types: A Survey
- Authors: Josephine M. Thomas, Alice Moallemy-Oureh, Silvia Beddar-Wiesing, and Clara Holzhüter
- Abstract summary: Graph Neural Networks (GNNs) address cutting-edge problems based on graph data.
It has not yet been established which GNNs can process which graph types.
We give a detailed overview of already existing GNNs and categorize them according to their ability to handle different graph types.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Graphs are ubiquitous in nature and can therefore serve as models for many
practical but also theoretical problems. For this purpose, they can be defined
as many different types which suitably reflect the individual contexts of the
represented problem. To address cutting-edge problems based on graph data, the
research field of Graph Neural Networks (GNNs) has emerged. Despite the field's
youth and the speed at which new models are developed, many recent surveys have
been published to keep track of them. Nevertheless, it has not yet been
established which GNNs can process which graph types. In this survey, we
give a detailed overview of already existing GNNs and, unlike previous surveys,
categorize them according to their ability to handle different graph types and
properties. We consider GNNs operating on static and dynamic graphs of
different structural constitutions, with or without node or edge attributes.
Moreover, we distinguish between GNN models for discrete-time or
continuous-time dynamic graphs and group the models according to their
architecture. We find that there are still graph types that are not or only
rarely covered by existing GNN models. We point out where models are missing
and give potential reasons for their absence.
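To make the categorization concrete, here is a minimal sketch, not taken from the survey itself, of how its taxonomy axes (static vs. dynamic structure, discrete vs. continuous time, presence of node/edge attributes) could be encoded as a data structure; all names are illustrative.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Dynamics(Enum):
    STATIC = auto()
    DISCRETE_TIME = auto()     # sequence of graph snapshots
    CONTINUOUS_TIME = auto()   # timestamped event stream

@dataclass(frozen=True)
class GraphType:
    """One cell of a survey-style taxonomy (illustrative names)."""
    dynamics: Dynamics
    node_attributes: bool
    edge_attributes: bool
    heterogeneous: bool = False  # multiple node/edge types

# Example: a continuous-time dynamic graph with node features only.
ctdg = GraphType(Dynamics.CONTINUOUS_TIME, node_attributes=True,
                 edge_attributes=False)
print(ctdg)
```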
Related papers
- Graph Rewriting for Graph Neural Networks [0.0]
Graph Neural Networks (GNNs) support the inference of nodes, edges, attributes, or graph properties.
Graph Rewriting investigates the rule-based manipulation of graphs to model complex graph transformations.
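Such rewrite rules take the form L -> R: find a match of a left-hand pattern in the host graph and replace it with a right-hand side. A minimal sketch with networkx, using an illustrative rule of our own (contracting degree-2 nodes), not one from the paper:

```python
import networkx as nx

def smooth_degree2_nodes(g: nx.Graph) -> nx.Graph:
    """Illustrative rewrite rule: replace each path u-v-w through a
    degree-2 node v (the left-hand side) by a direct edge u-w (the
    right-hand side)."""
    h = g.copy()
    for v in list(h.nodes):
        if h.degree(v) == 2:
            u, w = h.neighbors(v)
            h.remove_node(v)        # delete the matched pattern
            h.add_edge(u, w)        # insert the right-hand side
    return h

g = nx.path_graph(5)                           # 0-1-2-3-4
print(sorted(smooth_degree2_nodes(g).edges))   # [(0, 4)]
```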
arXiv Detail & Related papers (2023-05-29T21:48:19Z)
- What Do GNNs Actually Learn? Towards Understanding their Representations [26.77596449192451]
We investigate which properties of graphs are captured purely by graph neural networks (GNNs).
We show that two of the studied GNN models embed all nodes into the same feature vector, while the other two generate representations related to the number of walks over the input graph.
Strikingly, structurally dissimilar nodes can have similar representations at some layer $k>1$, if they have the same number of walks of length $k$.
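These walk counts are directly computable: the row sums of the k-th power of the adjacency matrix A count the walks of length k starting at each node. The sketch below (example graphs chosen by us) shows why such counts cannot separate a 6-cycle node from a triangle node, since both graphs are 2-regular:

```python
import numpy as np
import networkx as nx

# Two structurally different 2-regular graphs: a 6-cycle and two triangles.
c6 = nx.cycle_graph(6)
two_triangles = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))

for name, g in [("C6", c6), ("2xC3", two_triangles)]:
    A = nx.to_numpy_array(g)
    for k in (1, 2, 3):
        # Row sums of A^k = number of length-k walks starting at each node.
        print(name, k, np.linalg.matrix_power(A, k).sum(axis=1))
# Every node in both graphs starts exactly 2^k walks of length k, so
# walk-count-based representations cannot tell a hexagon node from a
# triangle node.
```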
arXiv Detail & Related papers (2023-04-21T09:52:19Z)
- Probing Graph Representations [77.7361299039905]
We use a probing framework to quantify the amount of meaningful information captured in graph representations.
Our findings on molecular datasets show the potential of probing for understanding the inductive biases of graph-based models.
We advocate for probing as a useful diagnostic tool for evaluating graph-based models.
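A probe in this sense is simply a lightweight classifier trained to predict a known property from frozen representations; high probe accuracy suggests the property is encoded in the embedding. A minimal sketch with scikit-learn, using synthetic embeddings and a synthetic target property as placeholders:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder: frozen graph embeddings (e.g., from a pretrained GNN)
# and a structural property to probe for (here: a synthetic binary label).
emb = rng.normal(size=(500, 64))
prop = (emb[:, :8].sum(axis=1) > 0).astype(int)  # property partly encoded

X_tr, X_te, y_tr, y_te = train_test_split(emb, prop, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("probe accuracy:", probe.score(X_te, y_te))
# Near-chance accuracy would suggest the property is absent from the
# representation; high accuracy suggests it is captured.
```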
arXiv Detail & Related papers (2023-03-07T14:58:18Z)
- A Topological characterisation of Weisfeiler-Leman equivalence classes [0.0]
Graph Neural Networks (GNNs) are learning models aimed at processing graphs and signals on graphs.
In this article, we rely on the theory of covering spaces to fully characterize the classes of graphs that GNNs cannot distinguish.
We show that the number of indistinguishable graphs in our dataset grows super-exponentially with the number of nodes.
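The classes in question are those of the 1-dimensional Weisfeiler-Leman (1-WL) test, which bounds the expressivity of standard message-passing GNNs. A minimal sketch of 1-WL color refinement on a classic indistinguishable pair (our choice of example graphs, not the paper's dataset):

```python
import networkx as nx

def wl_histogram(g: nx.Graph, rounds: int = 3):
    """1-WL color refinement: returns the final multiset of node colors."""
    colors = {v: g.degree(v) for v in g}  # initial colors from degrees
    for _ in range(rounds):
        # New color = hash of own color plus the multiset of neighbor colors.
        colors = {v: hash((colors[v], tuple(sorted(colors[u] for u in g[v]))))
                  for v in g}
    return sorted(colors.values())

c6 = nx.cycle_graph(6)
two_triangles = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))
# Non-isomorphic, yet 1-WL (and hence standard message-passing GNNs)
# produces identical color histograms:
print(wl_histogram(c6) == wl_histogram(two_triangles))  # True
```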
arXiv Detail & Related papers (2022-06-23T17:28:55Z)
- Graph-level Neural Networks: Current Progress and Future Directions [61.08696673768116]
Graph-level Neural Networks (GLNNs), i.e., deep-learning-based graph-level learning methods, have attracted attention due to their strength in modeling high-dimensional data.
We propose a systematic taxonomy covering GLNNs upon deep neural networks, graph neural networks, and graph pooling.
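Graph pooling, one pillar of this taxonomy, reduces a variable-size set of node embeddings to a single graph-level vector. A minimal sketch of the standard sum/mean/max readouts in plain NumPy (illustrative only):

```python
import numpy as np

def readout(node_embeddings: np.ndarray, mode: str = "mean") -> np.ndarray:
    """Collapse an (n_nodes, d) embedding matrix to one graph-level vector."""
    if mode == "sum":
        return node_embeddings.sum(axis=0)   # sensitive to graph size
    if mode == "mean":
        return node_embeddings.mean(axis=0)  # size-invariant
    if mode == "max":
        return node_embeddings.max(axis=0)   # picks strongest activations
    raise ValueError(mode)

h = np.random.default_rng(0).normal(size=(5, 16))  # 5 nodes, 16-dim
print(readout(h, "mean").shape)                    # (16,)
```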
arXiv Detail & Related papers (2022-05-31T06:16:55Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
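Heterophily is commonly quantified via the edge homophily ratio, the fraction of edges whose endpoints share a label; values near 0 indicate strongly heterophilic graphs. A short sketch on toy data:

```python
import networkx as nx

def edge_homophily(g: nx.Graph, labels: dict) -> float:
    """Fraction of edges connecting same-label nodes (1.0 = homophilic)."""
    same = sum(labels[u] == labels[v] for u, v in g.edges)
    return same / g.number_of_edges()

g = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0)])
labels = {0: "a", 1: "b", 2: "a", 3: "b"}   # perfectly alternating labels
print(edge_homophily(g, labels))            # 0.0 -> strongly heterophilic
```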
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) that graph filters have a transferability-discriminability tradeoff that in GNNs is alleviated by the scattering behavior of the nonlinearity.
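The graph filters referred to are polynomials of a graph shift operator, y = sum_k c_k S^k x; because the coefficients c_k are independent of the graph, the same filter can be evaluated on graphs of any size, which is what makes transference meaningful. A minimal sketch, with the normalized adjacency as our (illustrative) choice of shift operator:

```python
import numpy as np
import networkx as nx

def apply_filter(g: nx.Graph, coeffs, x):
    """Polynomial graph filter y = sum_k c_k S^k x, with S the
    degree-normalized adjacency (an illustrative choice of shift)."""
    A = nx.to_numpy_array(g)
    d = A.sum(axis=1)
    S = A / np.sqrt(np.outer(d, d))          # D^{-1/2} A D^{-1/2}
    y, Skx = np.zeros_like(x), x.copy()
    for c in coeffs:
        y += c * Skx
        Skx = S @ Skx
    return y

coeffs = [1.0, 0.5, 0.25]                    # same filter for both graphs
for n in (20, 2000):                         # moderate vs. large graph
    g = nx.cycle_graph(n)
    x = np.random.default_rng(0).normal(size=n)
    print(n, apply_filter(g, coeffs, x).shape)
```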
arXiv Detail & Related papers (2021-12-09T00:08:09Z)
- Meta-Inductive Node Classification across Graphs [6.0471030308057285]
We propose a novel meta-inductive framework called MI-GNN to customize the inductive model to each graph.
MI-GNN does not directly learn an inductive model; it learns the general knowledge of how to train a model for semi-supervised node classification on new graphs.
Extensive experiments on five real-world graph collections demonstrate the effectiveness of our proposed model.
arXiv Detail & Related papers (2021-05-14T09:16:28Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structural features for graph representation learning.
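DE augments each node's features with distance-based structural features relative to a target node set. The sketch below implements a shortest-path-distance variant; it is our simplification of the idea, and the function name and clipping scheme are illustrative:

```python
import networkx as nx
import numpy as np

def spd_encoding(g: nx.Graph, target_set, max_dist: int = 3) -> np.ndarray:
    """For each node, a one-hot encoding of its shortest-path distance
    to the target node set (distances above max_dist are clipped)."""
    dist = {v: max_dist for v in g}          # unreachable -> clipped
    for t in target_set:
        for v, d in nx.single_source_shortest_path_length(g, t).items():
            dist[v] = min(dist[v], d, max_dist)
    enc = np.zeros((g.number_of_nodes(), max_dist + 1))
    for i, v in enumerate(g.nodes):
        enc[i, dist[v]] = 1.0
    return enc  # concatenate to node features before message passing

g = nx.path_graph(6)
print(spd_encoding(g, target_set={0}))
```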
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black-boxes and lack human intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z)
- Incomplete Graph Representation and Learning via Partial Graph Neural Networks [7.227805463462352]
In many applications, graphs may come in an incomplete form in which the attributes of some nodes are unknown or missing.
Existing GNNs are generally designed for complete graphs and cannot deal with attribute-incomplete graph data directly.
We develop novel partial-aggregation-based GNNs, named Partial Graph Neural Networks (PaGNNs), for attribute-incomplete graph representation and learning.
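The idea behind partial aggregation is to aggregate messages only from neighbors whose attributes are observed, normalizing by the number of observed neighbors rather than the full degree. A minimal sketch reflecting our reading of the idea, not the paper's exact operator:

```python
import numpy as np

def partial_mean_aggregate(A: np.ndarray, X: np.ndarray,
                           observed: np.ndarray) -> np.ndarray:
    """Mean-aggregate neighbor features, skipping nodes whose attributes
    are missing. A: (n,n) adjacency, X: (n,d) features (rows of
    unobserved nodes are ignored), observed: (n,) boolean mask."""
    M = A * observed[None, :]                # keep only observed neighbors
    counts = M.sum(axis=1, keepdims=True)
    counts[counts == 0] = 1.0                # avoid division by zero
    return (M @ X) / counts

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0], [0.0, 2.0], [5.0, 5.0]])
observed = np.array([True, True, False])     # node 2's attributes missing
print(partial_mean_aggregate(A, X, observed))
```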
arXiv Detail & Related papers (2020-03-23T08:29:59Z)