Towards a Taxonomy of Graph Learning Datasets
- URL: http://arxiv.org/abs/2110.14809v1
- Date: Wed, 27 Oct 2021 23:08:01 GMT
- Title: Towards a Taxonomy of Graph Learning Datasets
- Authors: Renming Liu, Semih Cantürk, Frederik Wenkel, Dylan Sandfelder, Devin
Kreuzer, Anna Little, Sarah McGuire, Leslie O'Bray, Michael Perlmutter,
Bastian Rieck, Matthew Hirn, Guy Wolf and Ladislav Rampášek
- Abstract summary: Graph neural networks (GNNs) have attracted much attention due to their ability to leverage the intrinsic geometries of the underlying data.
Here, we provide a principled approach to taxonomize graph benchmarking datasets by carefully designing a collection of graph perturbations.
Our data-driven taxonomization of graph datasets provides a new understanding of critical dataset characteristics.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have attracted much attention due to their
ability to leverage the intrinsic geometries of the underlying data. Although
many different types of GNN models have been developed, with many benchmarking
procedures to demonstrate the superiority of one GNN model over the others,
there is a lack of systematic understanding of the underlying benchmarking
datasets, and what aspects of the model are being tested. Here, we provide a
principled approach to taxonomize graph benchmarking datasets by carefully
designing a collection of graph perturbations to probe the essential data
characteristics that GNN models leverage to perform predictions. Our
data-driven taxonomization of graph datasets provides a new understanding of
critical dataset characteristics that will enable better model evaluation and
the development of more specialized GNN models.
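The abstract describes probing datasets with a collection of graph perturbations. As a minimal sketch of one such perturbation, the hypothetical helper below randomly rewires a fraction of a graph's edges while preserving the edge count; the paper's actual perturbation suite is broader and its exact operations may differ.

```python
import random

def rewire_edges(edges, num_nodes, fraction, seed=0):
    """Randomly rewire a fraction of edges, preserving the edge count.

    Illustrative perturbation only; not the paper's exact procedure.
    `edges` is a list of (u, v) pairs on nodes 0..num_nodes-1.
    """
    rng = random.Random(seed)
    # Normalize to undirected edges so (u, v) and (v, u) are the same edge.
    edge_set = {tuple(sorted(e)) for e in edges}
    for edge in rng.sample(sorted(edge_set), int(fraction * len(edge_set))):
        edge_set.remove(edge)
        # Replace with a fresh random edge (no self-loops, no duplicates).
        while True:
            a, b = rng.sample(range(num_nodes), 2)
            candidate = (min(a, b), max(a, b))
            if candidate not in edge_set:
                edge_set.add(candidate)
                break
    return sorted(edge_set)

# Example: perturb a 10-node ring graph.
ring = [(i, (i + 1) % 10) for i in range(10)]
perturbed = rewire_edges(ring, num_nodes=10, fraction=0.3)
```

Comparing model accuracy on the original versus perturbed graphs is what lets the authors ask which structural properties a GNN actually relies on.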
Related papers
- A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z) - A Metadata-Driven Approach to Understand Graph Neural Networks [17.240017543449735]
We propose a metadata-driven approach to analyze the sensitivity of GNNs to graph data properties.
Our theoretical findings reveal that datasets with more balanced degree distribution exhibit better linear separability of node representations.
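The summary above ties GNN behavior to how balanced a dataset's degree distribution is. One hypothetical way to quantify that balance (the paper's exact metric may differ) is the normalized entropy of the degree histogram:

```python
import math
from collections import Counter

def degree_entropy(degrees):
    """Normalized Shannon entropy of the degree histogram.

    Hypothetical proxy for a 'balanced degree distribution': values near
    1.0 mean the distinct degree values occur with similar frequency;
    a regular graph (single degree value) returns 0.0 by convention.
    """
    counts = Counter(degrees)
    if len(counts) <= 1:
        return 0.0
    total = len(degrees)
    entropy = -sum((c / total) * math.log(c / total) for c in counts.values())
    return entropy / math.log(len(counts))  # normalize to [0, 1]
```

A metric like this could be computed per dataset and correlated against model accuracy, which is the spirit of the metadata-driven analysis described above.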
arXiv Detail & Related papers (2023-10-30T04:25:02Z) - GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, that aims to assess the performance of a specific GNN model trained on labeled and observed graphs.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z) - Global Minima, Recoverability Thresholds, and Higher-Order Structure in
GNNs [0.0]
We analyze the performance of graph neural network (GNN) architectures from the perspective of random graph theory.
We show how both specific higher-order structures in synthetic data and the mix of empirical structures in real data have dramatic effects on GNN performance.
arXiv Detail & Related papers (2023-10-11T17:16:33Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Analysis of different temporal graph neural network configurations on
dynamic graphs [0.0]
This project aims to address the gap in the literature by performing a qualitative analysis of spatial-temporal dependence structure learning on dynamic graphs.
An extensive ablation study will be conducted on different variants of the best-performing TGN to identify the key factors contributing to its performance.
By achieving these objectives, this project will provide valuable insights into the design and optimization of TGNs for dynamic graph analysis.
arXiv Detail & Related papers (2023-05-02T00:07:33Z) - MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z) - Taxonomy of Benchmarks in Graph Representation Learning [14.358071994798964]
Graph Neural Networks (GNNs) extend the success of neural networks to graph-structured data by accounting for their intrinsic geometry.
It is currently not well understood what aspects of a given model are probed by graph representation learning benchmarks.
Here, we develop a principled approach to taxonomize benchmarking datasets according to a sensitivity profile that is based on how much GNN performance changes due to a collection of graph perturbations.
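A sensitivity profile, as described here, summarizes how much performance drops under each perturbation. A crude sketch (illustrative, not the paper's exact definition) might compute the relative score change per perturbation:

```python
def sensitivity_profile(base_score, perturbed_scores):
    """Relative performance change per perturbation.

    `perturbed_scores` maps perturbation name -> model score on the
    perturbed dataset; the result is a vector-like profile that can be
    compared across datasets. Hypothetical names and numbers below.
    """
    return {
        name: (base_score - score) / base_score
        for name, score in perturbed_scores.items()
    }

profile = sensitivity_profile(
    base_score=0.80,
    perturbed_scores={"rewire_edges": 0.60, "drop_features": 0.78},
)
```

Datasets whose profiles cluster together would then land in the same taxonomic group, since models depend on the same structural signals in both.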
arXiv Detail & Related papers (2022-06-15T18:01:10Z) - EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural
Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
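EvenNet's core idea, per the summary above, is a spectral filter that is an even polynomial of the graph Laplacian, so information propagates only over even-hop neighborhoods. A minimal sketch of that idea (not EvenNet's actual architecture; the coefficients here are free parameters) is:

```python
import numpy as np

def even_filter(adj, signal, thetas):
    """Apply the even-polynomial filter sum_k thetas[k] * L^(2k) to a
    node signal, where L is the symmetric normalized Laplacian.

    Sketch of the even-hop idea only; EvenNet's real filter and training
    procedure are defined in the paper.
    """
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt
    lap_sq = lap @ lap                    # only even powers of L appear
    out = np.zeros_like(signal, dtype=float)
    power = np.eye(len(adj))              # L^0
    for theta in thetas:
        out += theta * (power @ signal)
        power = power @ lap_sq            # advance two hops per term
    return out
```

Because odd powers of L never appear, the filter's frequency response is symmetric, which is the property the paper links to robustness under homophily changes.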
arXiv Detail & Related papers (2022-05-27T10:48:14Z) - GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve significant performance for various learning tasks on geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
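The denoising view above casts GNN aggregation as (approximately) minimizing a fidelity-plus-smoothness objective ||F - X||^2 + c * tr(F^T L F). A single gradient-descent step on that objective, sketched below with an unnormalized Laplacian and illustrative step size, already looks like a smoothing aggregation (this is a sketch of the general viewpoint, not the paper's ADA-UGNN model):

```python
import numpy as np

def denoising_step(adj, f, x, step=0.1, c=1.0):
    """One gradient step on ||F - X||^2 + c * tr(F^T L F).

    `x` is the input (noisy) node signal, `f` the current estimate.
    The gradient is 2(F - X) + 2c * L F; constant signals are fixed
    points because L annihilates them.
    """
    lap = np.diag(adj.sum(axis=1)) - adj  # unnormalized graph Laplacian
    grad = 2.0 * (f - x) + 2.0 * c * (lap @ f)
    return f - step * grad
```

Iterating this update pulls each node's value toward its neighbors while staying anchored to the input, mirroring what message-passing layers do.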
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.