Efficient Graph Deep Learning in TensorFlow with tf_geometric
- URL: http://arxiv.org/abs/2101.11552v1
- Date: Wed, 27 Jan 2021 17:16:36 GMT
- Title: Efficient Graph Deep Learning in TensorFlow with tf_geometric
- Authors: Jun Hu, Shengsheng Qian, Quan Fang, Youze Wang, Quan Zhao, Huaiwen
Zhang, Changsheng Xu
- Abstract summary: We introduce tf_geometric, an efficient and friendly library for graph deep learning.
tf_geometric provides kernel libraries for building Graph Neural Networks (GNNs) as well as implementations of popular GNNs.
The kernel libraries consist of infrastructures for building efficient GNNs, including graph data structures, graph map-reduce framework, graph mini-batch strategy, etc.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce tf_geometric, an efficient and friendly library for graph deep
learning, which is compatible with both TensorFlow 1.x and 2.x. tf_geometric
provides kernel libraries for building Graph Neural Networks (GNNs) as well as
implementations of popular GNNs. The kernel libraries consist of
infrastructures for building efficient GNNs, including graph data structures,
graph map-reduce framework, graph mini-batch strategy, etc. These
infrastructures enable tf_geometric to support single-graph computation,
multi-graph computation, graph mini-batch, distributed training, etc.;
therefore, tf_geometric can be used for a variety of graph deep learning tasks,
such as transductive node classification, inductive node classification, link
prediction, and graph classification. Based on the kernel libraries,
tf_geometric implements a variety of popular GNN models for different tasks. To
facilitate the implementation of GNNs, tf_geometric also provides some other
libraries for dataset management, graph sampling, etc. Different from existing
popular GNN libraries, tf_geometric provides not only Object-Oriented
Programming (OOP) APIs, but also Functional APIs, which enable tf_geometric to
handle advanced graph deep learning tasks such as graph meta-learning. The APIs
of tf_geometric are friendly, and they are suitable for both beginners and
experts. In this paper, we first present an overview of tf_geometric's
framework. Then, we conduct experiments on some benchmark datasets and report
the performance of several popular GNN models implemented by tf_geometric.
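The abstract describes a graph map-reduce framework as one of the kernel infrastructures for building efficient GNNs. As a minimal illustration of the general idea only (this is generic TensorFlow, not tf_geometric's actual kernel API, and the function name is hypothetical), a single message-passing step can be expressed as a gather over source nodes followed by a segment-sum reduce over target nodes:

```python
import tensorflow as tf

def map_reduce_step(x, edge_index):
    """One generic message-passing step: map messages along edges,
    then reduce them per target node (illustrative sketch only)."""
    # edge_index has shape [2, num_edges]: row 0 = source nodes, row 1 = target nodes.
    sources, targets = edge_index[0], edge_index[1]
    # Map: each edge carries the feature vector of its source node.
    messages = tf.gather(x, sources)
    # Reduce: sum the incoming messages for every target node.
    num_nodes = tf.shape(x)[0]
    return tf.math.unsorted_segment_sum(messages, targets, num_nodes)

# Toy graph: 3 nodes with 2-dimensional features, 4 directed edges.
x = tf.constant([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
edge_index = tf.constant([[0, 1, 2, 2],
                          [1, 2, 0, 1]])
print(map_reduce_step(x, edge_index))
```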
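The abstract also mentions a graph mini-batch strategy that supports multi-graph computation such as graph classification. A common way to realize this idea, sketched below under the assumption of a standard disjoint-union batching scheme (not tf_geometric's own batching code; all names are hypothetical), is to merge several small graphs into one block-diagonal graph by offsetting edge indices and keeping a per-node graph id for later pooling:

```python
import tensorflow as tf

def build_mini_batch(graphs):
    """Merge a list of (x, edge_index) graphs into one disjoint-union batch.
    Returns batched features, batched edges, and a node-to-graph index
    (illustrative sketch only)."""
    xs, edges, node_graph_index = [], [], []
    node_offset = 0
    for graph_id, (x, edge_index) in enumerate(graphs):
        num_nodes = int(x.shape[0])
        xs.append(x)
        # Shift edge indices so they point at this graph's rows in the batch.
        edges.append(edge_index + node_offset)
        node_graph_index.append(tf.fill([num_nodes], graph_id))
        node_offset += num_nodes
    return (tf.concat(xs, axis=0),
            tf.concat(edges, axis=1),
            tf.concat(node_graph_index, axis=0))

# Two toy graphs with 2 and 3 nodes.
g1 = (tf.ones([2, 4]), tf.constant([[0], [1]]))
g2 = (tf.ones([3, 4]), tf.constant([[0, 1], [1, 2]]))
x, edge_index, node_graph_index = build_mini_batch([g1, g2])
# Graph-level readout: mean of node features per graph.
graph_repr = tf.math.segment_mean(x, node_graph_index)
```

Because each mini-batch is itself just one (disconnected) graph, the same single-graph message-passing code can be reused unchanged, which is what makes disjoint-union batching convenient for graph classification.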
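Finally, the abstract contrasts Object-Oriented APIs with Functional APIs and notes that the latter enable advanced tasks such as graph meta-learning. The sketch below uses only generic TensorFlow with hypothetical names (it is not tf_geometric's API) to show why: a functional GCN-style layer takes its weights as explicit arguments, so a MAML-style inner update can produce "fast weights" that remain differentiable for the outer meta-gradient.

```python
import tensorflow as tf

def functional_gcn_layer(x, edge_index, kernel):
    """Functional graph convolution: the weight matrix is passed in rather
    than stored on a layer object (hypothetical sketch)."""
    sources, targets = edge_index[0], edge_index[1]
    messages = tf.gather(x @ kernel, sources)
    num_nodes = tf.shape(x)[0]
    aggregated = tf.math.unsorted_segment_mean(messages, targets, num_nodes)
    return tf.nn.relu(aggregated)

# Toy data: 4 nodes, 8-dimensional features, binary node labels.
x = tf.random.normal([4, 8])
edge_index = tf.constant([[0, 1, 2, 3], [1, 2, 3, 0]])
labels = tf.constant([0, 1, 0, 1])
kernel = tf.Variable(tf.random.normal([8, 2]))

# MAML-style adaptation: the adapted "fast" kernel is a plain tensor computed
# from the original kernel, so the outer tape can differentiate through it.
with tf.GradientTape() as outer_tape:
    with tf.GradientTape() as inner_tape:
        logits = functional_gcn_layer(x, edge_index, kernel)
        inner_loss = tf.reduce_mean(
            tf.keras.losses.sparse_categorical_crossentropy(
                labels, logits, from_logits=True))
    grad = inner_tape.gradient(inner_loss, kernel)
    fast_kernel = kernel - 0.1 * grad
    outer_logits = functional_gcn_layer(x, edge_index, fast_kernel)
    outer_loss = tf.reduce_mean(
        tf.keras.losses.sparse_categorical_crossentropy(
            labels, outer_logits, from_logits=True))
meta_grad = outer_tape.gradient(outer_loss, kernel)
```

With a purely object-oriented layer whose weights live on the object, re-applying the layer with adapted weights is harder to express, which is the practical motivation for offering a functional API alongside the OOP one.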
Related papers
- MGNet: Learning Correspondences via Multiple Graphs [78.0117352211091]
Learning correspondences aims to find correct correspondences from the initial correspondence set with an uneven correspondence distribution and a low inlier rate.
Recent advances usually use graph neural networks (GNNs) to build a single type of graph or stack local graphs into the global one to complete the task.
We propose MGNet to effectively combine multiple complementary graphs.
arXiv Detail & Related papers (2024-01-10T07:58:44Z)
- GRAN is superior to GraphRNN: node orderings, kernel- and graph embeddings-based metrics for graph generators [0.6816499294108261]
We study kernel-based metrics on distributions of graph invariants and manifold-based and kernel-based metrics in graph embedding space.
We compare GraphRNN and GRAN, two well-known generative models for graphs, and unveil the influence of node orderings.
arXiv Detail & Related papers (2023-07-13T12:07:39Z)
- Graphtester: Exploring Theoretical Boundaries of GNNs on Graph Datasets [10.590698823137755]
We provide a new tool called Graphtester for a comprehensive analysis of the theoretical capabilities of GNNs for various datasets, tasks, and scores.
We use Graphtester to analyze over 40 different graph datasets, determining upper bounds on the performance of various GNNs based on the number of layers.
We show that the tool can also be used for Graph Transformers using positional node encodings, thereby expanding its scope.
arXiv Detail & Related papers (2023-06-30T08:53:23Z)
- pyGSL: A Graph Structure Learning Toolkit [14.000763778781547]
pyGSL is a Python library that provides efficient implementations of state-of-the-art graph structure learning models.
pyGSL is written in a GPU-friendly way, allowing it to scale to much larger network tasks.
arXiv Detail & Related papers (2022-11-07T14:23:10Z)
- NAS-Bench-Graph: Benchmarking Graph Neural Architecture Search [55.75621026447599]
We propose NAS-Bench-Graph, a tailored benchmark that supports unified, reproducible, and efficient evaluations for GraphNAS.
Specifically, we construct a unified, expressive yet compact search space, covering 26,206 unique graph neural network (GNN) architectures.
Based on our proposed benchmark, the performance of GNN architectures can be directly obtained by a look-up table without any further computation.
arXiv Detail & Related papers (2022-06-18T10:17:15Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction [123.20238648121445]
We propose a new self-supervised learning framework, Graph Information Aided Node feature exTraction (GIANT).
GIANT makes use of the eXtreme Multi-label Classification (XMC) formalism, which is crucial for fine-tuning the language model based on graph information.
We demonstrate the superior performance of GIANT over the standard GNN pipeline on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2021-10-29T19:55:12Z)
- Graph Traversal with Tensor Functionals: A Meta-Algorithm for Scalable Learning [29.06880988563846]
Graph Traversal via Functionals(GTTF) is a unifying meta-algorithm framework for embedding graph algorithms.
We show that, for a wide class of methods, our framework learns in an unbiased fashion and, in expectation, approximates the learning as if the specialized implementations were run directly.
arXiv Detail & Related papers (2021-02-08T16:52:52Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.