DIG: A Turnkey Library for Diving into Graph Deep Learning Research
- URL: http://arxiv.org/abs/2103.12608v1
- Date: Tue, 23 Mar 2021 15:05:10 GMT
- Title: DIG: A Turnkey Library for Diving into Graph Deep Learning Research
- Authors: Meng Liu, Youzhi Luo, Limei Wang, Yaochen Xie, Hao Yuan, Shurui Gui,
Zhao Xu, Haiyang Yu, Jingtun Zhang, Yi Liu, Keqiang Yan, Bora Oztekin, Haoran
Liu, Xuan Zhang, Cong Fu, Shuiwang Ji
- Abstract summary: DIG: Dive into Graphs is a research-oriented library that integrates unified and extensible implementations of common graph deep learning algorithms for several advanced tasks.
For each direction, we provide unified implementations of data interfaces, common algorithms, and evaluation metrics.
- Score: 39.58666190541479
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although several libraries for deep learning on graphs exist, they mainly
aim at implementing basic operations for graph deep learning. In the
research community, implementing and benchmarking various advanced tasks are
still painful and time-consuming with existing libraries. To facilitate graph
deep learning research, we introduce DIG: Dive into Graphs, a research-oriented
library that integrates unified and extensible implementations of common graph
deep learning algorithms for several advanced tasks. Currently, we consider
graph generation, self-supervised learning on graphs, explainability of graph
neural networks, and deep learning on 3D graphs. For each direction, we provide
unified implementations of data interfaces, common algorithms, and evaluation
metrics. Altogether, DIG is an extensible, open-source, and turnkey library for
researchers to develop new methods and effortlessly compare with common
baselines using widely used datasets and evaluation metrics. Source code and
documentation are available at https://github.com/divelab/DIG/.
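To make the "unified data interfaces, common algorithms, and evaluation metrics" claim concrete, below is a hedged sketch of the kind of workflow DIG supports, using the 3D-graph direction as an example. The module and class names (dig.threedgraph.dataset.QM93D, SchNet, ThreeDEvaluator, run) and the argument lists follow DIG's documentation as best recalled and should be treated as assumptions; check https://github.com/divelab/DIG/ for the current API.

```python
# Hedged sketch: DIG's data interface -> algorithm -> evaluation pipeline
# for the 3D-graph direction. Names and signatures are assumptions based
# on DIG's documentation; verify against the repository before use.
import torch
from dig.threedgraph.dataset import QM93D               # unified data interface
from dig.threedgraph.method import SchNet, run          # common algorithm + trainer
from dig.threedgraph.evaluation import ThreeDEvaluator  # unified evaluation metric

dataset = QM93D(root='dataset/')
dataset.data.y = dataset.data['U0']                     # pick a regression target
split = dataset.get_idx_split(len(dataset.data.y),
                              train_size=110000, valid_size=10000, seed=42)
train_set = dataset[split['train']]
valid_set = dataset[split['valid']]
test_set = dataset[split['test']]

model = SchNet()                                        # default hyperparameters
run().run('cuda:0', train_set, valid_set, test_set, model,
          loss_func=torch.nn.L1Loss(), evaluation=ThreeDEvaluator(),
          epochs=20, batch_size=32)
```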
Related papers
- Continual Learning on Graphs: Challenges, Solutions, and Opportunities [72.7886669278433]
We provide a comprehensive review of existing continual graph learning (CGL) algorithms.
We compare CGL methods with traditional continual learning techniques and analyze the applicability of those techniques to the forgetting problem on graphs.
We will maintain an up-to-date repository featuring a comprehensive list of accessible algorithms.
arXiv Detail & Related papers (2024-02-18T12:24:45Z)
- MGNet: Learning Correspondences via Multiple Graphs [78.0117352211091]
Correspondence learning aims to find correct correspondences within an initial set that has an uneven correspondence distribution and a low inlier rate.
Recent advances usually use graph neural networks (GNNs) to build a single type of graph or stack local graphs into the global one to complete the task.
We propose MGNet to effectively combine multiple complementary graphs.
arXiv Detail & Related papers (2024-01-10T07:58:44Z)
- Counterfactual Learning on Graphs: A Survey [34.47646823407408]
Graph neural networks (GNNs) have achieved great success in representation learning on graphs.
Counterfactual learning on graphs has shown promising results in alleviating drawbacks of GNNs such as limited explainability and fairness issues.
Various approaches have been proposed for counterfactual fairness, explainability, link prediction and other applications on graphs.
arXiv Detail & Related papers (2023-04-03T21:42:42Z)
- pyGSL: A Graph Structure Learning Toolkit [14.000763778781547]
pyGSL is a Python library that provides efficient implementations of state-of-the-art graph structure learning models.
pyGSL is written in a GPU-friendly way, allowing it to scale to much larger networks (a generic sketch of the underlying idea follows this entry).
arXiv Detail & Related papers (2022-11-07T14:23:10Z)
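As a generic illustration of what "graph structure learning" optimizes, here is a minimal PyTorch sketch: edge logits are treated as parameters and trained against a feature-smoothness objective with connectivity and sparsity regularizers. This is one standard formulation of the idea, not pyGSL's models or API.

```python
# Minimal sketch of differentiable graph structure learning: treat edge
# logits as parameters and optimize a smoothness + connectivity + sparsity
# objective. Illustrates the idea only; pyGSL's models and API differ.
import torch

n, d = 8, 4
x = torch.randn(n, d)                              # toy node features
theta = torch.randn(n, n, requires_grad=True)      # learnable edge logits
opt = torch.optim.Adam([theta], lr=0.05)

for step in range(200):
    a = torch.sigmoid((theta + theta.T) / 2)       # symmetric soft adjacency
    smooth = (a * torch.cdist(x, x).pow(2)).mean() # neighbors -> similar features
    connect = -torch.log(a.sum(dim=1) + 1e-8).mean()  # discourage isolated nodes
    sparse = a.mean()                              # discourage dense graphs
    loss = smooth + 0.5 * connect + 0.1 * sparse
    opt.zero_grad(); loss.backward(); opt.step()

adj = (torch.sigmoid((theta + theta.T) / 2) > 0.5).float()  # discrete graph
```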
- CogDL: A Comprehensive Library for Graph Deep Learning [55.694091294633054]
We present CogDL, a library for graph deep learning that allows researchers and practitioners to conduct experiments, compare methods, and build applications with ease and efficiency.
In CogDL, we propose a unified design for the training and evaluation of GNN models for various graph tasks, making it unique among existing graph learning libraries.
We develop efficient sparse operators for CogDL, enabling it to become the most competitive graph library in terms of efficiency.
arXiv Detail & Related papers (2021-03-01T12:35:16Z)
- DeepWalking Backwards: From Embeddings Back to Graphs [22.085932117823738]
We study whether embeddings can be inverted to (approximately) recover the graph used to generate them.
We present algorithms for accurate embedding inversion - i.e., from the low-dimensional embedding of a graph G, we can find a graph H with a very similar embedding.
Our findings are a step towards a more rigorous understanding of exactly what information embeddings encode about the input graph, and why this information is useful for learning tasks.
arXiv Detail & Related papers (2021-02-17T02:16:12Z)
- Graph Traversal with Tensor Functionals: A Meta-Algorithm for Scalable Learning [29.06880988563846]
Graph Traversal via Tensor Functionals (GTTF) is a unifying meta-algorithm framework for embedding graph algorithms.
We show that, for a wide class of methods, GTTF learns in an unbiased fashion and, in expectation, approximates the learning as if the specialized implementations were run directly (a toy rendering of the functional idea follows this entry).
arXiv Detail & Related papers (2021-02-08T16:52:52Z)
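A toy rendering of the functional idea: one generic stochastic traversal that is specialized by two user-supplied functionals, a transition bias and an accumulator, so that different algorithms become different choices of those two callbacks. Names here are illustrative only and do not reflect GTTF's actual interface.

```python
# Toy meta-algorithm: a fanout-limited stochastic traversal specialized by
# two functionals, bias_fn (transition distribution) and acc_fn (accumulator).
import random
from collections import defaultdict

def traverse(adj, seeds, depth, fanout, bias_fn, acc_fn):
    """Stochastic traversal over an adjacency dict {node: [neighbors]}."""
    frontier = [[s] for s in seeds]
    for _ in range(depth):
        nxt = []
        for path in frontier:
            nbrs = adj[path[-1]]
            if not nbrs:
                continue
            weights = bias_fn(path[-1], nbrs)          # transition bias functional
            for v in random.choices(nbrs, weights=weights,
                                    k=min(fanout, len(nbrs))):
                acc_fn(path + [v])                     # accumulation functional
                nxt.append(path + [v])
        frontier = nxt

# Specializing to DeepWalk-style uniform walks that count co-visits:
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
cooccur = defaultdict(int)

def count_covisit(path):
    cooccur[(path[0], path[-1])] += 1

traverse(adj, seeds=list(adj), depth=3, fanout=1,
         bias_fn=lambda u, nbrs: [1.0] * len(nbrs),    # uniform = simple walk
         acc_fn=count_covisit)
```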
- Efficient Graph Deep Learning in TensorFlow with tf_geometric [53.237754811019464]
We introduce tf_geometric, an efficient and friendly library for graph deep learning.
tf_geometric provides kernel libraries for building Graph Neural Networks (GNNs) as well as implementations of popular GNNs.
The kernel libraries consist of infrastructure for building efficient GNNs, including graph data structures, a graph map-reduce framework, and a graph mini-batch strategy (the map-reduce pattern is sketched after this entry).
arXiv Detail & Related papers (2021-01-27T17:16:36Z)
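To make the "graph map-reduce" phrase concrete, here is a small NumPy sketch of one mean-aggregation message-passing step written as a map over edges (gather) followed by a reduce onto destination nodes (scatter-add). It illustrates the pattern, not tf_geometric's actual kernels.

```python
# Message passing as map-reduce: gather per-edge messages, scatter-add
# them onto destination nodes, then normalize by in-degree.
import numpy as np

x = np.random.rand(4, 3).astype(np.float32)   # node features, shape [N, F]
edge_index = np.array([[0, 1, 2, 3],          # row 0: source nodes
                       [1, 2, 3, 0]])         # row 1: destination nodes

src, dst = edge_index
messages = x[src]                             # map: gather one message per edge
out = np.zeros_like(x)
np.add.at(out, dst, messages)                 # reduce: scatter-add onto nodes

deg = np.bincount(dst, minlength=x.shape[0]).astype(np.float32)
out = out / np.maximum(deg, 1.0)[:, None]     # mean aggregation per node
```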
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, Iterative Deep Graph Learning (IDGL), for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that the proposed IDGL models can consistently outperform or match state-of-the-art baselines (a toy version of the alternating loop follows this entry).
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
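A toy version of the alternating loop IDGL describes: compute embeddings on the current graph, re-estimate the graph from embedding similarities, repeat. This is a NumPy schematic of the idea, not the authors' implementation (which trains both the structure and the GNN end-to-end with gradients).

```python
# Schematic of iterative graph learning: alternate embedding and
# structure estimation until the structure stabilizes.
import numpy as np

def knn_graph(z, k=2):
    """Top-k cosine-similarity graph, symmetrized."""
    zn = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = zn @ zn.T
    np.fill_diagonal(sim, -np.inf)
    a = np.zeros_like(sim)
    for i, row in enumerate(sim):
        a[i, np.argsort(row)[-k:]] = 1.0
    return np.maximum(a, a.T)

def embed(a, x):
    """One-hop mean aggregation, a stand-in for a trained GNN encoder."""
    deg = a.sum(axis=1, keepdims=True)
    return np.where(deg > 0, (a @ x) / np.maximum(deg, 1.0), x)

x = np.random.rand(6, 4)      # node features
a = knn_graph(x)              # initial graph from raw features
for _ in range(5):            # iterate until the structure stabilizes
    z = embed(a, x)           # embeddings on the current graph
    a = knn_graph(z)          # graph re-estimated from the embeddings
```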
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.