CogDL: A Comprehensive Library for Graph Deep Learning
- URL: http://arxiv.org/abs/2103.00959v4
- Date: Mon, 17 Apr 2023 10:44:27 GMT
- Title: CogDL: A Comprehensive Library for Graph Deep Learning
- Authors: Yukuo Cen, Zhenyu Hou, Yan Wang, Qibin Chen, Yizhen Luo, Zhongming Yu,
Hengrui Zhang, Xingcheng Yao, Aohan Zeng, Shiguang Guo, Yuxiao Dong, Yang
Yang, Peng Zhang, Guohao Dai, Yu Wang, Chang Zhou, Hongxia Yang, Jie Tang
- Abstract summary: We present CogDL, a library for graph deep learning that allows researchers and practitioners to conduct experiments, compare methods, and build applications with ease and efficiency.
In CogDL, we propose a unified design for the training and evaluation of GNN models for various graph tasks, making it unique among existing graph learning libraries.
We develop efficient sparse operators for CogDL, making it the most competitive graph library in terms of efficiency.
- Score: 55.694091294633054
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have attracted tremendous attention from the
graph learning community in recent years and have been widely adopted in
real-world applications from diverse domains, such as social networks and
biological graphs. The research and applications of graph deep learning present
new challenges, including the sparse nature of graph data, the complicated
training of GNNs, and the non-standard evaluation of graph tasks. To tackle these issues, we
present CogDL, a comprehensive library for graph deep learning that allows
researchers and practitioners to conduct experiments, compare methods, and
build applications with ease and efficiency. In CogDL, we propose a unified
design for the training and evaluation of GNN models for various graph tasks,
making it unique among existing graph learning libraries. By utilizing this
unified trainer, CogDL can optimize the GNN training loop with several training
techniques, such as mixed precision training. Moreover, we develop efficient
sparse operators for CogDL, making it the most competitive graph library in
terms of efficiency. Another important CogDL feature is its focus on ease of
use with the aim of facilitating open and reproducible research of graph
learning. We leverage CogDL to report and maintain benchmark results on
fundamental graph tasks, which can be reproduced and directly used by the
community.
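
To make the unified design concrete, here is a minimal sketch of CogDL's `experiment` entry point, following the usage shown in the project's README. The `fp16` and `seed` keywords, used here only to illustrate the mixed precision and reproducibility features described above, are assumptions and may differ from the library's exact option names.

```python
# Minimal sketch of CogDL's unified experiment API (per the project README).
from cogdl import experiment

# Train and evaluate a GCN on Cora through the unified trainer.
experiment(dataset="cora", model="gcn")

# Hypothetical keywords: enable mixed precision training and run
# several seeds so results can be averaged for benchmarking.
experiment(dataset="cora", model="gcn", seed=[0, 1, 2], fp16=True)
```

Under this design, benchmarking a different model or dataset only requires changing the corresponding string, which is what lets CogDL report and maintain reproducible benchmark results.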
Related papers
- Graph Structure Prompt Learning: A Novel Methodology to Improve Performance of Graph Neural Networks [13.655670509818144]
We propose a novel Graph structure Prompt Learning method (GPL) to enhance the training of graph neural networks (GNNs).
GPL employs task-independent graph structure losses to encourage GNNs to learn intrinsic graph characteristics while simultaneously solving downstream tasks.
In experiments on eleven real-world datasets, GNNs trained with GPL significantly improve over their original performance on node classification, graph classification, and edge prediction tasks.
arXiv Detail & Related papers (2024-07-16T03:59:18Z) - MuseGraph: Graph-oriented Instruction Tuning of Large Language Models
for Generic Graph Mining [41.19687587548107]
Graph Neural Networks (GNNs) need to be re-trained every time they are applied to different graph tasks and datasets.
We propose a novel framework, MuseGraph, which seamlessly integrates the strengths of GNNs and Large Language Models (LLMs).
Our experimental results demonstrate significant improvements in different graph tasks.
arXiv Detail & Related papers (2024-03-02T09:27:32Z) - LasTGL: An Industrial Framework for Large-Scale Temporal Graph Learning [61.4707298969173]
We introduce LasTGL, an industrial framework that integrates unified implementations of common temporal graph learning algorithms.
LasTGL provides comprehensive temporal graph datasets, TGNN models, and utilities, along with well-documented tutorials.
arXiv Detail & Related papers (2023-11-28T08:45:37Z) - An Empirical Study of Retrieval-enhanced Graph Neural Networks [48.99347386689936]
Graph Neural Networks (GNNs) are effective tools for graph representation learning.
We propose a retrieval-enhanced scheme called GRAPHRETRIEVAL, which is agnostic to the choice of graph neural network models.
We conduct comprehensive experiments on 13 datasets, and we observe that GRAPHRETRIEVAL achieves substantial improvements over existing GNNs.
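
As a rough illustration of how a model-agnostic retrieval scheme of this kind can work (a generic sketch, not GRAPHRETRIEVAL's actual implementation; all names below are hypothetical), one can blend a GNN's prediction with a similarity-weighted vote over the labels of retrieved training graphs:

```python
import torch
import torch.nn.functional as F

def retrieval_enhanced_prediction(gnn, graph, store_embs, store_labels,
                                  num_classes, k=5, alpha=0.5):
    """Generic sketch: blend the GNN's own prediction with a soft vote
    from the k most similar training graphs. store_embs (N, d) and
    store_labels (N,) are precomputed over the training set."""
    emb = gnn.embed(graph)            # hypothetical graph-embedding hook
    logits = gnn(graph)               # base prediction, shape (num_classes,)
    sims = F.cosine_similarity(emb.unsqueeze(0), store_embs)   # (N,)
    top = sims.topk(k)
    # Similarity-weighted soft vote over the retrieved labels.
    weights = torch.softmax(top.values, dim=0)
    vote = torch.zeros(num_classes)
    for w, idx in zip(weights, top.indices):
        vote += w * F.one_hot(store_labels[idx], num_classes).float()
    # Blend the model's distribution with the retrieval vote.
    return alpha * logits.softmax(-1) + (1 - alpha) * vote
```

Because the retrieval step only consumes embeddings and logits, the base GNN can be swapped freely, which matches the model-agnostic property claimed above.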
arXiv Detail & Related papers (2022-06-01T09:59:09Z) - Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by the data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
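
As a generic sketch of the contrastive step (an illustration of the idea under assumed (N, d) embedding shapes, not the paper's exact objective), the node embeddings produced under the anchor graph and under the learned graph can be treated as positive pairs in an NT-Xent-style loss:

```python
import torch
import torch.nn.functional as F

def anchor_contrastive_loss(z_anchor, z_learned, temperature=0.5):
    """NT-Xent-style sketch: each node's embedding under the anchor
    graph and under the learned graph forms a positive pair; all
    other nodes act as negatives. Both inputs have shape (N, d)."""
    z1 = F.normalize(z_anchor, dim=1)
    z2 = F.normalize(z_learned, dim=1)
    logits = z1 @ z2.t() / temperature        # (N, N) similarity matrix
    targets = torch.arange(z1.size(0))        # positive pairs on the diagonal
    # Symmetrize over both view directions.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Usage sketch (hypothetical encoder): z_anchor = encoder(x, anchor_adj)
#                                      z_learned = encoder(x, learned_adj)
```

Minimizing this loss maximizes agreement between the two views of each node, which is the role the contrastive loss plays in the paradigm described above.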
arXiv Detail & Related papers (2022-01-17T11:57:29Z) - GraphTheta: A Distributed Graph Neural Network Learning System With
Flexible Training Strategy [5.466414428765544]
We present GraphTheta, a new distributed graph learning system.
It supports multiple training strategies and enables efficient and scalable learning on big graphs.
This work represents the largest edge-attributed GNN learning task conducted on a billion-scale network in the literature.
arXiv Detail & Related papers (2021-04-21T14:51:33Z) - Iterative Deep Graph Learning for Graph Neural Networks: Better and
Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL), for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)