Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate
Heterogeneous Graph Neural Network
- URL: http://arxiv.org/abs/2202.09177v1
- Date: Fri, 18 Feb 2022 13:11:35 GMT
- Title: Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate
Heterogeneous Graph Neural Network
- Authors: Tianyu Zhao, Cheng Yang, Yibo Li, Quan Gan, Zhenyi Wang, Fengqi Liang,
Huan Zhao, Yingxia Shao, Xiao Wang, Chuan Shi
- Abstract summary: We propose a unified framework covering most HGNNs, consisting of three components: heterogeneous linear transformation, heterogeneous graph transformation, and heterogeneous message passing layer.
We then build a platform, Space4HGNN, by defining a design space for HGNNs based on the unified framework; it offers modularized components, reproducible implementations, and standardized evaluation for HGNNs.
- Score: 51.07168862821267
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous Graph Neural Networks (HGNNs) have been successfully employed in
various tasks, but the importance of their different design dimensions cannot be
accurately assessed, owing to the diversity of architectures and application
scenarios. Besides, in the HGNN research community, implementing and evaluating
models on various tasks still requires much human effort. To mitigate these
issues, we first propose a unified framework covering most HGNNs, consisting of
three components: heterogeneous linear transformation, heterogeneous graph
transformation, and heterogeneous message passing layer. Then we build a
platform, Space4HGNN, by defining a design space for HGNNs based on the unified
framework, which offers modularized components, reproducible implementations,
and standardized evaluation for HGNNs. Finally, we conduct experiments to
analyze the effect of different designs. With the insights found, we distill a
condensed design space and verify its effectiveness.
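The three components above translate naturally into code. Below is a minimal sketch in plain PyTorch, not the official Space4HGNN implementation: the class names, the dictionary-based graph encoding, and the sum aggregation are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not Space4HGNN's actual code) of the unified
# three-component framework described in the abstract.
import torch
import torch.nn as nn

class HeteroLinear(nn.Module):
    """Component 1, heterogeneous linear transformation: one projection per
    node type, mapping type-specific feature sizes into a shared hidden space."""
    def __init__(self, in_dims: dict, hidden: int):
        super().__init__()
        self.proj = nn.ModuleDict({t: nn.Linear(d, hidden) for t, d in in_dims.items()})

    def forward(self, feats):  # feats: {node_type: (num_nodes, in_dim)}
        return {t: self.proj[t](x) for t, x in feats.items()}

def relation_subgraphs(edges):
    """Component 2, heterogeneous graph transformation (one simple choice):
    split the graph into one subgraph per relation; metapath-induced graphs
    are another common option."""
    return dict(edges)

class HeteroMPLayer(nn.Module):
    """Component 3, heterogeneous message passing: relation-wise linear
    messages, summed at the destination nodes."""
    def __init__(self, hidden: int, relations):
        super().__init__()
        self.w = nn.ModuleDict({"-".join(r): nn.Linear(hidden, hidden) for r in relations})

    def forward(self, subgraphs, h):
        out = {t: torch.zeros_like(x) for t, x in h.items()}
        for (src_t, etype, dst_t), (src, dst) in subgraphs.items():
            msg = self.w["-".join((src_t, etype, dst_t))](h[src_t][src])
            out[dst_t].index_add_(0, dst, msg)  # sum messages per destination node
        return {t: torch.relu(x) for t, x in out.items()}

# Toy usage: an author-writes-paper graph with a single relation.
feats = {"author": torch.randn(3, 8), "paper": torch.randn(4, 16)}
edges = {("author", "writes", "paper"): (torch.tensor([0, 1, 2]), torch.tensor([0, 0, 3]))}
h = HeteroLinear({"author": 8, "paper": 16}, hidden=32)(feats)
h = HeteroMPLayer(32, relations=list(edges))(relation_subgraphs(edges), h)
print({t: tuple(x.shape) for t, x in h.items()})  # {'author': (3, 32), 'paper': (4, 32)}
```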
Related papers
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training scheme, named EnGCN, to address these challenges.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)

- PaSca: a Graph Neural Architecture Search System under the Scalable Paradigm [24.294196319217907]
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph-based tasks.
However, GNNs do not scale well with data size and message passing steps.
This paper proposes PaSca, a new paradigm and system that offers a principled approach to systematically construct and explore the design space for scalable GNNs.
arXiv Detail & Related papers (2022-03-01T17:26:50Z)

- MGNN: Graph Neural Networks Inspired by Distance Geometry Problem [28.789684784093048]
Graph Neural Networks (GNNs) have emerged as a prominent research topic in the field of machine learning.
In this paper, we propose a GNN model inspired by the congruent-insensitivity property of the classifiers in the classification phase of GNNs.
We extensively evaluate the effectiveness of our model through experiments conducted on both synthetic and real-world datasets.
arXiv Detail & Related papers (2022-01-31T04:15:42Z)

- Are we really making much progress? Revisiting, benchmarking, and refining heterogeneous graph neural networks [38.15094159495419]
We present a systematic reproduction of 12 recent heterogeneous graph neural networks (HGNNs).
We find that simple homogeneous GNNs, e.g., GCN and GAT, are largely underestimated due to improper settings.
To facilitate robust and reproducible HGNN research, we construct the Heterogeneous Graph Benchmark (HGB).
arXiv Detail & Related papers (2021-12-30T06:29:21Z)

- Designing the Topology of Graph Neural Networks: A Novel Feature Fusion Perspective [12.363386808994079]
We learn to design the topology of GNNs from a novel feature fusion perspective, dubbed F$^2$GNN.
We develop a neural architecture search method on top of the unified framework, which contains a set of selection and fusion operations.
The performance gains on eight real-world datasets demonstrate the effectiveness of F$^2$GNN.
arXiv Detail & Related papers (2021-12-29T13:06:12Z)

- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search (EGNAS) to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can search for better GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z)

- Design Space for Graph Neural Networks [81.88707703106232]
We study the architectural design space for Graph Neural Networks (GNNs), which consists of 315,000 different designs over 32 different predictive tasks.
Our key results include: (1) a comprehensive set of guidelines for designing well-performing GNNs; (2) while the best GNN designs for different tasks vary significantly, the GNN task space allows for transferring the best designs across tasks; (3) models discovered using our design space achieve state-of-the-art performance. A toy enumeration of such a design space is sketched after this list.
arXiv Detail & Related papers (2020-11-17T18:59:27Z)

- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)

- Heterogeneous Graph Transformer [49.675064816860505]
We present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs.
To handle dynamic heterogeneous graphs, we introduce the relative temporal encoding technique into HGT.
To handle Web-scale graph data, we design the heterogeneous mini-batch graph sampling algorithm, HGSampling, for efficient and scalable training.
arXiv Detail & Related papers (2020-03-03T04:49:21Z)
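Both Space4HGNN and "Design Space for Graph Neural Networks" above treat a design space as a Cartesian product of design dimensions; the toy sketch below makes that idea concrete. The dimension names and values are assumptions for illustration, not the exact grids evaluated by either paper.

```python
# Illustrative only: a tiny architectural design space as a Cartesian product
# of design dimensions; every combination is one candidate architecture.
from itertools import product

design_space = {
    "num_layers":  [1, 2, 4],              # assumed dimensions and values,
    "hidden_dim":  [32, 64],               # not the papers' actual grids
    "aggregation": ["sum", "mean", "max"],
    "activation":  ["relu", "prelu"],
    "dropout":     [0.0, 0.5],
}

configs = [dict(zip(design_space, vals)) for vals in product(*design_space.values())]
print(len(configs))  # 3 * 2 * 3 * 2 * 2 = 72 candidate designs
print(configs[0])    # {'num_layers': 1, 'hidden_dim': 32, 'aggregation': 'sum', ...}
```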