DiffMG: Differentiable Meta Graph Search for Heterogeneous Graph Neural
Networks
- URL: http://arxiv.org/abs/2010.03250v3
- Date: Mon, 27 Sep 2021 06:59:34 GMT
- Title: DiffMG: Differentiable Meta Graph Search for Heterogeneous Graph Neural
Networks
- Authors: Yuhui Ding, Quanming Yao, Huan Zhao, Tong Zhang
- Abstract summary: We search for a meta graph, which can capture more complex semantic relations than a meta path, to determine how graph neural networks propagate messages along different types of edges.
We design an expressive search space in the form of a directed acyclic graph (DAG) to represent candidate meta graphs for a HIN.
We propose a novel and efficient search algorithm to make the total search cost on a par with training a single GNN once.
- Score: 45.075163625895286
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a novel framework to automatically utilize
task-dependent semantic information which is encoded in heterogeneous
information networks (HINs). Specifically, we search for a meta graph, which
can capture more complex semantic relations than a meta path, to determine how
graph neural networks (GNNs) propagate messages along different types of edges.
We formalize the problem within the framework of neural architecture search
(NAS) and then perform the search in a differentiable manner. We design an
expressive search space in the form of a directed acyclic graph (DAG) to
represent candidate meta graphs for a HIN, and we propose a task-dependent
type constraint to filter out edge types along which message passing has no
effect on the representations of nodes relevant to the downstream task.
The size of the search space we define is huge, so we further propose a novel
and efficient search algorithm to make the total search cost on a par with
training a single GNN once. Compared with popular existing NAS algorithms, our
search algorithm improves search efficiency. We conduct extensive experiments
on different HINs and downstream tasks to evaluate our method, and the results
show that it outperforms state-of-the-art heterogeneous GNNs and is more
efficient than methods that implicitly learn meta paths.
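To make the idea in the abstract more concrete, below is a minimal, hypothetical PyTorch sketch of a DAG-shaped meta-graph search space with a differentiable (DARTS-style) relaxation over edge types. It is not the authors' implementation: the class names (CandidateEdge, MetaGraphDAG), the straight-through single-candidate sampling trick, and the shared projection layer are illustrative assumptions made for this sketch.

```python
# Hypothetical sketch of differentiable meta-graph search over a DAG.
# Assumptions: rel_adjs holds one normalized adjacency matrix per relation type
# that survives the task-dependent type constraint; each DAG edge chooses among
# those relations plus an identity (skip) candidate via learnable weights.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CandidateEdge(nn.Module):
    """One DAG edge: a softmax-weighted choice among relation-specific
    propagation operators plus an identity candidate."""

    def __init__(self, num_relations: int):
        super().__init__()
        # One architecture weight per candidate: each relation + identity.
        self.alpha = nn.Parameter(torch.zeros(num_relations + 1))

    def forward(self, x, rel_adjs, sample: bool = True):
        probs = F.softmax(self.alpha, dim=0)
        candidates = [adj @ x for adj in rel_adjs] + [x]  # identity last
        if sample:
            # Sample a single candidate per step so one search iteration costs
            # roughly one ordinary GNN forward pass.
            idx = torch.multinomial(probs, 1).item()
            # Straight-through-style scaling so alpha still receives gradient.
            return candidates[idx] * probs[idx] / probs[idx].detach()
        # Fully relaxed mixture over all candidates (more expensive).
        return sum(p * c for p, c in zip(probs, candidates))


class MetaGraphDAG(nn.Module):
    """A tiny DAG search space: node 0 holds the input features, and every
    later node aggregates candidate-edge outputs from all earlier nodes."""

    def __init__(self, num_nodes: int, num_relations: int, dim: int):
        super().__init__()
        self.edges = nn.ModuleDict({
            f"{i}->{j}": CandidateEdge(num_relations)
            for j in range(1, num_nodes)
            for i in range(j)
        })
        self.proj = nn.Linear(dim, dim)  # shared projection, a simplification
        self.num_nodes = num_nodes

    def forward(self, x, rel_adjs):
        states = [x]
        for j in range(1, self.num_nodes):
            msgs = [self.edges[f"{i}->{j}"](states[i], rel_adjs)
                    for i in range(j)]
            states.append(torch.relu(self.proj(sum(msgs))))
        return states[-1]  # representation fed to the downstream task
```

Under these assumptions, sampling one candidate per edge per step is what keeps the total search cost comparable to training a single GNN once; after the search, taking the argmax of each edge's alpha would yield a discrete meta graph to retrain from scratch.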
Related papers
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z)
- Differentiable Meta Multigraph Search with Partial Message Propagation on Heterogeneous Information Networks [18.104982772430102]
We propose a novel method called Partial Message Meta Multigraph search (PMMM) to automatically optimize the neural architecture design on Heterogeneous Information Networks (HINs).
PMMM adopts an efficient differentiable framework to search for a meaningful meta multigraph, which can capture more flexible and complex semantic relations than a meta graph.
Our approach outperforms the state-of-the-art heterogeneous GNNs, finds meaningful meta multigraphs, and is significantly more stable.
arXiv Detail & Related papers (2022-11-27T07:35:42Z)
- Subgraph Matching via Query-Conditioned Subgraph Matching Neural Networks and Bi-Level Tree Search [33.9052190473029]
Subgraph matching is a core operation in graph database search, biomedical analysis, social group finding, etc.
In this paper, we propose a novel encoder-decoder neural network architecture to dynamically compute the matching information between the query and the target graphs.
Experiments on five large real-world target graphs show that N-BLS can significantly improve the subgraph matching performance.
arXiv Detail & Related papers (2022-07-21T04:47:21Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
Its relation-aware operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show EGNAS can search better GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
- Pooling Architecture Search for Graph Classification [36.728077433219916]
Graph neural networks (GNNs) are designed to learn node-level representation based on neighborhood aggregation schemes.
Pooling methods are applied after the aggregation operation to generate coarse-grained graphs.
It is a challenging problem to design a universal pooling architecture to perform well in most cases.
We propose to use neural architecture search (NAS) to search for adaptive pooling architectures for graph classification.
arXiv Detail & Related papers (2021-08-24T09:03:03Z)
- Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a newly designed search space.
We design a Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z)
- NASGEM: Neural Architecture Search via Graph Embedding Method [41.0658375655084]
We propose NASGEM, which stands for Neural Architecture Search via Graph Embedding Method.
It is driven by a novel graph embedding method equipped with similarity measures to capture the graph topology information.
It consistently outperforms networks crafted by existing search methods in classification tasks.
arXiv Detail & Related papers (2020-07-08T21:58:37Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that models the sampling procedure and message passing of GNNs into a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.