Neural Architecture Search in Graph Neural Networks
- URL: http://arxiv.org/abs/2008.00077v1
- Date: Fri, 31 Jul 2020 21:04:24 GMT
- Title: Neural Architecture Search in Graph Neural Networks
- Authors: Matheus Nunes and Gisele L. Pappa
- Abstract summary: This paper compares two NAS methods for optimizing Graph Neural Networks (GNNs).
Results cover 7 datasets over two search spaces and show that both methods obtain accuracies similar to those of a random search.
- Score: 1.2881413375147996
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Performing analytical tasks over graph data has become increasingly
interesting due to the ubiquity and wide availability of relational
information. However, unlike images or sentences, there is no notion of
sequence in networks. Nodes (and edges) follow no absolute order, and it is
hard for traditional machine learning (ML) algorithms to recognize patterns
and generalize their predictions to this type of data. Graph Neural Networks
(GNNs) have successfully tackled this problem. They became popular after the
generalization of the convolution concept to the graph domain. However, they
possess a large number of hyperparameters, and their design and optimization
are currently done by hand, based on heuristics or empirical intuition. Neural
Architecture Search (NAS) methods appear to be a promising solution to this
problem. In this direction, this paper compares two NAS methods for optimizing
GNNs: one based on reinforcement learning and the other based on evolutionary
algorithms. Results cover 7 datasets over two search spaces and show that
both methods obtain accuracies similar to those of a random search, raising
the question of how many of the search-space dimensions are actually relevant
to the problem.
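To make the random-search baseline concrete, below is a minimal sketch of how such a search over a GNN design space can look. The search-space dimensions and the `train_and_evaluate` callback are hypothetical stand-ins for illustration, not the authors' implementation or the paper's exact search spaces.

```python
import random

# Hypothetical GNN search space, loosely modeled on the kinds of per-layer
# choices NAS-for-GNN work explores (aggregator, activation, hidden size,
# attention heads). Not the paper's exact space.
SEARCH_SPACE = {
    "aggregator": ["mean", "max", "sum", "gcn", "gat"],
    "activation": ["relu", "elu", "tanh"],
    "hidden_units": [16, 32, 64, 128, 256],
    "attention_heads": [1, 2, 4, 8],
}

def sample_architecture(rng):
    """Draw one architecture uniformly at random from the space."""
    return {dim: rng.choice(options) for dim, options in SEARCH_SPACE.items()}

def random_search(train_and_evaluate, budget=100, seed=0):
    """Random-search baseline: sample `budget` architectures, keep the best.

    `train_and_evaluate` is an assumed callback that trains the described
    GNN on the target dataset and returns its validation accuracy.
    """
    rng = random.Random(seed)
    best_arch, best_acc = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(rng)
        acc = train_and_evaluate(arch)
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return best_arch, best_acc
```

If a loop this simple matches RL- and evolution-based searchers under the same evaluation budget, most dimensions of the space are likely flat or redundant, which is exactly the question the paper raises.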
Related papers
- Towards Better Out-of-Distribution Generalization of Neural Algorithmic Reasoning Tasks [51.8723187709964]
We study the OOD generalization of neural algorithmic reasoning tasks.
The goal is to learn an algorithm from input-output pairs using deep neural networks.
arXiv Detail & Related papers (2022-11-01T18:33:20Z)
- Invertible Neural Networks for Graph Prediction [22.140275054568985]
In this work, we address conditional generation using deep invertible neural networks.
We adopt an end-to-end training approach since our objective is to address prediction and generation in the forward and backward processes at once.
arXiv Detail & Related papers (2022-06-02T17:28:33Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
The searched operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a novel search space.
We design a Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z)
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structure-related features for graph representation learning (a minimal distance-feature sketch follows this list).
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Scaling Graph Neural Networks with Approximate PageRank [64.92311737049054]
We present the PPRGo model, which utilizes an efficient approximation of information diffusion in GNNs (see the personalized-PageRank push sketch after this list).
In addition to being faster, PPRGo is inherently scalable and can be trivially parallelized for large datasets like those found in industry settings.
We show that training PPRGo and predicting labels for all nodes of a large graph takes under 2 minutes on a single machine, far outpacing other baselines on the same graph.
arXiv Detail & Related papers (2020-07-03T09:30:07Z)
- Neural Bipartite Matching [19.600193617583955]
This report describes how neural execution can be applied to a complex algorithm, namely bipartite matching.
It is achieved using only features generated by a single GNN.
The evaluation shows strongly generalising results with the network achieving optimal matching almost 100% of the time.
arXiv Detail & Related papers (2020-05-22T17:50:38Z)
- Which way? Direction-Aware Attributed Graph Embedding [2.429993132301275]
Graph embedding algorithms are used to efficiently represent a graph in a continuous vector space.
One aspect that is often overlooked is whether the graph is directed or not.
This study presents a novel text-enriched, direction-aware algorithm called DIAGRAM.
arXiv Detail & Related papers (2020-01-30T13:08:19Z)
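As a complement to the Distance Encoding entry above, here is a minimal sketch of the general idea behind distance-based node features: compute shortest-path distances to a target node set and append them to the input features so message passing becomes structure-aware. The adjacency-list format and function name are illustrative assumptions, not the paper's full DE construction.

```python
from collections import deque

def distance_features(adj, target_nodes):
    """Shortest-path distance from every node to the nearest target node.

    `adj` is an adjacency list: {node: [neighbor, ...]}. Unreachable nodes
    get distance -1. The resulting distances can be concatenated onto the
    input node features before running a GNN.
    """
    dist = {u: -1 for u in adj}
    queue = deque()
    for t in target_nodes:          # multi-source BFS from the target set
        dist[t] = 0
        queue.append(t)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] == -1:       # first visit gives the shortest distance
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist
```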
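For the PPRGo entry above, the sketch below shows the classic forward-push approximation of personalized PageRank that this family of scalable GNNs builds on. The parameters `alpha` and `eps` and the adjacency-list format are illustrative assumptions, not PPRGo's actual API.

```python
from collections import defaultdict, deque

def approx_personalized_pagerank(adj, seed, alpha=0.15, eps=1e-4):
    """Forward-push approximation of the personalized PageRank vector.

    `adj` is an adjacency list {node: [neighbor, ...]}. Residual mass in `r`
    is pushed into the estimate `p` until every node's residual drops below
    eps * degree, yielding a sparse, localized approximation.
    """
    p = defaultdict(float)          # PPR estimate
    r = defaultdict(float)          # residual mass still to be pushed
    r[seed] = 1.0
    queue = deque([seed])
    while queue:
        u = queue.popleft()
        deg = len(adj[u]) or 1
        if r[u] < eps * deg:        # stale or negligible entry, skip
            continue
        push = r[u]
        r[u] = 0.0
        p[u] += alpha * push        # keep a fraction of the mass at u
        for v in adj[u]:            # spread the rest to the neighbors
            r[v] += (1.0 - alpha) * push / deg
            if r[v] >= eps * max(len(adj[v]), 1):
                queue.append(v)
    return dict(p)
```

Because each push permanently moves an alpha fraction of residual mass into the estimate, the loop terminates, and the total work depends on eps rather than on the graph size, which is what makes this style of propagation scale to very large graphs.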