Not Half Bad: Exploring Half-Precision in Graph Convolutional Neural
Networks
- URL: http://arxiv.org/abs/2010.12635v1
- Date: Fri, 23 Oct 2020 19:47:42 GMT
- Title: Not Half Bad: Exploring Half-Precision in Graph Convolutional Neural
Networks
- Authors: John Brennan, Stephen Bonner, Amir Atapour-Abarghouei, Philip T
Jackson, Boguslaw Obara, Andrew Stephen McGough
- Abstract summary: Efficient graph analysis using modern machine learning is receiving a growing level of attention.
Deep learning approaches often operate over the entire adjacency matrix.
It is desirable to identify efficient measures to reduce both run-time and memory requirements.
- Score: 8.460826851547294
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the growing significance of graphs as an effective representation of
data in numerous applications, efficient graph analysis using modern machine
learning is receiving a growing level of attention. Deep learning approaches
often operate over the entire adjacency matrix -- as the input and intermediate
network layers are all designed in proportion to the size of the adjacency
matrix -- leading to intensive computation and large memory requirements as the
graph size increases. It is therefore desirable to identify efficient measures
to reduce both run-time and memory requirements allowing for the analysis of
the largest graphs possible. The use of reduced precision operations within the
forward and backward passes of a deep neural network along with novel
specialised hardware in modern GPUs can offer promising avenues towards
efficiency. In this paper, we provide an in-depth exploration of the use of
reduced-precision operations, easily integrable into the highly popular PyTorch
framework, and an analysis of the effects of Tensor Cores on graph
convolutional neural networks. We perform an extensive experimental evaluation
of three GPU architectures and two widely-used graph analysis tasks (vertex
classification and link prediction) using well-known benchmark and
synthetically generated datasets. This allows us to make important
observations, often neglected in the literature, on the effects of
reduced-precision operations and Tensor Cores on the computational and memory
usage of graph convolutional neural networks.
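The core idea the abstract describes can be sketched in a framework-agnostic way: cast the normalised adjacency matrix, node features, and layer weights to half precision before the propagation step of a graph convolution. A minimal NumPy sketch follows; the toy graph, feature sizes, and random weights are illustrative assumptions, not the paper's setup, which integrates reduced precision into PyTorch and exercises GPU Tensor Cores.

```python
import numpy as np

# Toy undirected graph on 4 nodes (illustrative, not from the paper).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=np.float32)

# Symmetric normalisation A_hat = D^{-1/2} (A + I) D^{-1/2},
# the standard GCN propagation matrix.
A_tilde = A + np.eye(4, dtype=np.float32)
d = A_tilde.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d)).astype(np.float32)
A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8)).astype(np.float32)  # node features
W = rng.standard_normal((8, 2)).astype(np.float32)  # layer weights

# Full-precision GCN layer propagation: A_hat X W.
out32 = A_hat @ X @ W

# Half-precision propagation: same computation with all operands in fp16.
out16 = (A_hat.astype(np.float16)
         @ X.astype(np.float16)
         @ W.astype(np.float16))

# Half precision halves per-element storage (2 bytes vs 4),
# at the cost of a small numerical error in the outputs.
print(out16.nbytes, out32.nbytes)  # 16 vs 32 bytes for this 4x2 output
print(np.max(np.abs(out32 - out16.astype(np.float32))))
```

The same trade-off is what the paper measures at scale: the memory saving grows with the adjacency matrix, while the acceptable precision loss depends on the downstream task.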
Related papers
- Transductive Spiking Graph Neural Networks for Loihi [0.8684584813982095]
We present a fully neuromorphic implementation of spiking graph neural networks designed for Loihi 2.
We showcase the performance benefits of combining neuromorphic Bayesian optimization with our approach for citation graph classification using fixed-precision spiking neurons.
arXiv Detail & Related papers (2024-04-25T21:15:15Z)
- Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN).
arXiv Detail & Related papers (2023-10-23T01:25:44Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- ItNet: iterative neural networks with small graphs for accurate and efficient anytime prediction [1.52292571922932]
In this study, we introduce a class of network models that have a small memory footprint in terms of their computational graphs.
We show state-of-the-art results for semantic segmentation on the CamVid and Cityscapes datasets.
arXiv Detail & Related papers (2021-01-21T15:56:29Z)
- Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that, through careful design of the models and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z)
- Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode a data structure into a high-dimensional, non-Euclidean feature space.
CensNet is a general graph embedding framework which embeds both nodes and edges into a latent feature space.
Our approach achieves or matches state-of-the-art performance on four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real-world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.