GraphHD: Efficient graph classification using hyperdimensional computing
- URL: http://arxiv.org/abs/2205.07826v1
- Date: Mon, 16 May 2022 17:32:58 GMT
- Title: GraphHD: Efficient graph classification using hyperdimensional computing
- Authors: Igor Nunes, Mike Heddes, Tony Givargis, Alexandru Nicolau, Alex
Veidenbaum
- Abstract summary: We present a baseline approach for graph classification with HDC.
We evaluate GraphHD on real-world graph classification problems.
Our results show that when compared to the state-of-the-art Graph Neural Networks (GNNs) the proposed model achieves comparable accuracy.
- Score: 58.720142291102135
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hyperdimensional Computing (HDC), developed by Kanerva, is a computational
model for machine learning inspired by neuroscience. HDC exploits
characteristics of biological neural systems such as high-dimensionality,
randomness and a holographic representation of information to achieve a good
balance between accuracy, efficiency and robustness. HDC models have already
been proven to be useful in different learning applications, especially in
resource-limited settings such as the increasingly popular Internet of Things
(IoT). One class of learning tasks that is missing from the current body of
work on HDC is graph classification. Graphs are among the most important forms
of information representation, yet, to this day, HDC algorithms have not been
applied to the graph learning problem in a general sense. Moreover, graph
learning in IoT and sensor networks, with limited compute capabilities,
introduces challenges to the overall design methodology. In this paper, we
present GraphHD, a baseline approach for graph classification with HDC. We
evaluate GraphHD on real-world graph classification problems. Our results show
that when compared to the state-of-the-art Graph Neural Networks (GNNs) the
proposed model achieves comparable accuracy, while training and inference times
are on average 14.6$\times$ and 2.0$\times$ faster, respectively.
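The abstract describes HDC's core ingredients (high-dimensional random vectors and a holographic, distributed representation) without detailing GraphHD's exact encoding. The following is a generic, minimal sketch of how an HDC graph encoder can work, not the paper's method: vertices get random bipolar hypervectors, each edge is encoded by binding (elementwise multiplying) its endpoints, and the edge codes are bundled (summed and thresholded) into a single graph hypervector that can be compared by cosine similarity. The dimensionality `D`, the toy graphs, and the bipolar encoding are all illustrative assumptions.

```python
import numpy as np

D = 10_000  # hypervector dimensionality (illustrative choice)
rng = np.random.default_rng(0)

def random_hv():
    # a random bipolar hypervector in {-1, +1}^D
    return rng.choice([-1, 1], size=D)

def encode_graph(edges, node_hvs):
    # bind the two endpoint hypervectors of each edge (elementwise product),
    # then bundle all edge hypervectors (sum, then sign) into one graph hypervector
    bundled = np.zeros(D)
    for u, v in edges:
        bundled += node_hvs[u] * node_hvs[v]
    return np.sign(bundled)

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# toy example: two small graphs over a shared vertex set
node_hvs = {i: random_hv() for i in range(4)}
g1 = encode_graph([(0, 1), (1, 2)], node_hvs)
g2 = encode_graph([(0, 1), (2, 3)], node_hvs)
# graphs that share an edge are measurably more similar than unrelated
# random hypervectors, which are near-orthogonal in high dimensions
```

Classification then reduces to bundling the training graphs of each class into a prototype hypervector and assigning a test graph to the most similar prototype, which is why training avoids iterative optimization and runs so much faster than GNN training.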
Related papers
- Molecular Classification Using Hyperdimensional Graph Classification [41.38562343472387]
This work introduces an innovative approach to graph learning by leveraging Hyperdimensional Computing.
An important application within this domain involves the identification of cancerous cells across diverse molecular structures.
We propose an HDC-based model that demonstrates comparable Area Under the Curve results when compared to state-of-the-art models like Graph Neural Networks (GNNs) or the Weisfeiler-Lehman graph kernel (WL).
arXiv Detail & Related papers (2024-03-18T23:16:17Z) - Hyperdimensional Computing for Node Classification and Link Prediction [0.0]
We introduce a novel method for transductive learning on graphs using hyperdimensional representations.
The proposed approach encodes data samples using random projections into a very high-dimensional space.
It obviates the need for expensive iterative training of the sort required by deep learning methods.
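The random-projection encoding summarized above can be sketched as follows, under stated assumptions: a single fixed Gaussian projection maps feature vectors into a very high-dimensional space and the result is binarized into a bipolar hypervector, with no iterative training. The feature dimensionality `F`, the Gaussian projection, and the sign binarization are illustrative choices, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000   # target hyperdimensional space (illustrative)
F = 64       # hypothetical input feature dimensionality

# one fixed random projection shared by all samples; no training step
projection = rng.normal(size=(F, D))

def encode(x):
    # project a feature vector into D dimensions, then binarize to bipolar
    return np.sign(x @ projection)

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

x1 = rng.normal(size=F)
x2 = x1 + 0.05 * rng.normal(size=F)   # a small perturbation of x1
x3 = rng.normal(size=F)               # an unrelated sample
# nearby inputs map to similar hypervectors, while unrelated inputs
# remain close to orthogonal, preserving neighborhood structure
```

Because the projection is fixed and the encoding is a single matrix multiply, the cost per sample is constant, which is the sense in which such methods avoid the expensive iterative training of deep models.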
arXiv Detail & Related papers (2024-02-26T23:15:01Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - A Survey on Graph Representation Learning Methods [7.081604594416337]
The goal of graph representation learning is to generate graph representation vectors that capture the structure and features of large graphs accurately.
Two of the most prevalent categories of graph representation learning are graph embedding methods that do not use graph neural networks (GNNs) and GNN-based methods.
arXiv Detail & Related papers (2022-04-04T21:18:48Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Hierarchical Adaptive Pooling by Capturing High-order Dependency for Graph Representation Learning [18.423192209359158]
Graph neural networks (GNNs) have been proven to be mature enough for handling graph-structured data on node-level graph representation learning tasks.
This paper proposes a hierarchical graph-level representation learning framework, which is adaptively sensitive to graph structures.
arXiv Detail & Related papers (2021-04-13T06:22:24Z) - Graph Contrastive Learning with Augmentations [109.23158429991298]
We propose a graph contrastive learning (GraphCL) framework for learning unsupervised representations of graph data.
We show that our framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-10-22T20:13:43Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z) - Differentiable Graph Module (DGM) for Graph Convolutional Networks [44.26665239213658]
Differentiable Graph Module (DGM) is a learnable function that predicts edge probabilities in the graph which are optimal for the downstream task.
We provide an extensive evaluation of applications from the domains of healthcare (disease prediction), brain imaging (age prediction), computer graphics (3D point cloud segmentation), and computer vision (zero-shot learning).
arXiv Detail & Related papers (2020-02-11T12:59:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.