HyPE-GT: where Graph Transformers meet Hyperbolic Positional Encodings
- URL: http://arxiv.org/abs/2312.06576v1
- Date: Mon, 11 Dec 2023 18:00:27 GMT
- Title: HyPE-GT: where Graph Transformers meet Hyperbolic Positional Encodings
- Authors: Kushal Bose and Swagatam Das
- Abstract summary: We introduce an efficient framework that generates a set of learnable positional encodings in hyperbolic space and injects them into the Transformer.
This approach lets us explore diverse options and select the optimal PE for a specific downstream task.
We also repurpose these positional encodings to mitigate the impact of over-smoothing in deep Graph Neural Networks (GNNs).
- Score: 19.78896931593813
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Transformers (GTs) facilitate the comprehension of graph-structured
data by calculating the self-attention of node pairs without considering node
position information. To address this limitation, we introduce an efficient
framework that injects Positional Encodings (PEs) into the Transformer,
generating a set of learnable positional encodings in hyperbolic space, a
non-Euclidean domain. This approach empowers us to explore diverse options,
produced by hyperbolic neural networks or hyperbolic graph convolutional
networks, and to select the optimal PE for a specific downstream task.
Additionally, we repurpose these positional encodings to mitigate the impact
of over-smoothing in deep Graph Neural Networks (GNNs). Comprehensive
experiments on molecular benchmark datasets and on co-author and co-purchase
networks substantiate the effectiveness of hyperbolic positional encodings in
enhancing the performance of deep GNNs.
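To make the recipe concrete, below is a minimal sketch of how learnable positional encodings might be computed in the Poincaré ball and added to a graph Transformer's node embeddings. It is an illustration under stated assumptions, not the authors' HyPE-GT implementation: the module name `HyperbolicPE`, unit curvature, a single Möbius-style linear map with a hyperbolic bias (standing in for a hyperbolic neural network or hyperbolic GCN), and random initial PEs are all assumed for the example.

```python
import torch
import torch.nn as nn

def expmap0(v, c=1.0, eps=1e-6):
    # Exponential map at the origin of the Poincare ball with curvature -c:
    # lifts a Euclidean (tangent) vector into hyperbolic space.
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(x, c=1.0, eps=1e-6):
    # Logarithmic map at the origin: returns a hyperbolic point to the
    # Euclidean tangent space so it can be added to token embeddings.
    sqrt_c = c ** 0.5
    norm = x.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.atanh((sqrt_c * norm).clamp(max=1 - eps)) * x / (sqrt_c * norm)

def mobius_add(x, y, c=1.0):
    # Mobius addition: the hyperbolic analogue of vector addition on the ball.
    xy = (x * y).sum(dim=-1, keepdim=True)
    x2 = (x * x).sum(dim=-1, keepdim=True)
    y2 = (y * y).sum(dim=-1, keepdim=True)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den.clamp_min(1e-6)

class HyperbolicPE(nn.Module):
    """Learnable PE computed in the Poincare ball: a Mobius-style linear map
    followed by a hyperbolic bias translation (a minimal stand-in for the
    hyperbolic networks the paper leverages)."""
    def __init__(self, pe_dim, d_model, c=1.0):
        super().__init__()
        self.c = c
        self.lin = nn.Linear(pe_dim, d_model, bias=False)
        self.bias = nn.Parameter(torch.zeros(d_model))

    def forward(self, pe):
        h = expmap0(self.lin(pe), self.c)                       # map W*pe onto the ball
        h = mobius_add(h, expmap0(self.bias, self.c), self.c)   # hyperbolic bias
        return logmap0(h, self.c)                               # tangent-space PE

# Usage: add the hyperbolic PE to node embeddings before self-attention.
n_nodes, pe_dim, d_model = 50, 8, 64
x = torch.randn(1, n_nodes, d_model)           # node (token) embeddings
init_pe = torch.randn(n_nodes, pe_dim)         # e.g. a spectral initial PE
x = x + HyperbolicPE(pe_dim, d_model)(init_pe)
attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
out, _ = attn(x, x, x)                         # standard GT self-attention step
```

Because the resulting PEs live in the tangent space, they could equally be added to the inputs of intermediate layers of a deep GNN, which is one plausible reading of how the paper repurposes them against over-smoothing.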
Related papers
- Graph Attention for Heterogeneous Graphs with Positional Encoding [0.0]
Graph Neural Networks (GNNs) have emerged as the de facto standard for modeling graph data.
This work benchmarks various GNN architectures to identify the most effective methods for heterogeneous graphs.
Our findings reveal that graph attention networks excel in these tasks.
arXiv Detail & Related papers (2025-04-03T18:00:02Z)
- Toward Relative Positional Encoding in Spiking Transformers [52.62008099390541]
Spiking neural networks (SNNs) are bio-inspired networks that model how neurons in the brain communicate through discrete spikes.
In this paper, we introduce an approximate method for relative positional encoding (RPE) in Spiking Transformers.
arXiv Detail & Related papers (2025-01-28T06:42:37Z)
- T-GAE: Transferable Graph Autoencoder for Network Alignment [79.89704126746204]
T-GAE is a graph autoencoder framework that leverages transferability and stability of GNNs to achieve efficient network alignment without retraining.
Our experiments demonstrate that T-GAE outperforms the state-of-the-art optimization method and the best GNN approach by up to 38.7% and 50.8%, respectively.
arXiv Detail & Related papers (2023-10-05T02:58:29Z)
- Binary Graph Convolutional Network with Capacity Exploration [58.99478502486377]
We propose a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and input node attributes.
Our Bi-GCN can reduce the memory consumption by an average of 31x for both the network parameters and input data, and accelerate the inference speed by an average of 51x (see the binarization sketch after this entry).
arXiv Detail & Related papers (2022-10-24T12:05:17Z)
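To illustrate the binarization idea in the Bi-GCN entry above: the sketch below binarizes a layer's weights to sign(W) times a per-row scale and trains through the quantization with a straight-through estimator, applied to one normalized graph-convolution step. `BinaryLinear`, the scaling scheme, and the random graph are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class BinaryLinear(nn.Module):
    # Keeps real-valued weights for optimization, but the forward pass uses
    # sign(W) scaled by the mean absolute weight of each output row; the
    # straight-through estimator routes gradients to the real weights.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_dim, in_dim) * 0.1)

    def forward(self, x):
        alpha = self.weight.abs().mean(dim=1, keepdim=True)   # per-row scale
        w_bin = torch.sign(self.weight) * alpha               # 1-bit weights + scale
        w = self.weight + (w_bin - self.weight).detach()      # straight-through trick
        return x @ w.t()

# One binarized graph-convolution step: D^-1/2 (A + I) D^-1/2, then transform.
n, d_in, d_out = 6, 16, 8
adj = (torch.rand(n, n) < 0.3).float()
adj = ((adj + adj.t()) > 0).float()
adj.fill_diagonal_(1.0)                                       # add self-loops
deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)
adj_norm = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]
x = torch.sign(torch.randn(n, d_in))                          # binarized node attributes
out = BinaryLinear(d_in, d_out)(adj_norm @ x)                 # (n, d_out)
```

In this form only the weight signs and one scale per output row need to be stored, which is the generic source of the memory savings the entry quantifies.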
- Graph Neural Network Based Node Deployment for Throughput Enhancement [20.56966053013759]
We propose a novel graph neural network (GNN) method for the network node deployment problem.
We show that an expressive GNN has the capacity to approximate both the function value and the traffic permutation, providing theoretical support for the proposed method.
arXiv Detail & Related papers (2022-08-19T08:06:28Z)
- Rewiring with Positional Encodings for Graph Neural Networks [37.394229290996364]
Several recent works use positional encodings to extend the receptive fields of graph neural network layers equipped with attention mechanisms.
We use positional encodings to expand receptive fields to $r$-hop neighborhoods.
We obtain improvements on a variety of models and datasets, and reach competitive performance using traditional GNNs or graph Transformers (see the rewiring sketch after this entry).
arXiv Detail & Related papers (2022-01-29T22:26:02Z)
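To sketch the $r$-hop rewiring idea from the entry above: the hypothetical helper `rewire_r_hop` below computes, with dense matrices for clarity, the hop distance of every node pair within $r$ hops. Pairs at distance greater than one become new edges, and the distances themselves can serve as positional encodings attached to those edges. This illustrates the concept only; it is not the paper's code.

```python
import torch

def rewire_r_hop(adj: torch.Tensor, r: int) -> torch.Tensor:
    """Return an (n, n) tensor of hop distances in 1..r (0 = the node itself
    or farther than r hops). Dense matrix powers, for clarity over speed."""
    n = adj.shape[0]
    reach = torch.eye(n, dtype=torch.bool)        # pairs already within k-1 hops
    hops = torch.zeros(n, n, dtype=torch.long)
    frontier = adj.bool()                         # pairs joined by a length-k walk
    for k in range(1, r + 1):
        newly = frontier & ~reach                 # first reached at exactly k hops
        hops[newly] = k
        reach |= frontier
        frontier = (frontier.float() @ adj).bool()
    return hops

# Path graph 0-1-2-3: with r=2, the pairs (0,2) and (1,3) gain hop-2 edges.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
print(rewire_r_hop(adj, r=2))
```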
- Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, as in Transformers.
We observe performance increases on molecular datasets, from 2.87% up to 64.14%, when considering learnable PE for both GNN classes.
arXiv Detail & Related papers (2021-10-15T05:59:15Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Graph Attention Networks with Positional Embeddings [7.552100672006174]
Graph Neural Networks (GNNs) are deep learning methods that provide state-of-the-art performance in node classification tasks.
We propose a framework, termed Graph Attentional Networks with Positional Embeddings (GAT-POS), to enhance GATs with positional embeddings.
We show that GAT-POS achieves remarkable improvements over strong GNN baselines and recent structural-embedding-enhanced GNNs on non-homophilic graphs.
arXiv Detail & Related papers (2021-05-09T22:13:46Z)
- A Deep Graph Wavelet Convolutional Neural Network for Semi-supervised Node Classification [11.959997989844043]
Graph convolutional neural networks provide good solutions for node classification and other tasks with non-Euclidean data.
We propose a new deep graph wavelet convolutional network (DeepGWC) for semi-supervised node classification tasks.
arXiv Detail & Related papers (2021-02-19T07:57:28Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology (a numerical check of the equivariance property follows this entry).
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
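The permutation-equivariance property claimed in the last entry is easy to verify numerically. For a polynomial graph filter $y = \sum_k h_k S^k x$ with shift operator $S$, relabeling the nodes with a permutation matrix $P$ commutes with filtering: $H(P S P^\top)(P x) = P\,H(S)x$. The sketch below checks this on a random graph; the notation is standard graph signal processing, and the code is an illustration, not from the paper.

```python
import torch

def graph_filter(S: torch.Tensor, x: torch.Tensor, h) -> torch.Tensor:
    """Polynomial graph filter y = sum_k h[k] * S^k x (a graph convolution)."""
    y = torch.zeros_like(x)
    xk = x.clone()
    for hk in h:
        y = y + hk * xk
        xk = S @ xk                              # next power of the shift operator
    return y

# Numerical check: relabeling the nodes and filtering commute,
# i.e. H(P S P^T)(P x) == P H(S) x.
torch.manual_seed(0)
n = 7
S = torch.rand(n, n); S = (S + S.t()) / 2        # symmetric shift operator
x = torch.randn(n, 1)                            # graph signal
h = [0.5, 0.3, 0.2]                              # filter taps
P = torch.eye(n)[torch.randperm(n)]              # random permutation matrix
lhs = graph_filter(P @ S @ P.t(), P @ x, h)
rhs = P @ graph_filter(S, x, h)
print(torch.allclose(lhs, rhs, atol=1e-5))       # True
```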