Text Enriched Sparse Hyperbolic Graph Convolutional Networks
- URL: http://arxiv.org/abs/2207.02368v2
- Date: Thu, 7 Jul 2022 04:58:49 GMT
- Title: Text Enriched Sparse Hyperbolic Graph Convolutional Networks
- Authors: Nurendra Choudhary, Nikhil Rao, Karthik Subbian, Chandan K. Reddy
- Abstract summary: Graph Neural Networks (GNNs) and their hyperbolic variants provide a promising approach to encode heterogeneous, text-rich networks in a low-dimensional latent space.
We propose Text Enriched Sparse Hyperbolic Graph Convolution Network (TESH-GCN) to capture the graph's metapath structures using semantic signals.
Our model outperforms the current state-of-the-art approaches by a large margin on the task of link prediction.
- Score: 21.83127488157701
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Heterogeneous networks, which connect informative nodes containing text with
different edge types, are routinely used to store and process information in
various real-world applications. Graph Neural Networks (GNNs) and their
hyperbolic variants provide a promising approach to encode such networks in a
low-dimensional latent space through neighborhood aggregation and hierarchical
feature extraction, respectively. However, these approaches typically ignore
metapath structures and the available semantic information. Furthermore, these
approaches are sensitive to the noise present in the training data. To tackle
these limitations, in this paper, we propose Text Enriched Sparse Hyperbolic
Graph Convolution Network (TESH-GCN) to capture the graph's metapath structures
using semantic signals and further improve prediction in large heterogeneous
graphs. In TESH-GCN, we extract semantic node information, which successively
acts as a connection signal to extract relevant nodes' local neighborhood and
graph-level metapath features from the sparse adjacency tensor in a
reformulated hyperbolic graph convolution layer. These extracted features in
conjunction with semantic features from the language model (for robustness) are
used for the final downstream task. Experiments on various heterogeneous graph
datasets show that our model outperforms the current state-of-the-art
approaches by a large margin on the task of link prediction. We also report a
reduction in both training time and model parameters compared to existing
hyperbolic approaches, achieved through the reformulated hyperbolic graph
convolution. Furthermore, we illustrate the robustness of our model by
experimenting with different levels of simulated noise in both the graph
structure and text, and also present a mechanism to explain TESH-GCN's
predictions by analyzing the extracted metapaths.
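For intuition, the sketch below shows the general shape of a hyperbolic graph convolution on the Poincaré ball, operating on a sparse adjacency matrix and fused with text embeddings from a language model. It is a minimal illustration of the idea described in the abstract, not the authors' TESH-GCN layer: the class and function names, the curvature value, and the concatenation-based fusion are assumptions made for this example.

```python
# Minimal sketch of one hyperbolic graph convolution on the Poincare ball,
# followed by concatenation with text embeddings from a language model.
# NOT the authors' exact TESH-GCN layer: names, curvature, and the fusion
# by concatenation are illustrative assumptions.
import torch
import torch.nn as nn


def expmap0(v, c=1.0, eps=1e-8):
    """Map a tangent vector at the origin onto the Poincare ball (curvature -c)."""
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(c ** 0.5 * norm) * v / (c ** 0.5 * norm)


def logmap0(x, c=1.0, eps=1e-8):
    """Map a point on the Poincare ball back to the tangent space at the origin."""
    norm = x.norm(dim=-1, keepdim=True).clamp_min(eps)
    scaled = (c ** 0.5 * norm).clamp(max=1 - 1e-5)
    return torch.atanh(scaled) * x / (c ** 0.5 * norm)


class HyperbolicGraphConv(nn.Module):
    """Linear transform and sparse neighborhood aggregation in the tangent
    space, with the result projected back onto the ball."""

    def __init__(self, in_dim, out_dim, c=1.0):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.c = c

    def forward(self, x_hyp, adj):
        h = logmap0(x_hyp, self.c)      # ball -> tangent space at the origin
        h = self.lin(h)                 # Euclidean feature transform
        h = torch.sparse.mm(adj, h)     # aggregation over the sparse adjacency
        return expmap0(h, self.c)       # tangent space -> ball


# Toy usage: 4 nodes, a sparse adjacency, and stand-in language-model embeddings.
N, d_graph, d_text = 4, 8, 16
edges = torch.tensor([[0, 1, 1, 2, 3], [1, 0, 2, 1, 3]])
adj = torch.sparse_coo_tensor(edges, torch.ones(5), (N, N)).coalesce()
x0 = expmap0(0.1 * torch.randn(N, d_graph))   # initial node features on the ball
text_emb = torch.randn(N, d_text)             # placeholder for LM output
layer = HyperbolicGraphConv(d_graph, d_graph)
node_repr = torch.cat([logmap0(layer(x0, adj)), text_emb], dim=-1)  # fused representation
```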
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Topology-guided Hypergraph Transformer Network: Unveiling Structural Insights for Improved Representation [1.1606619391009658]
We propose a Topology-guided Hypergraph Transformer Network (THTN).
In this model, we first formulate a hypergraph from a graph while retaining its structural essence to learn higher-order relations within the graph.
We present a structure-aware self-attention mechanism that discovers the important nodes and hyperedges from both semantic and structural viewpoints.
arXiv Detail & Related papers (2023-10-14T20:08:54Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Generative Graph Neural Networks for Link Prediction [13.643916060589463]
Inferring missing links or detecting spurious ones based on observed graphs, known as link prediction, is a long-standing challenge in graph data analysis.
This paper proposes a novel and radically different link prediction algorithm, called GraphLP, based on network reconstruction theory.
Unlike the discriminative neural network models used for link prediction, GraphLP is generative, which provides a new paradigm for neural-network-based link prediction.
arXiv Detail & Related papers (2022-12-31T10:07:19Z)
- Template based Graph Neural Network with Optimal Transport Distances [11.56532171513328]
Current Graph Neural Network (GNN) architectures rely on two key components: node feature embedding through message passing, and aggregation with a specialized form of pooling.
In this work, we propose a novel point of view that places distances to some learnable graph templates at the core of the graph representation.
This distance embedding is constructed using an optimal transport distance: the Fused Gromov-Wasserstein (FGW) distance (see the sketch after this list).
arXiv Detail & Related papers (2022-05-31T12:24:01Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the limitations of existing approaches.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of these convolution-based models by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks mainly treat the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN), which explore the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
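As referenced in the entry on template-based GNNs with optimal transport distances above, the sketch below illustrates the core idea of representing a graph by its Fused Gromov-Wasserstein distances to a set of template graphs, using the POT (Python Optimal Transport) library. It is a simplified illustration under stated assumptions: the templates here are fixed random graphs rather than learned end-to-end, and the alpha trade-off and template sizes are arbitrary choices for the example.

```python
# Sketch of the template-distance idea: embed a graph as its vector of FGW
# distances to a few template graphs. In the paper the templates are learned;
# here they are fixed random graphs (an assumption for illustration).
import numpy as np
import ot  # POT: Python Optimal Transport


def fgw_embedding(C, F, templates, alpha=0.5):
    """Embed one graph (structure matrix C, node features F) as its FGW
    distances to each (C_t, F_t) template pair."""
    p = ot.unif(C.shape[0])  # uniform weights over the graph's nodes
    dists = []
    for C_t, F_t in templates:
        q = ot.unif(C_t.shape[0])
        M = ot.dist(F, F_t)  # pairwise feature distances (squared Euclidean)
        d = ot.gromov.fused_gromov_wasserstein2(M, C, C_t, p, q,
                                                loss_fun='square_loss', alpha=alpha)
        dists.append(d)
    return np.array(dists)


# Toy usage: one 6-node graph embedded against 3 random 4-node templates.
rng = np.random.default_rng(0)
C = rng.random((6, 6)); C = (C + C.T) / 2   # symmetric structure matrix
F = rng.random((6, 3))                       # node features
templates = []
for _ in range(3):
    A = rng.random((4, 4))
    templates.append(((A + A.T) / 2, rng.random((4, 3))))
print(fgw_embedding(C, F, templates))        # 3-dimensional graph representation
```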