Multi-Hyperbolic Space-based Heterogeneous Graph Attention Network
- URL: http://arxiv.org/abs/2411.11283v1
- Date: Mon, 18 Nov 2024 04:55:26 GMT
- Title: Multi-Hyperbolic Space-based Heterogeneous Graph Attention Network
- Authors: Jongmin Park, Seunghoon Han, Jong-Ryul Lee, Sungsu Lim
- Abstract summary: We propose Multi-hyperbolic Space-based heterogeneous Graph Attention Network (MSGAT) to capture diverse power-law structures within heterogeneous graphs.
MSGAT outperforms state-of-the-art baselines in various graph machine learning tasks.
- Score: 5.816451272912859
- Abstract: To leverage the complex structures within heterogeneous graphs, recent studies on heterogeneous graph embedding use a hyperbolic space, characterized by a constant negative curvature and exponentially growing volume, which aligns with the structural properties of heterogeneous graphs. However, although heterogeneous graphs inherently possess diverse power-law structures, most hyperbolic heterogeneous graph embedding models use a single hyperbolic space for the entire graph, which may not effectively capture its diverse power-law structures. To address this limitation, we propose the Multi-hyperbolic Space-based heterogeneous Graph Attention Network (MSGAT), which uses multiple hyperbolic spaces to effectively capture diverse power-law structures within heterogeneous graphs. We conduct comprehensive experiments to evaluate the effectiveness of MSGAT. The experimental results demonstrate that MSGAT outperforms state-of-the-art baselines in various graph machine learning tasks, effectively capturing the complex structures of heterogeneous graphs.
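The abstract gives no implementation details, but the core idea of embedding a graph into several hyperbolic spaces with different curvatures can be illustrated with a short, self-contained sketch. The code below is an assumption-based illustration, not the authors' code: the class name `MultiHyperbolicProjection`, the learnable per-space curvatures, and the attention-based fusion are hypothetical choices meant only to show how node features might be mapped into multiple Poincaré balls (via the exponential map at the origin) and then combined.

```python
# Minimal sketch (NOT the MSGAT implementation): projecting node features into
# several Poincare balls with different learnable curvatures and fusing them
# with a simple attention score. All names here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def expmap0(v: torch.Tensor, c: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    """Exponential map at the origin of a Poincare ball with curvature -c (c > 0)."""
    sqrt_c = c.clamp_min(eps).sqrt()
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)


def logmap0(y: torch.Tensor, c: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    """Logarithmic map at the origin (inverse of expmap0)."""
    sqrt_c = c.clamp_min(eps).sqrt()
    norm = y.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.atanh((sqrt_c * norm).clamp(max=1 - eps)) * y / (sqrt_c * norm)


class MultiHyperbolicProjection(nn.Module):
    """Hypothetical module: embeds node features in K hyperbolic spaces with
    learnable curvatures and fuses the per-space representations by attention."""

    def __init__(self, in_dim: int, out_dim: int, num_spaces: int = 3):
        super().__init__()
        self.linears = nn.ModuleList(
            [nn.Linear(in_dim, out_dim) for _ in range(num_spaces)]
        )
        # One learnable curvature per hyperbolic space, kept positive via softplus.
        self.raw_c = nn.Parameter(torch.zeros(num_spaces))
        self.att = nn.Linear(out_dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        curvatures = F.softplus(self.raw_c) + 1e-4
        tangent_feats = []
        for k, lin in enumerate(self.linears):
            h = expmap0(lin(x), curvatures[k])               # into the k-th Poincare ball
            tangent_feats.append(logmap0(h, curvatures[k]))  # back to tangent space for fusion
        stacked = torch.stack(tangent_feats, dim=1)          # [N, K, out_dim]
        weights = torch.softmax(self.att(stacked), dim=1)    # attention over the K spaces
        return (weights * stacked).sum(dim=1)                # fused representation [N, out_dim]


# Toy usage: 128 nodes with 64-dimensional features, fused across 3 hyperbolic spaces.
features = torch.randn(128, 64)
layer = MultiHyperbolicProjection(in_dim=64, out_dim=32, num_spaces=3)
print(layer(features).shape)  # torch.Size([128, 32])
```

A full MSGAT layer would presumably also aggregate meta-path-based neighbors within each hyperbolic space; this sketch covers only the multi-space projection and fusion step.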
Related papers
- GraphMoRE: Mitigating Topological Heterogeneity via Mixture of Riemannian Experts [13.701637246257707]
Real-world graphs have inherently complex and diverse topological patterns, known as topological heterogeneity.
Most existing works learn graph representations in a single constant-curvature space, which is insufficient to match these complex geometric shapes, resulting in low-quality embeddings with high distortion.
arXiv Detail & Related papers (2024-12-15T06:52:40Z) - Heterogeneous Graph Contrastive Learning with Spectral Augmentation [15.231689595121553]
This paper introduces a spectral-enhanced graph contrastive learning model (SHCL), applying spectral augmentation to heterogeneous graph neural networks for the first time.
The proposed model learns an adaptive topology augmentation scheme through the heterogeneous graph itself.
Experimental results on multiple real-world datasets demonstrate substantial advantages of the proposed model.
arXiv Detail & Related papers (2024-06-30T14:20:12Z) - Advancing Graph Generation through Beta Diffusion [49.49740940068255]
Graph Beta Diffusion (GBD) is a generative model specifically designed to handle the diverse nature of graph data.
We propose a modulation technique that enhances the realism of generated graphs by stabilizing critical graph topology.
arXiv Detail & Related papers (2024-06-13T17:42:57Z) - Hyperbolic Heterogeneous Graph Attention Networks [3.0165549581582454]
Most previous heterogeneous graph embedding models represent elements in a heterogeneous graph as vector representations in a low-dimensional Euclidean space.
We propose Hyperbolic Heterogeneous Graph Attention Networks (HHGAT) that learn vector representations in hyperbolic spaces with meta-path instances.
We conducted experiments on three real-world heterogeneous graph datasets, demonstrating that HHGAT outperforms state-of-the-art heterogeneous graph embedding models in node classification and clustering tasks.
arXiv Detail & Related papers (2024-04-15T04:45:49Z) - Hetero$^2$Net: Heterophily-aware Representation Learning on
Heterogenerous Graphs [38.858702539146385]
We present Hetero$^2$Net, a heterophily-aware HGNN that incorporates both masked metapath prediction and masked label prediction tasks.
We evaluate the performance of Hetero$2$Net on five real-world heterogeneous graph benchmarks with varying levels of heterophily.
arXiv Detail & Related papers (2023-10-18T02:19:12Z) - Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z) - Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to model complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z) - Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural networks (HGNNs) are a popular technique for modeling and analyzing heterogeneous graphs.
We develop, for the first time, a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this approach, we adopt distinct, well-suited fusion mechanisms for attributes and topologies in the two views, which helps mine relevant information from attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z) - Hybrid Micro/Macro Level Convolution for Heterogeneous Graph Learning [45.14314180743549]
Heterogeneous graphs are pervasive in practical scenarios, where each graph consists of multiple types of nodes and edges.
Most of the existing graph convolution approaches were designed for homogeneous graphs, and therefore cannot handle heterogeneous graphs.
We propose HGConv, a novel Heterogeneous Graph Convolution approach, to learn comprehensive node representations on heterogeneous graphs.
arXiv Detail & Related papers (2020-12-29T12:12:37Z) - Heterogeneous Graph Transformer [49.675064816860505]
We present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs.
To handle dynamic heterogeneous graphs, we introduce the relative temporal encoding technique into HGT.
To handle Web-scale graph data, we design a heterogeneous mini-batch graph sampling algorithm, HGSampling, for efficient and scalable training.
arXiv Detail & Related papers (2020-03-03T04:49:21Z)