Hyperbolic Heterogeneous Graph Attention Networks
- URL: http://arxiv.org/abs/2404.09456v1
- Date: Mon, 15 Apr 2024 04:45:49 GMT
- Title: Hyperbolic Heterogeneous Graph Attention Networks
- Authors: Jongmin Park, Seunghoon Han, Soohwan Jeong, Sungsu Lim
- Abstract summary: Most previous heterogeneous graph embedding models represent elements in a heterogeneous graph as vector representations in a low-dimensional Euclidean space.
We propose Hyperbolic Heterogeneous Graph Attention Networks (HHGAT) that learn vector representations in hyperbolic spaces with meta-path instances.
We conducted experiments on three real-world heterogeneous graph datasets, demonstrating that HHGAT outperforms state-of-the-art heterogeneous graph embedding models in node classification and clustering tasks.
- Score: 3.0165549581582454
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Most previous heterogeneous graph embedding models represent elements in a heterogeneous graph as vector representations in a low-dimensional Euclidean space. However, because heterogeneous graphs inherently possess complex structures, such as hierarchical or power-law structures, distortions can occur when representing them in Euclidean space. To overcome this limitation, we propose Hyperbolic Heterogeneous Graph Attention Networks (HHGAT) that learn vector representations in hyperbolic spaces with meta-path instances. We conducted experiments on three real-world heterogeneous graph datasets, demonstrating that HHGAT outperforms state-of-the-art heterogeneous graph embedding models in node classification and clustering tasks.
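The key operation behind such hyperbolic representations is projecting Euclidean feature vectors onto a hyperbolic manifold and measuring geodesic distances there. Below is a minimal NumPy sketch of the Poincaré-ball exponential map at the origin and the corresponding distance under curvature -c; it illustrates the general technique only, not HHGAT's actual implementation, and the function names and toy inputs are hypothetical.

```python
import numpy as np

def exp_map_origin(v, c=1.0, eps=1e-7):
    """Project a Euclidean (tangent) vector v onto the Poincare ball of curvature -c.

    Exponential map at the origin: exp_0(v) = tanh(sqrt(c)*||v||) * v / (sqrt(c)*||v||).
    """
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def poincare_distance(x, y, c=1.0, eps=1e-7):
    """Geodesic distance between two points on the Poincare ball of curvature -c."""
    sq_diff = np.sum((x - y) ** 2, axis=-1)
    denom = np.maximum((1 - c * np.sum(x ** 2, axis=-1)) * (1 - c * np.sum(y ** 2, axis=-1)), eps)
    arg = np.maximum(1 + 2 * c * sq_diff / denom, 1.0)
    return np.arccosh(arg) / np.sqrt(c)

# Toy usage: project two Euclidean node features and compare them in hyperbolic space.
u = exp_map_origin(np.array([0.3, -0.2, 0.5]))
w = exp_map_origin(np.array([1.5, 0.7, -0.1]))
print(poincare_distance(u, w))  # distances grow rapidly as points approach the ball boundary
```

In a model like HHGAT, attention over meta-path instances would typically be derived from such hyperbolic distances or from tangent-space operations; the sketch covers only the projection and distance computation.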
Related papers
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle states that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where GNNs' performance is not satisfactory compared to that of NNs.
arXiv Detail & Related papers (2024-07-12T18:04:32Z)
- Heterogeneous Graph Contrastive Learning with Spectral Augmentation [15.231689595121553]
This paper introduces a spectral-enhanced graph contrastive learning model (SHCL), the first of its kind for heterogeneous graph neural networks.
The proposed model learns an adaptive topology augmentation scheme through the heterogeneous graph itself.
Experimental results on multiple real-world datasets demonstrate substantial advantages of the proposed model.
arXiv Detail & Related papers (2024-06-30T14:20:12Z)
- Finding the Missing-half: Graph Complementary Learning for Homophily-prone and Heterophily-prone Graphs [48.79929516665371]
Graphs with homophily-prone edges tend to connect nodes with the same class.
Heterophily-prone edges tend to build relationships between nodes with different classes.
Existing GNNs take only the original graph as input during training.
arXiv Detail & Related papers (2023-06-13T08:06:10Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it complements these with edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to capture complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural networks (HGNNs) are a popular technique for modeling and analyzing heterogeneous graphs.
We develop, for the first time, a novel and robust heterogeneous graph contrastive learning approach, HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this approach, we adopt distinct and well-suited attribute and topology fusion mechanisms in the two views, which helps mine relevant information from attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Heterogeneous manifolds for curvature-aware graph embedding [6.3351090376024155]
Graph embeddings are used in a broad range of Graph ML applications.
The quality of such embeddings crucially depends on whether the geometry of the space matches that of the graph.
arXiv Detail & Related papers (2022-02-02T18:18:35Z)
- Learning on heterogeneous graphs using high-order relations [37.64632406923687]
We propose an approach for learning on heterogeneous graphs without using meta-paths.
We decompose a heterogeneous graph into different homogeneous relation-type graphs, which are then combined to create higher-order relation-type representations.
arXiv Detail & Related papers (2021-03-29T12:02:47Z)
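As a rough illustration of the relation-type decomposition described in the entry above, the toy NumPy sketch below splits a heterogeneous graph into per-relation adjacency matrices and multiplies them to obtain a higher-order relation graph. It is a hypothetical example of the general idea, not that paper's implementation; the toy graph and variable names are assumptions.

```python
import numpy as np

# Toy heterogeneous graph with 3 authors (A) and 2 papers (P); the
# "writes" relation (A -> P) is one homogeneous relation-type graph.
A_writes = np.array([[1, 0],
                     [1, 1],
                     [0, 1]])           # shape: (num_authors, num_papers)
A_written_by = A_writes.T               # inverse relation (P -> A)

# Composing two first-order relations gives a higher-order relation:
# author -> paper -> author (co-authorship reachability).
A_coauthor = A_writes @ A_written_by    # shape: (num_authors, num_authors)
np.fill_diagonal(A_coauthor, 0)         # drop self-loops

print(A_coauthor)
# Entry (i, j) counts papers shared by authors i and j; binarizing it yields a
# homogeneous higher-order graph that a standard GNN layer can operate on.
```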
- Hybrid Micro/Macro Level Convolution for Heterogeneous Graph Learning [45.14314180743549]
Heterogeneous graphs are pervasive in practical scenarios, where each graph consists of multiple types of nodes and edges.
Most of the existing graph convolution approaches were designed for homogeneous graphs, and therefore cannot handle heterogeneous graphs.
We propose HGConv, a novel Heterogeneous Graph Convolution approach, to learn comprehensive node representations on heterogeneous graphs.
arXiv Detail & Related papers (2020-12-29T12:12:37Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)