κHGCN: Tree-likeness Modeling via Continuous and Discrete Curvature
Learning
- URL: http://arxiv.org/abs/2212.01793v3
- Date: Mon, 17 Jul 2023 12:16:57 GMT
- Title: κHGCN: Tree-likeness Modeling via Continuous and Discrete Curvature
Learning
- Authors: Menglin Yang, Min Zhou, Lujia Pan, Irwin King
- Abstract summary: This study endeavors to explore the curvature between discrete structure and continuous learning space, aiming at encoding the message conveyed by the network topology in the learning process.
A curvature-aware hyperbolic graph convolutional neural network, κHGCN, is proposed, which utilizes curvature to guide message passing and improve long-range propagation.
- Score: 39.25873010585029
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The prevalence of tree-like structures, encompassing hierarchical structures
and power law distributions, exists extensively in real-world applications,
including recommendation systems, ecosystems, financial networks, social
networks, etc. Recently, the exploitation of hyperbolic space for tree-likeness
modeling has garnered considerable attention owing to its exponential growth
volume. Compared to the flat Euclidean space, the curved hyperbolic space
provides a more amenable and embeddable room, especially for datasets
exhibiting implicit tree-like architectures. However, the intricate nature of
real-world tree-like data presents a considerable challenge, as it frequently
displays a heterogeneous composition of tree-like, flat, and circular regions.
The direct embedding of such heterogeneous structures into a homogeneous
embedding space (i.e., hyperbolic space) inevitably leads to heavy distortions.
To mitigate this shortcoming, this study explores the
curvature between the discrete structure and the continuous learning space,
aiming to encode the message conveyed by the network topology in the learning
process, thereby improving tree-likeness modeling. To this end, a
curvature-aware hyperbolic graph convolutional neural network, κHGCN, is proposed, which
utilizes the curvature to guide message passing and improve long-range
propagation. Extensive experiments on node classification and link prediction
tasks verify the superiority of the proposal as it consistently outperforms
various competitive models by a large margin.
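The core idea, curvature-guided aggregation over a hyperbolic embedding space, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the Poincaré-ball distance is the standard formula, while `curvature_weighted_aggregate`, its softmax weighting of discrete edge curvature, and the `temperature` parameter are illustrative assumptions about how curvature could guide message passing.

```python
import numpy as np

def poincare_distance(x, y, c=1.0):
    """Geodesic distance in the Poincare ball of curvature -c (standard formula)."""
    sq = c * np.sum((x - y) ** 2)
    denom = (1 - c * np.sum(x ** 2)) * (1 - c * np.sum(y ** 2))
    return (1.0 / np.sqrt(c)) * np.arccosh(1 + 2 * sq / denom)

def curvature_weighted_aggregate(h, adj, ricci, temperature=1.0):
    """Toy neighbor aggregation: each edge (i, j) is weighted by a softmax over
    its negated discrete curvature ricci[i, j], so more negatively curved
    (tree-like) edges contribute more to the aggregated message."""
    n = h.shape[0]
    out = np.zeros_like(h)
    for i in range(n):
        nbrs = np.nonzero(adj[i])[0]
        if len(nbrs) == 0:
            out[i] = h[i]  # isolated node keeps its own features
            continue
        w = np.exp(-ricci[i, nbrs] / temperature)
        w = w / w.sum()          # normalized edge weights
        out[i] = w @ h[nbrs]     # weighted mean of neighbor features
    return out
```

With uniform (zero) edge curvature this reduces to plain mean aggregation; non-uniform curvature shifts weight toward tree-like edges, which is the intuition behind curvature-guided propagation.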
Related papers
- Topological Neural Networks: Mitigating the Bottlenecks of Graph Neural
Networks via Higher-Order Interactions [1.994307489466967]
This work starts with a theoretical framework to reveal the impact of network's width, depth, and graph topology on the over-squashing phenomena in message-passing neural networks.
The work then turns to higher-order interactions and multi-relational inductive biases via Topological Neural Networks.
Inspired by Graph Attention Networks, two topological attention networks are proposed: Simplicial and Cell Attention Networks.
arXiv Detail & Related papers (2024-02-10T08:26:06Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embeddings.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning [77.1421343649344]
We propose a generalization of Transformers towards operating entirely on the product of constant curvature spaces.
We also provide a kernelized approach to non-Euclidean attention, which enables our model to run with time and memory cost linear in the number of nodes and edges.
arXiv Detail & Related papers (2023-09-08T02:44:37Z)
- Geometry Interaction Knowledge Graph Embeddings [153.69745042757066]
We propose Geometry Interaction knowledge graph Embeddings (GIE), which learns spatial structures interactively between the Euclidean, hyperbolic and hyperspherical spaces.
Our proposed GIE can capture a richer set of relational information, model key inference patterns, and enable expressive semantic matching across entities.
arXiv Detail & Related papers (2022-06-24T08:33:43Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, along with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Unit Ball Model for Hierarchical Embeddings in Complex Hyperbolic Space [28.349200177632852]
Learning the representation of data with hierarchical structures in hyperbolic space has attracted increasing attention in recent years.
We propose to learn the graph embeddings in the unit ball model of the complex hyperbolic space.
arXiv Detail & Related papers (2021-05-09T16:09:54Z)
- Hyperbolic Neural Networks++ [66.16106727715061]
We generalize the fundamental components of neural networks in a single hyperbolic geometry model, namely, the Poincaré ball model.
Experiments show the superior parameter efficiency of our methods compared to conventional hyperbolic components, as well as greater stability and better performance than their Euclidean counterparts.
arXiv Detail & Related papers (2020-06-15T08:23:20Z)
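The Poincaré ball model referenced in the last entry underpins most hyperbolic neural components. As a hedged illustration (the standard Möbius addition formula, not code taken from any of the papers above), the ball's analogue of vector addition can be sketched as:

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball of curvature -c: the hyperbolic
    analogue of vector addition used by Poincare-ball neural layers."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den
```

For points inside the unit ball the result stays inside the ball; adding the origin recovers the other operand, and adding a point to its negation returns the origin, mirroring ordinary vector addition.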
This list is automatically generated from the titles and abstracts of the papers in this site.