A Unification Framework for Euclidean and Hyperbolic Graph Neural
Networks
- URL: http://arxiv.org/abs/2206.04285v3
- Date: Tue, 6 Jun 2023 12:38:45 GMT
- Title: A Unification Framework for Euclidean and Hyperbolic Graph Neural
Networks
- Authors: Mehrdad Khatir, Nurendra Choudhary, Sutanay Choudhury, Khushbu
Agarwal, Chandan K. Reddy
- Abstract summary: Hyperbolic neural networks can effectively capture the inherent hierarchy of graph datasets.
They entangle multiple incongruent (gyro-)vector spaces within a layer, which makes them limited in terms of generalization and scalability.
We propose the Poincaré disk model as our search space, and apply all approximations on the disk.
We demonstrate that our model not only leverages the power of Euclidean networks such as interpretability and efficient execution of various model components, but also outperforms both Euclidean and hyperbolic counterparts on various benchmarks.
- Score: 8.080621697426997
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hyperbolic neural networks can effectively capture the inherent hierarchy of
graph datasets, and are consequently a powerful choice for GNNs. However, they
entangle multiple incongruent (gyro-)vector spaces within a layer, which makes
them limited in terms of generalization and scalability. In this work, we
propose the Poincaré disk model as our search space, and apply all
approximations on the disk (as if the disk is a tangent space derived from the
origin), thus getting rid of all inter-space transformations. Such an approach
enables us to propose a hyperbolic normalization layer and to further simplify
the entire hyperbolic model to a Euclidean model cascaded with our hyperbolic
normalization layer. We apply our proposed nonlinear hyperbolic normalization
to the current state-of-the-art homogeneous and multi-relational graph
networks. We demonstrate that our model not only leverages the power of
Euclidean networks such as interpretability and efficient execution of various
model components, but also outperforms both Euclidean and hyperbolic
counterparts on various benchmarks. Our code is made publicly available at
https://github.com/oom-debugger/ijcai23.
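The abstract's core idea, treating the Poincaré disk as if it were the tangent space at the origin so that a plain Euclidean network can be cascaded with a hyperbolic normalization layer, can be sketched roughly as follows. This is an illustrative reconstruction using the standard exponential map at the origin of the Poincaré ball; function names and details are assumptions, not the authors' actual implementation (see their repository for that).

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-7):
    """Exponential map at the origin of the Poincare ball of curvature -c:
    maps a Euclidean (tangent) vector v into the open unit ball."""
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def hyperbolic_normalize(x, c=1.0):
    """Illustrative 'hyperbolic normalization': treat the Euclidean features
    produced by an ordinary network as tangent vectors at the origin and
    project them onto the Poincare disk, avoiding inter-space transformations."""
    return expmap0(x, c=c)

# A Euclidean feature vector mapped into the ball has norm strictly below 1.
x = np.array([[3.0, 4.0]])
h = hyperbolic_normalize(x)
print(np.linalg.norm(h, axis=-1))  # strictly less than 1
```

The point of the sketch is only the cascade structure: everything before `hyperbolic_normalize` stays Euclidean, so interpretability and efficient execution of standard components are preserved.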
Related papers
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of
the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
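One way to read the "common hyperbolic radius" idea: in the Poincaré ball of curvature -c, a point x sits at hyperbolic distance d(0, x) = (2/sqrt(c)) * artanh(sqrt(c) * ||x||) from the origin, so embeddings can be rescaled to share a fixed radius. A minimal sketch, assuming unit curvature and illustrative function names (not that paper's actual encoder):

```python
import numpy as np

def radius_from_origin(x, c=1.0):
    """Hyperbolic distance from the origin in the Poincare ball of
    curvature -c: d(0, x) = (2 / sqrt(c)) * artanh(sqrt(c) * ||x||)."""
    norm = np.linalg.norm(x, axis=-1, keepdims=True)
    return 2.0 / np.sqrt(c) * np.arctanh(np.sqrt(c) * norm)

def rescale_to_radius(x, r, c=1.0, eps=1e-7):
    """Rescale embeddings so each row lies at hyperbolic radius r."""
    norm = np.maximum(np.linalg.norm(x, axis=-1, keepdims=True), eps)
    target_norm = np.tanh(np.sqrt(c) * r / 2.0) / np.sqrt(c)
    return x / norm * target_norm

x = np.array([[0.1, 0.2], [0.7, 0.1]])
y = rescale_to_radius(x, r=2.0)
print(radius_from_origin(y))  # both rows now at hyperbolic radius ~ 2.0
```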
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Node-Specific Space Selection via Localized Geometric Hyperbolicity in Graph Neural Networks [38.7842803074593]
Many graph neural networks have been developed to learn graph representations in either Euclidean or hyperbolic space.
In this paper, we analyze two notions of local hyperbolicity, describing the underlying local geometry.
We show that our model Joint Space Graph Neural Network (JSGNN) can leverage both Euclidean and hyperbolic spaces during learning.
arXiv Detail & Related papers (2023-03-03T06:04:42Z)
- Hyperbolic Hierarchical Knowledge Graph Embeddings for Link Prediction in Low Dimensions [11.260501547769636]
We propose a novel KGE model called Hyperbolic Hierarchical KGE (HypHKGE).
This model introduces attention-based learnable curvatures for hyperbolic space, which helps preserve rich semantic hierarchies.
Experiments demonstrate the effectiveness of the proposed HypHKGE model on the three benchmark datasets.
arXiv Detail & Related papers (2022-04-28T03:41:41Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design [8.250374560598493]
Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently.
The challenge in developing these networks lies in the nonlinearity of the embedding space namely, the Hyperbolic space.
We present a novel fully hyperbolic neural network which uses the concept of projections (embeddings) followed by an intrinsic aggregation and a nonlinearity all within the hyperbolic space.
arXiv Detail & Related papers (2021-12-03T03:20:27Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network, named ACE-HGNN, to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Fully Hyperbolic Neural Networks [63.22521652077353]
We propose a fully hyperbolic framework to build hyperbolic networks based on the Lorentz model.
We show that our method has better performance for building both shallow and deep networks.
arXiv Detail & Related papers (2021-05-31T03:36:49Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space for the first time, aiming to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Latent Variable Modelling with Hyperbolic Normalizing Flows [35.1659722563025]
We introduce a novel normalizing flow on hyperbolic spaces, improving over hyperbolic VAEs and Euclidean normalizing flows.
Our approach achieves improved performance on density estimation, as well as reconstruction of real-world graph data.
arXiv Detail & Related papers (2020-02-15T07:44:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.