HyLa: Hyperbolic Laplacian Features For Graph Learning
- URL: http://arxiv.org/abs/2202.06854v1
- Date: Mon, 14 Feb 2022 16:40:24 GMT
- Title: HyLa: Hyperbolic Laplacian Features For Graph Learning
- Authors: Tao Yu, Christopher De Sa
- Abstract summary: hyperbolic space can support embeddings of tree- and graph-structured data.
For graph learning, points in hyperbolic space have been used successfully as signals in deep neural networks.
Existing hyperbolic networks are computationally expensive and can be numerically unstable.
We propose HyLa, a completely different approach to using hyperbolic space in graph learning.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Due to its geometric properties, hyperbolic space can support high-fidelity
embeddings of tree- and graph-structured data. For graph learning, points in
hyperbolic space have been used successfully as signals in deep neural
networks: e.g. hyperbolic graph convolutional networks (GCN) can outperform
vanilla GCN. However, existing hyperbolic networks are computationally
expensive and can be numerically unstable, and cannot scale to large graphs due
to these shortcomings. In this paper, we propose HyLa, a completely different
approach to using hyperbolic space in graph learning: HyLa maps once from a
learned hyperbolic-space embedding to Euclidean space via the eigenfunctions of
the Laplacian operator in the hyperbolic space. Our method is inspired by the
random Fourier feature methodology, which uses the eigenfunctions of the
Laplacian in Euclidean space. We evaluate HyLa on downstream tasks including
node classification and text classification, where HyLa shows significant
improvements over hyperbolic GCN and other baselines.
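The random Fourier feature methodology that inspires HyLa can be illustrated in the Euclidean case: sample frequencies from the Gaussian kernel's spectral measure and use cosines of random projections as features, so that feature inner products approximate the kernel. The sketch below shows only this Euclidean construction, not HyLa's hyperbolic Laplacian eigenfunctions; the function name and parameters are illustrative.

```python
import numpy as np

def random_fourier_features(X, n_features=256, sigma=1.0, seed=0):
    """Map rows of X to random features whose inner products approximate
    the Gaussian kernel exp(-||x - y||^2 / (2 * sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density (Bochner's theorem)
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Inner products of the features approximate the kernel:
X = np.array([[0.0, 0.0], [1.0, 0.0]])
Z = random_fourier_features(X, n_features=4096)
# Z @ Z.T is close to [[1, exp(-0.5)], [exp(-0.5), 1]]
```

HyLa replaces these Euclidean plane waves with eigenfunctions of the hyperbolic Laplacian, so the featurization is applied once to a learned hyperbolic embedding and downstream layers stay Euclidean.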
Related papers
- Hyperbolic Delaunay Geometric Alignment
We propose a similarity score for comparing datasets in a hyperbolic space.
The core idea is counting the edges of the hyperbolic Delaunay graph connecting datapoints across the given sets.
We provide an empirical investigation on synthetic and real-life biological data and demonstrate that HyperDGA outperforms the hyperbolic version of classical distances between sets.
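The edge-counting idea can be sketched generically: given any graph built over the pooled datapoints and a label recording which set each point came from, count the edges that cross between the sets. This illustrates only the counting step, not the hyperbolic Delaunay construction itself; the function and variable names are hypothetical.

```python
def cross_set_edge_score(edges, set_label):
    """Fraction of graph edges joining points from different sets.

    edges: iterable of (u, v) node pairs; set_label: dict node -> set id.
    """
    edges = list(edges)
    if not edges:
        return 0.0
    cross = sum(1 for u, v in edges if set_label[u] != set_label[v])
    return cross / len(edges)

# Example: a 4-node cycle, nodes 0-1 from set A, nodes 2-3 from set B
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
labels = {0: "A", 1: "A", 2: "B", 3: "B"}
# Two of the four edges cross between the sets, giving a score of 0.5
```

Intuitively, many cross-set Delaunay edges indicate the two point sets are interleaved (similar), while few indicate they occupy separate regions of the space.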
arXiv Detail & Related papers (2024-04-12T17:14:58Z)
- FFHR: Fully and Flexible Hyperbolic Representation for Knowledge Graph Completion
Some important operations in hyperbolic space still lack good definitions, making existing methods unable to fully leverage the merits of hyperbolic space.
We develop a Fully and Flexible Hyperbolic Representation framework (FFHR) that is able to transfer recent Euclidean-based advances to hyperbolic space.
arXiv Detail & Related papers (2023-02-07T14:50:28Z)
- Hyperbolic Graph Representation Learning: A Tutorial
This tutorial aims to give an introduction to this emerging field of graph representation learning with the express purpose of being accessible to all audiences.
We first give a brief introduction to graph representation learning as well as some preliminary Riemannian and hyperbolic geometry.
We then comprehensively revisit the technical details of the current hyperbolic graph neural networks by unifying them into a general framework.
arXiv Detail & Related papers (2022-11-08T07:15:29Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still bounded and limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Enhancing Hyperbolic Graph Embeddings via Contrastive Learning
We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
arXiv Detail & Related papers (2022-01-21T06:10:05Z)
- Fully Hyperbolic Neural Networks
We propose a fully hyperbolic framework to build hyperbolic networks based on the Lorentz model.
We show that our method has better performance for building both shallow and deep networks.
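As context for the Lorentz model such networks operate in: points live on a hyperboloid where the Minkowski self-product equals -1, and geodesic distance is recovered from the Minkowski inner product. A minimal sketch under these standard definitions (helper names are illustrative, not the paper's API):

```python
import numpy as np

def lorentz_inner(x, y):
    # Minkowski inner product: -x0*y0 + <spatial parts>
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift_to_hyperboloid(v):
    # Lift a Euclidean point v onto the hyperboloid <x, x>_L = -1
    x0 = np.sqrt(1.0 + np.dot(v, v))
    return np.concatenate(([x0], v))

def lorentz_distance(x, y):
    # Geodesic distance; clip guards against rounding just below 1
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))
```

Keeping every operation consistent with this geometry, rather than mapping back and forth through Euclidean tangent spaces, is the sense in which such networks are "fully hyperbolic".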
arXiv Detail & Related papers (2021-05-31T03:36:49Z)
- Lorentzian Graph Convolutional Networks
We propose Lorentzian Graph Convolutional Network (LGCN), a novel hyperbolic graph convolutional network.
LGCN rigorously guarantees the learned node features follow the hyperbolic geometry.
Experiments on six datasets show that LGCN performs better than the state-of-the-art methods.
arXiv Detail & Related papers (2021-04-15T14:14:25Z)
- A Hyperbolic-to-Hyperbolic Graph Convolutional Network
We propose a hyperbolic-to-hyperbolic graph convolutional network (H2H-GCN) that directly works on hyperbolic manifold.
The H2H-GCN achieves substantial improvements on the link prediction, node classification, and graph classification tasks.
arXiv Detail & Related papers (2021-04-14T16:09:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.