Hyperbolic Diffusion Embedding and Distance for Hierarchical
Representation Learning
- URL: http://arxiv.org/abs/2305.18962v1
- Date: Tue, 30 May 2023 11:49:39 GMT
- Title: Hyperbolic Diffusion Embedding and Distance for Hierarchical
Representation Learning
- Authors: Ya-Wei Eileen Lin, Ronald R. Coifman, Gal Mishne, Ronen Talmon
- Abstract summary: This paper presents a new method for hierarchical data embedding and distance.
Our method relies on combining diffusion geometry, a central approach to manifold learning, and hyperbolic geometry.
We show theoretically that our embedding and distance recover the underlying hierarchical structure.
- Score: 13.976918651426205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Finding meaningful representations and distances of hierarchical data is
important in many fields. This paper presents a new method for hierarchical
data embedding and distance. Our method relies on combining diffusion geometry,
a central approach to manifold learning, and hyperbolic geometry. Specifically,
using diffusion geometry, we build multi-scale densities on the data, aimed at
revealing their hierarchical structure, and then embed them into a product of
hyperbolic spaces. We show theoretically that our embedding and distance
recover the underlying hierarchical structure. In addition, we demonstrate the
efficacy of the proposed method and its advantages compared to existing methods
on graph embedding benchmarks and hierarchical datasets.
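The sketch below illustrates the pipeline the abstract describes: a diffusion operator built from the data, dyadic powers of it serving as multi-scale densities, and a distance obtained by embedding each scale into a hyperbolic upper half-space and summing across scales (a product-of-hyperbolic-spaces metric). The kernel bandwidth, number of scales, square-root rescaling, and choice of half-space heights are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the pipeline in the abstract: multi-scale diffusion densities
# embedded into a product of hyperbolic (upper half-space) models. Bandwidth,
# number of scales, and the exact embedding are assumptions for illustration.
import numpy as np

def diffusion_operator(X, eps=1.0):
    """Row-stochastic Markov matrix from a Gaussian affinity on the data."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / eps)
    return W / W.sum(axis=1, keepdims=True)

def multiscale_densities(P, n_scales=4):
    """Dyadic powers P^(2^k): each row is a density over the data at scale k."""
    densities, Pk = [], P.copy()
    for _ in range(n_scales):
        Pk = Pk @ Pk                      # P^(2^k) -> P^(2^(k+1))
        densities.append(Pk)
    return densities

def halfspace_distance(u, hu, v, hv):
    """Hyperbolic distance in the upper half-space model between (u, hu) and (v, hv)."""
    num = np.sum((u - v) ** 2) + (hu - hv) ** 2
    return np.arccosh(1.0 + num / (2.0 * hu * hv))

def hyperbolic_diffusion_distance(densities, i, j):
    """Sum of per-scale hyperbolic distances (product-of-hyperbolic-spaces metric)."""
    d = 0.0
    for k, Pk in enumerate(densities):
        h = 2.0 ** (-(k + 1))             # scale-dependent 'height' (assumed)
        d += halfspace_distance(np.sqrt(Pk[i]), h, np.sqrt(Pk[j]), h)
    return d

X = np.random.default_rng(0).normal(size=(50, 3))
dens = multiscale_densities(diffusion_operator(X))
print(hyperbolic_diffusion_distance(dens, 0, 1))
```

Intuitively, points that share mass at coarse scales but differ at fine scales stay close in the coarse hyperbolic factors and separate in the fine ones, which is how the summed distance can reflect a hierarchy.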
Related papers
- Dissecting embedding method: learning higher-order structures from data [0.0]
Geometric deep learning methods for data learning often include a set of assumptions about the geometry of the feature space.
These assumptions, together with the data being discrete and finite, can lead to generalisations that produce misleading interpretations of the data and of model outputs.
arXiv Detail & Related papers (2024-10-14T08:19:39Z) - Disentangled Hyperbolic Representation Learning for Heterogeneous Graphs [29.065531121422204]
We propose $\text{Dis-H}^2\text{GCN}$, a Disentangled Hyperbolic Heterogeneous Graph Convolutional Network.
We evaluate the proposed $\text{Dis-H}^2\text{GCN}$ on five real-world heterogeneous graph datasets.
arXiv Detail & Related papers (2024-06-14T18:50:47Z) - Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, called soft manifolds, that addresses this issue.
Using soft manifolds for graph embedding provides continuous spaces suited to downstream data-analysis tasks on complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive
Learning [69.6810940330906]
We propose a novel contrastive learning framework for learning high-quality graph embeddings.
Specifically, we design an alignment metric that effectively captures the hierarchical data-invariant information.
We show that in hyperbolic space one must address leaf-level and height-level uniformity, which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z) - A Heat Diffusion Perspective on Geodesic Preserving Dimensionality
Reduction [66.21060114843202]
We propose a more general heat kernel based manifold embedding method that we call heat geodesic embeddings.
Results show that our method outperforms the existing state of the art in preserving ground-truth manifold distances.
We also showcase our method on single cell RNA-sequencing datasets with both continuum and cluster structure.
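For orientation, a generic heat-kernel embedding pipeline is sketched below (heat kernel exp(-tL) from a graph Laplacian, a distance derived from its rows, then classical MDS); this is a hedged illustration, not the paper's exact heat-geodesic construction.

```python
# Rough sketch of a heat-kernel-based embedding (illustrative; not the paper's
# exact method): heat kernel from a graph Laplacian, a heat-based distance,
# then classical MDS into low dimensions.
import numpy as np

def heat_kernel(W, t=1.0):
    """Heat kernel exp(-tL) from a symmetric affinity W via eigendecomposition."""
    L = np.diag(W.sum(axis=1)) - W                  # combinatorial graph Laplacian
    evals, evecs = np.linalg.eigh(L)
    return (evecs * np.exp(-t * evals)) @ evecs.T

def heat_distance(H):
    """Pairwise Euclidean distances between heat-kernel rows (a diffusion-distance proxy)."""
    sq = np.sum(H ** 2, axis=1)
    return np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2 * H @ H.T, 0.0))

def classical_mds(D, dim=2):
    """Embed a distance matrix by double-centering and taking top eigenvectors."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    evals, evecs = np.linalg.eigh(B)
    idx = np.argsort(evals)[::-1][:dim]
    return evecs[:, idx] * np.sqrt(np.maximum(evals[idx], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
W = np.exp(-np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))
Y = classical_mds(heat_distance(heat_kernel(W)), dim=2)
print(Y.shape)  # (40, 2)
```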
arXiv Detail & Related papers (2023-05-30T13:58:50Z) - Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives simultaneously, aiming to combine their strengths in modeling rich semantics and complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z) - Deep Recursive Embedding for High-Dimensional Data [9.611123249318126]
We propose to combine deep neural networks (DNN) with mathematics-guided embedding rules for high-dimensional data embedding.
We introduce a generic deep embedding network (DEN) framework, which is able to learn a parametric mapping from high-dimensional space to low-dimensional space.
arXiv Detail & Related papers (2021-10-31T23:22:33Z) - Enhancing Hierarchical Information by Using Metric Cones for Graph
Embedding [3.700709497727248]
Poincaré embedding has been proposed to capture the hierarchical structure of graphs.
Most of the existing methods have isometric mappings in the embedding space.
We propose graph embedding in a metric cone to address this problem.
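For context, the standard Poincaré-ball distance used by Poincaré embeddings is given below; this is the textbook formula, not the metric-cone construction proposed in that paper.

```python
# Generic Poincaré-ball distance (standard formula, not the metric-cone method).
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Hyperbolic distance between points u, v inside the unit ball (||u||, ||v|| < 1)."""
    uu, vv = np.dot(u, u), np.dot(v, v)
    duv = np.dot(u - v, u - v)
    return np.arccosh(1.0 + 2.0 * duv / ((1.0 - uu) * (1.0 - vv) + eps))

parent = np.array([0.05, 0.0])   # near the origin: 'root-like'
child = np.array([0.85, 0.1])    # near the boundary: 'leaf-like'
print(poincare_distance(parent, child))
```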
arXiv Detail & Related papers (2021-02-16T08:23:59Z) - Quadric hypersurface intersection for manifold learning in feature space [52.83976795260532]
The paper presents a manifold learning technique suitable for moderately high-dimensional and large datasets.
The manifold is learned from the training data as an intersection of quadric hypersurfaces.
At test time, this manifold can be used to introduce an outlier score for arbitrary new points.
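One way such an outlier score could work, as a sketch under assumptions: if the learned manifold is (approximately) the common zero set of several quadric forms, the score of a new point can be taken as the residual of those forms. The quadrics below are random placeholders, not a model fitted to data.

```python
# Illustrative outlier score for a manifold given as an intersection of quadric
# hypersurfaces q_i(x) = x^T A_i x + b_i^T x + c_i ~= 0.
# The quadric coefficients here are random placeholders, not fitted values.
import numpy as np

def quadric_residual_score(x, quadrics):
    """Root-sum-square of the quadric values at x; ~0 on the manifold, large off it."""
    vals = [x @ A @ x + b @ x + c for (A, b, c) in quadrics]
    return float(np.sqrt(np.sum(np.square(vals))))

rng = np.random.default_rng(0)
dim, n_quadrics = 5, 3
quadrics = [(rng.normal(size=(dim, dim)), rng.normal(size=dim), rng.normal())
            for _ in range(n_quadrics)]
print(quadric_residual_score(rng.normal(size=dim), quadrics))
```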
arXiv Detail & Related papers (2021-02-11T18:52:08Z) - Spatial Pyramid Based Graph Reasoning for Semantic Segmentation [67.47159595239798]
We apply graph convolution to the semantic segmentation task and propose an improved Laplacian.
The graph reasoning is directly performed in the original feature space organized as a spatial pyramid.
We achieve comparable performance with lower computational and memory overhead.
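A generic single step of graph reasoning over a flattened feature map is sketched below (data-dependent affinity, row-normalized, one propagation step); the paper's improved Laplacian and spatial-pyramid organization are not reproduced here, and the projection matrix stands in for a learned weight.

```python
# Generic sketch of graph reasoning over CNN features: build a data-dependent
# affinity, normalize it, and propagate features once. Not the paper's method.
import numpy as np

def graph_reason(X, W_proj):
    """X: (N, C) flattened feature map; W_proj: (C, C) projection (would be learned)."""
    A = X @ X.T                                   # pairwise feature affinity
    A = np.exp(A - A.max(axis=1, keepdims=True))  # row-wise softmax, stabilized
    A = A / A.sum(axis=1, keepdims=True)          # row-stochastic adjacency
    return A @ X @ W_proj                         # one step of feature propagation

rng = np.random.default_rng(0)
feats = rng.normal(size=(64, 16))                 # e.g. an 8x8 map flattened to 64 nodes
out = graph_reason(feats, rng.normal(size=(16, 16)) * 0.1)
print(out.shape)  # (64, 16)
```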
arXiv Detail & Related papers (2020-03-23T12:28:07Z)