Non-linear Embeddings in Hilbert Simplex Geometry
- URL: http://arxiv.org/abs/2203.11434v3
- Date: Wed, 16 Aug 2023 05:39:43 GMT
- Title: Non-linear Embeddings in Hilbert Simplex Geometry
- Authors: Frank Nielsen and Ke Sun
- Abstract summary: A key technique of machine learning and computer vision is to embed discrete weighted graphs into continuous spaces for further downstream processing.
In this paper, we consider Hilbert geometry for the standard simplex which is isometric to a vector space equipped with the variation polytope norm.
Our findings demonstrate that Hilbert simplex geometry is competitive to alternative geometries such as the Poincar'e hyperbolic ball or the Euclidean geometry for embedding tasks while being fast and numerically robust.
- Score: 9.4599552597135
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A key technique of machine learning and computer vision is to embed discrete
weighted graphs into continuous spaces for further downstream processing.
Embedding discrete hierarchical structures in hyperbolic geometry has proven
very successful since it was shown that any weighted tree can be embedded in
that geometry with arbitrarily low distortion. Various optimization methods for
hyperbolic embeddings based on common models of hyperbolic geometry have been
studied. In this paper, we consider Hilbert geometry for the standard simplex
which is isometric to a vector space equipped with the variation polytope norm.
We study the representation power of this Hilbert simplex geometry by embedding
distance matrices of graphs. Our findings demonstrate that Hilbert simplex
geometry is competitive with alternative geometries such as the Poincaré
hyperbolic ball or Euclidean geometry for embedding tasks while being fast
and numerically robust.
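The isometry stated in the abstract makes the Hilbert simplex distance simple to compute: it equals the variation norm (maximum minus minimum component) of the coordinate-wise log-ratio vector between two points of the open standard simplex. The following is a minimal sketch of that formula, not the authors' implementation; the `eps` guard against zero coordinates is an added assumption.

```python
import numpy as np

def hilbert_simplex_distance(p, q, eps=1e-12):
    """Hilbert metric between two points of the open standard simplex.

    d(p, q) = max_i log(p_i/q_i) - min_i log(p_i/q_i),
    i.e. the variation norm of the coordinate-wise log-ratio vector.
    """
    log_ratio = np.log(np.asarray(p) + eps) - np.log(np.asarray(q) + eps)
    return float(log_ratio.max() - log_ratio.min())

# The distance vanishes iff p == q, is symmetric, and is invariant to
# positive rescaling of either argument (a shift in log-ratio cancels
# in the max-minus-min), so it is well defined on the projective simplex.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
d = hilbert_simplex_distance(p, q)
```

Because the computation involves only logarithms and a max/min, it avoids the boundary instabilities that arise near the edge of the Poincaré ball, which is consistent with the numerical robustness claimed in the abstract.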
Related papers
- Adaptive Surface Normal Constraint for Geometric Estimation from Monocular Images [56.86175251327466]
We introduce a novel approach to learn geometries such as depth and surface normal from images while incorporating geometric context.
Our approach extracts geometric context that encodes the geometric variations present in the input image and correlates depth estimation with geometric constraints.
Our method unifies depth and surface normal estimations within a cohesive framework, which enables the generation of high-quality 3D geometry from images.
arXiv Detail & Related papers (2024-02-08T17:57:59Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive
Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z) - Neural Latent Geometry Search: Product Manifold Inference via
Gromov-Hausdorff-Informed Bayesian Optimization [21.97865037637575]
We mathematically define this novel formulation and coin it as neural latent geometry search (NLGS)
We propose a novel notion of distance between candidate latent geometries based on the Gromov-Hausdorff distance from metric geometry.
We then design a graph search space based on the notion of smoothness between latent geometries and employ the calculated distances as an additional inductive bias.
arXiv Detail & Related papers (2023-09-09T14:29:22Z) - Exploring Data Geometry for Continual Learning [64.4358878435983]
We study continual learning from a novel perspective by exploring data geometry for the non-stationary stream of data.
Our method dynamically expands the geometry of the underlying space to match growing geometric structures induced by new data.
Experiments show that our method achieves better performance than baseline methods designed in Euclidean space.
arXiv Detail & Related papers (2023-04-08T06:35:25Z) - Geometry Interaction Knowledge Graph Embeddings [153.69745042757066]
We propose Geometry Interaction knowledge graph Embeddings (GIE), which learns spatial structures interactively between the Euclidean, hyperbolic and hyperspherical spaces.
Our proposed GIE can capture a richer set of relational information, model key inference patterns, and enable expressive semantic matching across entities.
arXiv Detail & Related papers (2022-06-24T08:33:43Z) - Learning with symmetric positive definite matrices via generalized
Bures-Wasserstein geometry [40.23168342389821]
We propose a novel generalization of the Bures-Wasserstein geometry, which we call the GBW geometry.
We provide a rigorous treatment to study various differential geometric notions on the proposed novel generalized geometry.
We also present experiments that illustrate the efficacy of the proposed GBW geometry over the BW geometry.
arXiv Detail & Related papers (2021-10-20T10:03:06Z) - Hermitian Symmetric Spaces for Graph Embeddings [0.0]
We learn continuous representations of graphs in spaces of symmetric matrices over C.
These spaces offer a rich geometry that simultaneously admits hyperbolic and Euclidean subspaces.
The proposed models are able to automatically adapt to very dissimilar arrangements without any a priori estimates of graph features.
arXiv Detail & Related papers (2021-05-11T18:14:52Z) - DSG-Net: Learning Disentangled Structure and Geometry for 3D Shape
Generation [98.96086261213578]
We introduce DSG-Net, a deep neural network that learns a disentangled structured and geometric mesh representation for 3D shapes.
This supports a range of novel shape generation applications with disentangled control, such as interpolation of structure (geometry) while keeping geometry (structure) unchanged.
Our method not only supports controllable generation applications but also produces high-quality synthesized shapes, outperforming state-of-the-art methods.
arXiv Detail & Related papers (2020-08-12T17:06:51Z) - Ultrahyperbolic Representation Learning [13.828165530602224]
In machine learning, data is usually represented in a (flat) Euclidean space where distances between points are along straight lines.
We propose a representation living on a pseudo-Riemannian manifold of constant nonzero curvature.
We provide the necessary learning tools in this geometry and extend gradient-based optimization techniques.
arXiv Detail & Related papers (2020-07-01T03:49:24Z) - Topological hyperbolic lattices [0.0]
We show how the quantized curvature and edge dominance of hyperbolic geometry affect topological phases.
We report a recipe for the construction of a Euclidean photonic platform that inherits the topological band properties of a hyperbolic lattice.
Our approach is applicable to general non-Euclidean geometry and enables the exploitation of infinite lattice degrees of freedom for band theory.
arXiv Detail & Related papers (2020-03-16T03:41:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.