Ultrahyperbolic Representation Learning
- URL: http://arxiv.org/abs/2007.00211v5
- Date: Mon, 11 Jan 2021 02:49:43 GMT
- Title: Ultrahyperbolic Representation Learning
- Authors: Marc T. Law and Jos Stam
- Abstract summary: In machine learning, data is usually represented in a (flat) Euclidean space where distances between points are measured along straight lines.
We propose a representation living on a pseudo-Riemannian manifold of constant nonzero curvature.
We provide the necessary learning tools in this geometry and extend gradient-based optimization techniques.
- Score: 13.828165530602224
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In machine learning, data is usually represented in a (flat) Euclidean space
where distances between points are measured along straight lines. Researchers have
recently considered more exotic (non-Euclidean) Riemannian manifolds such as
hyperbolic space, which is well suited for tree-like data. In this paper, we
propose a representation living on a pseudo-Riemannian manifold of constant
nonzero curvature. It is a generalization of hyperbolic and spherical
geometries where the nondegenerate metric tensor need not be positive definite.
We provide the necessary learning tools in this geometry and extend
gradient-based optimization techniques. More specifically, we provide
closed-form expressions for distances via geodesics and define a descent
direction to minimize some objective function. Our novel framework is applied
to graph representations.
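The abstract's core ingredients (an indefinite scalar product, the constant-curvature constraint, closed-form geodesic distances, and tangent-space projections for gradient-based updates) can be sketched in a few lines of NumPy. This is a minimal sketch under assumed conventions: the q + 1 "time-like" coordinates carry the negative sign, the manifold is normalized with beta = -1, and all function names are illustrative. In particular, the paper's actual descent direction is more subtle than a plain tangent projection, because with an indefinite metric the projected gradient is not guaranteed to decrease the objective.

```python
import numpy as np

def pseudo_dot(x, y, q):
    """Scalar product of signature (q + 1 negative, p positive) terms:
    <x, y>_q = -sum_{i <= q+1} x_i y_i + sum_{i > q+1} x_i y_i.
    (Assumed sign convention; the paper's may differ.)"""
    t = q + 1
    return -np.dot(x[:t], y[:t]) + np.dot(x[t:], y[t:])

def geodesic_distance(x, y, q, beta=-1.0):
    """Closed-form geodesic distance on Q = {x : <x, x>_q = beta},
    via the standard case analysis for constant-curvature pseudo-spheres."""
    c = pseudo_dot(x, y, q) / beta
    r = np.sqrt(abs(beta))
    if c >= 1.0:
        return r * np.arccosh(c)   # "hyperbolic-like" pair of points
    if c >= -1.0:
        return r * np.arccos(c)    # "spherical-like" pair of points
    return np.nan                  # not joined by either geodesic type here

def exp_map(x, v, q, beta=-1.0):
    """Follow the geodesic from x with tangent velocity v (<x, v>_q = 0).
    Solves gamma'' = -(<v, v>_q / beta) * gamma on the pseudo-hyperboloid."""
    s = pseudo_dot(v, v, q) / beta
    if s > 0:
        th = np.sqrt(s)
        return np.cos(th) * x + (np.sin(th) / th) * v
    if s < 0:
        th = np.sqrt(-s)
        return np.cosh(th) * x + (np.sinh(th) / th) * v
    return x + v                   # light-like direction: straight line

def tangent_project(x, g, q, beta=-1.0):
    """Project an ambient vector g onto the tangent space at x by removing
    its component along x with respect to <., .>_q."""
    return g - (pseudo_dot(x, g, q) / beta) * x
```

A gradient step would project an ambient gradient with tangent_project, modify it as the paper prescribes to obtain a guaranteed descent direction, and map the update back to the manifold with exp_map.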
Related papers
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifold, named soft manifold, that can address this mismatch.
Using soft manifolds for graph embedding, we obtain continuous spaces suitable for downstream data-analysis tasks on complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in hyperbolic space one has to address the leaf- and height-level uniformity, which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Short and Straight: Geodesics on Differentiable Manifolds [6.85316573653194]
In this work, we first analyse existing methods for computing length-minimising geodesics.
Second, we propose a model-based parameterisation for distance fields and geodesic flows on continuous manifolds.
Third, we develop a curvature-based training mechanism, sampling and scaling points in regions of the manifold exhibiting larger values of the Ricci scalar.
arXiv Detail & Related papers (2023-05-24T15:09:41Z)
- Tight and fast generalization error bound of graph embedding in metric space [54.279425319381374]
We show that graph embedding in non-Euclidean metric space can outperform that in Euclidean space with much smaller training data than the existing bound has suggested.
Our new upper bound is significantly tighter and faster-decaying than the existing one, which can be exponential in $R$ and is $O(\frac{1}{S})$ at the fastest.
arXiv Detail & Related papers (2023-05-13T17:29:18Z)
- Exploring Data Geometry for Continual Learning [64.4358878435983]
We study continual learning from a novel perspective by exploring data geometry for the non-stationary stream of data.
Our method dynamically expands the geometry of the underlying space to match growing geometric structures induced by new data.
Experiments show that our method achieves better performance than baseline methods designed in Euclidean space.
arXiv Detail & Related papers (2023-04-08T06:35:25Z)
- Heterogeneous manifolds for curvature-aware graph embedding [6.3351090376024155]
Graph embeddings are used in a broad range of Graph ML applications.
The quality of such embeddings crucially depends on whether the geometry of the space matches that of the graph.
arXiv Detail & Related papers (2022-02-02T18:18:35Z)
- Semi-Riemannian Graph Convolutional Networks [36.09315878397234]
We develop a principled Semi-Riemannian GCN that first models data in semi-Riemannian manifolds of constant nonzero curvature.
Our method provides a geometric inductive bias that is sufficiently flexible to model mixed heterogeneous topologies like hierarchical graphs with cycles.
arXiv Detail & Related papers (2021-06-06T14:23:34Z)
- Manifold learning with arbitrary norms [8.433233101044197]
We show in a numerical simulation that manifold learning based on Earthmover's distances outperforms the standard Euclidean variant for learning molecular shape spaces.
arXiv Detail & Related papers (2020-12-28T10:24:30Z)
- Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method based on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
- Differentiating through the Fréchet Mean [51.32291896926807]
The Fréchet mean is a generalization of the Euclidean mean.
We show how to differentiate through the Fréchet mean for arbitrary Riemannian manifolds.
This fully integrates the Fréchet mean into the hyperbolic neural network pipeline (see the sketch after this list).
arXiv Detail & Related papers (2020-02-29T19:49:38Z)
- Computationally Tractable Riemannian Manifolds for Graph Embeddings [10.420394952839242]
We show how to learn and optimize graph embeddings in certain curved Riemannian spaces.
Our results serve as new evidence for the benefits of non-Euclidean embeddings in machine learning pipelines.
arXiv Detail & Related papers (2020-02-20T10:55:47Z)
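As a companion to the Fréchet-mean entry above, here is a minimal sketch of the generic Fréchet (Karcher) mean computed by Riemannian gradient descent on the hyperboloid model of hyperbolic space. It is illustrative only: the iteration, step size, and helper names are assumptions, and the cited paper's contribution is differentiating through this mean, not merely computing it.

```python
import numpy as np

def minkowski_dot(x, y):
    # Lorentzian product with one time-like coordinate (assumed convention).
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def log_map(x, y):
    # Log map on the hyperboloid {z : <z, z> = -1, z_0 > 0}:
    # the tangent vector at x pointing toward y, with length d(x, y).
    c = -minkowski_dot(x, y)                 # equals cosh(d(x, y))
    d = np.arccosh(np.clip(c, 1.0, None))
    u = y - c * x                            # tangent component of y at x
    n = np.sqrt(max(minkowski_dot(u, u), 1e-15))
    return (d / n) * u

def exp_map(x, v):
    # Exp map on the hyperboloid for a tangent vector v (<x, v> = 0).
    n = np.sqrt(max(minkowski_dot(v, v), 1e-15))
    return np.cosh(n) * x + (np.sinh(n) / n) * v

def frechet_mean(points, iters=100, lr=0.5):
    # Minimize F(mu) = mean_i d(mu, x_i)^2; the Riemannian gradient is
    # -2 * mean_i log_mu(x_i), so step along the mean of the log maps.
    mu = points[0]
    for _ in range(iters):
        step = lr * np.mean([log_map(mu, p) for p in points], axis=0)
        mu = exp_map(mu, step)
    return mu
```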
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.