Hyperbolic Manifold Regression
- URL: http://arxiv.org/abs/2005.13885v1
- Date: Thu, 28 May 2020 10:16:30 GMT
- Title: Hyperbolic Manifold Regression
- Authors: Gian Maria Marconi, Lorenzo Rosasco and Carlo Ciliberto
- Abstract summary: We consider the problem of performing manifold-valued regression onto a hyperbolic space as an intermediate component for a number of relevant machine learning applications.
We propose a novel perspective on two challenging tasks: 1) hierarchical classification via label embeddings and 2) taxonomy extension of hyperbolic representations.
Our experiments show that the strategy of leveraging the hyperbolic geometry is promising.
- Score: 33.40757136529844
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Geometric representation learning has recently shown great promise in several
machine learning settings, ranging from relational learning to language
processing and generative models. In this work, we consider the problem of
performing manifold-valued regression onto a hyperbolic space as an
intermediate component for a number of relevant machine learning applications.
In particular, by formulating the problem of predicting nodes of a tree as a
manifold regression task in the hyperbolic space, we propose a novel
perspective on two challenging tasks: 1) hierarchical classification via label
embeddings and 2) taxonomy extension of hyperbolic representations. To address
the regression problem, we consider previous methods and propose two novel
approaches that are computationally more advantageous: a parametric deep
learning model that is informed by the geodesics of the target space and a
non-parametric kernel method for which we also prove excess risk bounds. Our
experiments show that the strategy of leveraging the hyperbolic geometry is
promising. In particular, in the taxonomy expansion setting, we find that the
hyperbolic-based estimators significantly outperform methods performing
regression in the ambient Euclidean space.
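To make the geodesic-informed parametric approach concrete, here is a minimal, hedged sketch (not the authors' released code; the architecture, names, and the choice of curvature -1 are assumptions for illustration): a Euclidean network output is mapped onto the Poincaré ball via the exponential map at the origin, and training minimizes the squared geodesic distance to hyperbolic target embeddings.

```python
import torch
import torch.nn as nn

def exp_map_zero(v, eps=1e-7):
    """Exponential map at the origin of the Poincare ball (curvature -1):
    sends a Euclidean tangent vector into the open unit ball."""
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(norm) * v / norm

def poincare_distance(x, y, eps=1e-7):
    """Geodesic distance on the Poincare ball (curvature -1).
    With y = 0 this reduces to 2*artanh(||x||), the distance to the origin."""
    sq = (x - y).pow(2).sum(dim=-1)
    denom = (1 - x.pow(2).sum(dim=-1)) * (1 - y.pow(2).sum(dim=-1))
    arg = 1 + 2 * sq / denom.clamp_min(eps)
    return torch.acosh(arg.clamp_min(1 + eps))  # clamp keeps gradients finite

class HyperbolicRegressor(nn.Module):
    """Euclidean feature extractor whose output is projected onto the ball."""
    def __init__(self, in_dim, ball_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, ball_dim)
        )

    def forward(self, x):
        return exp_map_zero(self.net(x))

# One training step on toy data: minimize squared geodesic distance to targets.
model = HyperbolicRegressor(in_dim=32, ball_dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(16, 32)                                              # inputs
y = 0.9 * torch.nn.functional.normalize(torch.randn(16, 2), dim=-1)  # ball points
opt.zero_grad()
loss = poincare_distance(model(x), y).pow(2).mean()
loss.backward()
opt.step()
```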
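The non-parametric estimator can likewise be sketched in the spirit of structured-prediction kernel methods (a hedged reading of the abstract, not necessarily the paper's exact estimator): kernel ridge regression yields weights alpha(x), and the prediction is the point of hyperbolic space minimizing the alpha-weighted sum of squared geodesic distances to the training targets, i.e. a weighted Fréchet mean.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def structured_weights(X_train, x, lam=1e-2, gamma=1.0):
    """Weights alpha(x) = (K + n*lam*I)^{-1} k_x from kernel ridge regression.
    The hyperbolic prediction would then be
    argmin_y sum_i alpha_i(x) * d(y, y_i)^2, a weighted Frechet mean."""
    n = len(X_train)
    K = rbf_kernel(X_train, X_train, gamma)
    k_x = rbf_kernel(X_train, x[None, :], gamma)[:, 0]
    return np.linalg.solve(K + n * lam * np.eye(n), k_x)

# Example: weights for one query point given 20 training inputs.
X = np.random.randn(20, 5)
alpha = structured_weights(X, np.random.randn(5))
```

The argmin over the ball can be computed with the same kind of Riemannian descent shown in the Fréchet mean sketch at the end of the related papers list.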
Related papers
- From Semantics to Hierarchy: A Hybrid Euclidean-Tangent-Hyperbolic Space Model for Temporal Knowledge Graph Reasoning [1.1372536310854844]
Temporal knowledge graph (TKG) reasoning predicts future events based on historical data.
Existing Euclidean models excel at capturing semantics but struggle with hierarchy.
We propose a novel hybrid geometric space approach that leverages the strengths of both Euclidean and hyperbolic models.
arXiv Detail & Related papers (2024-08-30T10:33:08Z)
- Deep Fréchet Regression [4.915744683251151]
We propose a flexible regression model capable of handling high-dimensional predictors without imposing parametric assumptions.
The proposed model outperforms existing methods for non-Euclidean responses.
arXiv Detail & Related papers (2024-07-31T07:54:14Z)
- Optimizing Curvature Learning for Robust Hyperbolic Deep Learning in Computer Vision [3.3964154468907486]
We introduce an improved schema for popular learning algorithms and a novel normalization approach to constrain embeddings within the variable representative radius of the manifold.
Our approach demonstrates consistent performance improvements across both direct classification and hierarchical metric learning tasks while allowing for larger hyperbolic models.
arXiv Detail & Related papers (2024-05-22T20:30:14Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Hyperbolic Representation Learning: Revisiting and Advancing [43.1661098138936]
We introduce a position-tracking mechanism to scrutinize existing prevalent hyperbolic representation learning models, revealing that the learned representations are sub-optimal and unsatisfactory.
We propose a simple yet effective method, hyperbolic informed embedding (HIE), by incorporating cost-free hierarchical information deduced from the hyperbolic distance of the node to the origin.
Our method achieves a remarkable improvement of up to 21.4% compared to the competing baselines.
arXiv Detail & Related papers (2023-06-15T13:25:39Z)
- Understanding Augmentation-based Self-Supervised Representation Learning via RKHS Approximation and Regression [53.15502562048627]
Recent work has built the connection between self-supervised learning and the approximation of the top eigenspace of a graph Laplacian operator.
This work delves into a statistical analysis of augmentation-based pretraining.
arXiv Detail & Related papers (2023-06-01T15:18:55Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives the models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Hyperbolic Vision Transformers: Combining Improvements in Metric Learning [116.13290702262248]
We propose a new hyperbolic-based model for metric learning.
At the core of our method is a vision transformer with output embeddings mapped to hyperbolic space.
We evaluate the proposed model with six different formulations on four datasets.
arXiv Detail & Related papers (2022-03-21T09:48:23Z)
- Hyperbolic Neural Networks++ [66.16106727715061]
We generalize the fundamental components of neural networks in a single hyperbolic geometry model, namely, the Poincaré ball model.
Experiments show the superior parameter efficiency of our methods compared to conventional hyperbolic components, as well as stability and better performance than their Euclidean counterparts.
arXiv Detail & Related papers (2020-06-15T08:23:20Z)
- Differentiating through the Fréchet Mean [51.32291896926807]
The Fréchet mean is a generalization of the Euclidean mean.
We show how to differentiate through the Fréchet mean for arbitrary Riemannian manifolds.
This fully integrates the Fr'echet mean into the hyperbolic neural network pipeline.
arXiv Detail & Related papers (2020-02-29T19:49:38Z)
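As a rough illustration of the Fréchet mean entry above (a minimal sketch under an assumed curvature of -1, not the paper's differentiable formulation), Riemannian gradient descent on the Poincaré ball: the Euclidean gradient of the mean squared geodesic distance is rescaled by the conformal factor of the ball metric.

```python
import torch

def poincare_distance(x, y, eps=1e-7):
    """Geodesic distance on the Poincare ball (curvature -1)."""
    sq = (x - y).pow(2).sum(dim=-1)
    denom = (1 - x.pow(2).sum(dim=-1)) * (1 - y.pow(2).sum(dim=-1))
    arg = 1 + 2 * sq / denom.clamp_min(eps)
    return torch.acosh(arg.clamp_min(1 + eps))  # clamp keeps gradients finite

def frechet_mean(points, steps=200, lr=0.05):
    """Frechet mean on the Poincare ball: minimize the mean squared geodesic
    distance by Riemannian gradient descent, rescaling the Euclidean gradient
    by the inverse metric factor (1 - ||x||^2)^2 / 4."""
    x = points.mean(dim=0).clone().requires_grad_(True)  # Euclidean mean as start
    for _ in range(steps):
        loss = poincare_distance(x.unsqueeze(0), points).pow(2).mean()
        (egrad,) = torch.autograd.grad(loss, x)
        with torch.no_grad():
            lam = (1 - x.pow(2).sum()) ** 2 / 4     # conformal factor correction
            x -= lr * lam * egrad                   # Riemannian gradient step
            x *= min(1.0, 0.999 / x.norm().item())  # stay inside the open ball
    return x.detach()

# Example: mean of 10 random points inside the ball.
pts = 0.8 * torch.nn.functional.normalize(torch.randn(10, 2), dim=-1)
mu = frechet_mean(pts)
```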
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.