Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of
the Same Coin
- URL: http://arxiv.org/abs/2309.10013v1
- Date: Mon, 18 Sep 2023 14:51:46 GMT
- Title: Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of
the Same Coin
- Authors: Gabriel Moreira, Manuel Marques, João Paulo Costeira, Alexander
Hauptmann
- Abstract summary: We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
- Score: 49.12496652756007
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent research in representation learning has shown that hierarchical data
lends itself to low-dimensional and highly informative representations in
hyperbolic space. However, even though hyperbolic embeddings have gained
attention in image recognition, their optimization is prone to numerical
hurdles. Further, it remains unclear which applications stand to benefit the
most from the implicit bias imposed by hyperbolicity, when compared to
traditional Euclidean features. In this paper, we focus on prototypical
hyperbolic neural networks, and in particular on the tendency of hyperbolic
embeddings to converge to the boundary of the Poincaré ball in high
dimensions and the effect this has on few-shot classification. We show that the
best few-shot results are attained for hyperbolic embeddings at a common
hyperbolic radius. In contrast to prior benchmark results, we demonstrate that
better performance can be achieved by a fixed-radius encoder equipped with the
Euclidean metric, regardless of the embedding dimension.
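The abstract's claim can be illustrated with a short NumPy sketch: once embeddings are constrained to a common radius, the Poincaré distance becomes a strictly increasing function of the Euclidean distance, so nearest-prototype decisions under the two metrics agree. The function names and the radius 0.5 below are illustrative, not taken from the paper.

```python
import numpy as np

def poincare_distance(x, y, eps=1e-9):
    """Geodesic distance in the Poincare ball model (curvature -1)."""
    sq = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x * x)) * (1.0 - np.sum(y * y))
    return np.arccosh(1.0 + 2.0 * sq / max(denom, eps))

def fix_radius(z, r=0.5):
    """Rescale an embedding to a common Euclidean norm r < 1,
    emulating a fixed-radius encoder output inside the ball."""
    return r * z / np.linalg.norm(z)

rng = np.random.default_rng(0)
a = fix_radius(rng.normal(size=16))
b = fix_radius(rng.normal(size=16))
# At a shared radius r, d_P(a, b) = arccosh(1 + 2||a-b||^2 / (1-r^2)^2),
# a strictly increasing function of the Euclidean distance ||a-b||,
# so ranking points by either distance gives the same ordering.
print(np.linalg.norm(a - b), poincare_distance(a, b))
```

This is why, at a common radius, the choice between the two metrics stops affecting which prototype is nearest.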
Related papers
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive
Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embeddings.
Specifically, we design an alignment metric that effectively captures the hierarchical data-invariant information.
We show that in hyperbolic space one has to address leaf- and height-level uniformity, which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- HMSN: Hyperbolic Self-Supervised Learning by Clustering with Ideal
Prototypes [7.665392786787577]
We use a hyperbolic representation space for self-supervised representation learning with prototype-based clustering approaches.
We extend Masked Siamese Networks to operate on the Poincaré ball model of hyperbolic space.
Unlike previous methods we project to the hyperbolic space at the output of the encoder network and utilise a hyperbolic projection head to ensure that the representations used for downstream tasks remain hyperbolic.
arXiv Detail & Related papers (2023-05-18T12:38:40Z)
- The Numerical Stability of Hyperbolic Representation Learning [36.32817250000654]
We analyze the limitations of two popular models for the hyperbolic space, namely, the Poincaré ball and the Lorentz model.
We extend this Euclidean parametrization to hyperbolic hyperplanes and exhibit its ability to improve the performance of hyperbolic SVM.
arXiv Detail & Related papers (2022-10-31T22:51:59Z)
- A Unification Framework for Euclidean and Hyperbolic Graph Neural
Networks [8.080621697426997]
Hyperbolic neural networks can effectively capture the inherent hierarchy of graph datasets.
However, they entangle multiple incongruent (gyro-)vector spaces within a layer, which limits their generalization and scalability.
We propose the Poincaré disk model as our search space, and apply all approximations on the disk.
We demonstrate that our model not only leverages the power of Euclidean networks such as interpretability and efficient execution of various model components, but also outperforms both Euclidean and hyperbolic counterparts on various benchmarks.
arXiv Detail & Related papers (2022-06-09T05:33:02Z) - HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric
Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives the models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z) - Hyperbolic Vision Transformers: Combining Improvements in Metric
Learning [116.13290702262248]
We propose a new hyperbolic-based model for metric learning.
At the core of our method is a vision transformer with output embeddings mapped to hyperbolic space.
We evaluate the proposed model with six different formulations on four datasets.
arXiv Detail & Related papers (2022-03-21T09:48:23Z) - Enhancing Hyperbolic Graph Embeddings via Contrastive Learning [7.901082408569372]
We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
arXiv Detail & Related papers (2022-01-21T06:10:05Z) - Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z) - Differentiating through the Fr\'echet Mean [51.32291896926807]
The Fréchet mean is a generalization of the Euclidean mean.
We show how to differentiate through the Fréchet mean for arbitrary Riemannian manifolds.
This fully integrates the Fréchet mean into the hyperbolic neural network pipeline.
arXiv Detail & Related papers (2020-02-29T19:49:38Z)
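The Fréchet mean minimizes the sum of squared geodesic distances and has no closed form on the Poincaré ball; a common approximation is the Karcher flow, which repeatedly averages the points in the tangent space at the current estimate via log/exp maps. A minimal NumPy sketch at curvature -1 (function names and the sample points are illustrative, not taken from the paper):

```python
import numpy as np

def mobius_add(x, y):
    """Mobius addition on the Poincare ball (curvature -1)."""
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    return num / (1 + 2 * xy + x2 * y2)

def log_map(x, y, eps=1e-9):
    """Logarithm map at x: tangent vector pointing from x toward y."""
    u = mobius_add(-x, y)
    n = np.linalg.norm(u)
    if n < eps:
        return np.zeros_like(x)
    lam = 2 / (1 - np.dot(x, x))
    return (2 / lam) * np.arctanh(min(n, 1 - eps)) * u / n

def exp_map(x, v, eps=1e-9):
    """Exponential map at x: move from x along tangent vector v."""
    n = np.linalg.norm(v)
    if n < eps:
        return x
    lam = 2 / (1 - np.dot(x, x))
    return mobius_add(x, np.tanh(lam * n / 2) * v / n)

def frechet_mean(points, iters=50):
    """Karcher-flow approximation of the Frechet mean on the ball."""
    mu = np.mean(points, axis=0)  # Euclidean mean as initialization
    for _ in range(iters):
        # Average the log-mapped points, then step along that tangent vector.
        grad = np.mean([log_map(mu, p) for p in points], axis=0)
        mu = exp_map(mu, grad)
    return mu

pts = np.array([[0.3, 0.0], [-0.3, 0.0], [0.0, 0.3], [0.0, -0.3]])
print(frechet_mean(pts))  # symmetric cloud -> mean at the origin
```

The flow stops when the tangent-space average of the log-mapped points vanishes, which is exactly the first-order condition for the Fréchet mean; the paper's contribution is making this mean differentiable rather than approximated by such an iteration.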
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.