HMSN: Hyperbolic Self-Supervised Learning by Clustering with Ideal
Prototypes
- URL: http://arxiv.org/abs/2305.10926v1
- Date: Thu, 18 May 2023 12:38:40 GMT
- Title: HMSN: Hyperbolic Self-Supervised Learning by Clustering with Ideal
Prototypes
- Authors: Aiden Durrant and Georgios Leontidis
- Abstract summary: We explore the use of a hyperbolic representation space for self-supervised, prototype-based clustering.
We extend Masked Siamese Networks to operate on the Poincaré ball model of hyperbolic space.
Unlike previous methods, we project to the hyperbolic space at the output of the encoder network and utilise a hyperbolic projection head to ensure that the representations used for downstream tasks remain hyperbolic.
- Score: 7.665392786787577
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperbolic manifolds for visual representation learning allow for effective
learning of semantic class hierarchies by naturally embedding tree-like
structures with low distortion within a low-dimensional representation space.
The highly separable semantic class hierarchies produced by hyperbolic learning
have been shown to be powerful in low-shot tasks; however, their application in
self-supervised learning is yet to be explored fully. In this work, we explore
the use of hyperbolic representation space for self-supervised representation
learning with prototype-based clustering approaches. First, we extend Masked
Siamese Networks to operate on the Poincaré ball model of hyperbolic space;
second, we place prototypes on the ideal boundary of the Poincaré ball.
Unlike previous methods, we project to the hyperbolic space at the output of the
encoder network and utilise a hyperbolic projection head to ensure that the
representations used for downstream tasks remain hyperbolic. Empirically, we
demonstrate that these methods perform comparably to Euclidean methods in
lower dimensions on linear evaluation tasks, whilst showing improvements in
extreme few-shot learning tasks.
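As a concrete illustration of the projection step the abstract describes, here is a minimal PyTorch-style sketch (not the authors' code; the head structure, curvature parameter `c`, and all names are assumptions) of mapping encoder features onto the Poincaré ball via the exponential map at the origin:

```python
import torch

def expmap0(v, c=1.0, eps=1e-6):
    # Exponential map at the origin of the Poincare ball with curvature -c:
    #   exp_0(v) = tanh(sqrt(c) * ||v||) * v / (sqrt(c) * ||v||)
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

class HyperbolicProjectionHead(torch.nn.Module):
    # Hypothetical head: a Euclidean MLP whose output is mapped onto the
    # ball, so everything downstream stays hyperbolic.
    def __init__(self, dim_in, dim_out, c=1.0):
        super().__init__()
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(dim_in, dim_in),
            torch.nn.GELU(),
            torch.nn.Linear(dim_in, dim_out),
        )
        self.c = c

    def forward(self, x):
        return expmap0(self.mlp(x), self.c)
```

Since `expmap0` always returns a point of norm below 1/sqrt(c), downstream modules can safely treat its output as lying inside the Poincaré ball.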
Related papers
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive
Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Dynamic Hyperbolic Attention Network for Fine Hand-object Reconstruction [76.5549647815413]
We propose the first precise hand-object reconstruction method in hyperbolic space, namely the Dynamic Hyperbolic Attention Network (DHANet).
Our method learns mesh features with rich geometry-image multi-modal information and better models hand-object interaction.
arXiv Detail & Related papers (2023-09-06T13:00:10Z)
- Point Contrastive Prediction with Semantic Clustering for Self-Supervised Learning on Point Cloud Videos [71.20376514273367]
We propose a unified point cloud video self-supervised learning framework for object-centric and scene-centric data.
Our method outperforms supervised counterparts on a wide range of downstream tasks.
arXiv Detail & Related papers (2023-08-18T02:17:47Z)
- Hyperbolic Representation Learning: Revisiting and Advancing [43.1661098138936]
We introduce a position-tracking mechanism to scrutinize existing prevalent hyperbolic learning models, revealing that the learned representations are sub-optimal and unsatisfactory.
We propose a simple yet effective method, hyperbolic informed embedding (HIE), by incorporating cost-free hierarchical information deduced from the hyperbolic distance of the node to origin.
Our method achieves a remarkable improvement of up to 21.4% compared to the competing baselines.
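For reference, the node-to-origin hyperbolic distance that HIE builds on has a simple closed form; a minimal sketch, assuming the Poincaré ball of curvature -c (the paper itself may work in a different model):

```python
import torch

def dist_to_origin(x, c=1.0, eps=1e-6):
    # Poincare-ball distance from the origin (curvature -c):
    #   d(0, x) = (2 / sqrt(c)) * artanh(sqrt(c) * ||x||)
    # Points near the origin act like roots of a hierarchy; points near
    # the boundary act like leaves.
    sqrt_c = c ** 0.5
    norm = x.norm(dim=-1, keepdim=True).clamp(max=(1.0 - eps) / sqrt_c)
    return (2.0 / sqrt_c) * torch.atanh(sqrt_c * norm)
```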
arXiv Detail & Related papers (2023-06-15T13:25:39Z)
- Enhancing Hyperbolic Graph Embeddings via Contrastive Learning [7.901082408569372]
We propose a novel Hyperbolic Graph Contrastive Learning (HGCL) framework which learns node representations through multiple hyperbolic spaces.
Experimental results on multiple real-world datasets demonstrate the superiority of the proposed HGCL.
arXiv Detail & Related papers (2022-01-21T06:10:05Z)
- Hyperbolic Busemann Learning with Ideal Prototypes [14.525985704735055]
In this work, we propose Hyperbolic Busemann Learning for representation learning of arbitrary data.
To be able to compute proximities to ideal prototypes, we introduce the penalised Busemann loss.
Empirically, we show that our approach provides a natural interpretation of classification confidence, while outperforming recent hyperspherical and hyperbolic prototype approaches.
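The Busemann function to an ideal point has a closed form on the Poincaré ball, which is what makes proximities to boundary prototypes cheap to compute; a hedged sketch (the penalty weight `phi` and its exact placement are assumptions, not the paper's definition):

```python
import torch

def busemann(x, p, eps=1e-6):
    # Busemann function on the Poincare ball for an ideal point p (||p|| = 1):
    #   b_p(x) = log( ||p - x||^2 / (1 - ||x||^2) )
    num = (p - x).pow(2).sum(dim=-1).clamp_min(eps)
    den = (1.0 - x.pow(2).sum(dim=-1)).clamp_min(eps)
    return torch.log(num / den)

def penalised_busemann_loss(x, p, phi=0.75):
    # One plausible reading of the "penalised" variant: an extra term that
    # keeps embeddings from collapsing onto the boundary; phi is a tuning
    # constant here, not the paper's exact schedule.
    sq_norm = x.pow(2).sum(dim=-1).clamp(max=1.0 - 1e-6)
    return busemann(x, p) - phi * torch.log(1.0 - sq_norm)
```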
arXiv Detail & Related papers (2021-06-28T08:36:59Z)
- Fully Hyperbolic Neural Networks [63.22521652077353]
We propose a fully hyperbolic framework to build hyperbolic networks based on the Lorentz model.
We show that our method has better performance for building both shallow and deep networks.
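For context, the Lorentz (hyperboloid) model used by this framework represents points on the upper sheet of a hyperboloid rather than inside a ball; a minimal sketch of its inner product and geodesic distance:

```python
import torch

def lorentz_inner(x, y):
    # Lorentzian inner product: <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[..., 0] * y[..., 0] + (x[..., 1:] * y[..., 1:]).sum(dim=-1)

def lorentz_dist(x, y):
    # Geodesic distance on the hyperboloid {x : <x, x>_L = -1, x0 > 0}:
    #   d(x, y) = arcosh(-<x, y>_L)
    return torch.acosh((-lorentz_inner(x, y)).clamp_min(1.0))
```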
arXiv Detail & Related papers (2021-05-31T03:36:49Z)
- Unsupervised Hyperbolic Representation Learning via Message Passing Auto-Encoders [29.088604461911892]
In this paper, we analyze how unsupervised tasks can benefit from learned representations in hyperbolic space.
To explore how well the hierarchical structure of unlabeled data can be represented in hyperbolic spaces, we design a novel hyperbolic message passing auto-encoder.
The proposed model auto-encodes networks by fully utilizing hyperbolic geometry in message passing.
arXiv Detail & Related papers (2021-03-30T03:09:53Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z)