Hyperbolic Space with Hierarchical Margin Boosts Fine-Grained Learning from Coarse Labels
- URL: http://arxiv.org/abs/2311.11019v1
- Date: Sat, 18 Nov 2023 09:42:03 GMT
- Title: Hyperbolic Space with Hierarchical Margin Boosts Fine-Grained Learning from Coarse Labels
- Authors: Shu-Lin Xu and Yifan Sun and Faen Zhang and Anqi Xu and Xiu-Shen Wei and Yi Yang
- Abstract summary: We propose a method that embeds visual features into a hyperbolic space and enhances their discriminative ability with a hierarchical cosine margin scheme.
Specifically, the hyperbolic space offers distinct advantages, including the ability to capture hierarchical relationships.
Based on the hyperbolic space, we further enforce relatively large similarity margins between coarse classes and relatively small margins between fine classes, yielding the so-called hierarchical cosine margin scheme.
- Score: 43.3561344548331
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning fine-grained embeddings from coarse labels is a challenging task due
to limited label granularity supervision, i.e., lacking the detailed
distinctions required for fine-grained tasks. The task becomes even more
demanding when attempting few-shot fine-grained recognition, which holds
practical significance in various applications. To address these challenges, we
propose a novel method that embeds visual features into a hyperbolic space
and enhances their discriminative ability with a hierarchical cosine margin
scheme. Specifically, the hyperbolic space offers distinct advantages,
including the ability to capture hierarchical relationships and increased
expressive power, which favors modeling fine-grained objects. Based on the
hyperbolic space, we further enforce relatively large similarity margins
between coarse classes and relatively small margins between fine classes,
yielding the so-called hierarchical cosine margin scheme. While enforcing
similarity margins in the regular
Euclidean space has become popular for deep embedding learning, applying it to
the hyperbolic space is non-trivial and validating the benefit for
coarse-to-fine generalization is valuable. Extensive experiments conducted on
five benchmark datasets showcase the effectiveness of our proposed method,
yielding state-of-the-art results surpassing competing methods.
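As a rough illustration of the hierarchical cosine margin idea, the following minimal NumPy sketch penalizes same-coarse-class pairs whose cosine similarity drops below a tight threshold and different-coarse-class pairs whose similarity exceeds a looser one. All names and margin values are hypothetical, and the hyperbolic embedding step of the actual method is omitted here for brevity:

```python
import numpy as np

def hierarchical_margin_loss(emb, coarse_labels, margin_fine=0.1, margin_coarse=0.4):
    """Toy pairwise cosine-margin loss with a larger margin at the coarse level.

    Not the paper's actual objective: a simplified Euclidean sketch of the
    asymmetric-margin idea, using only coarse labels.
    """
    # L2-normalise rows so that dot products are cosine similarities
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = emb @ emb.T
    n = len(coarse_labels)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            if coarse_labels[i] == coarse_labels[j]:
                # same coarse class: similarity should stay above 1 - margin_fine
                total += max(0.0, (1.0 - margin_fine) - sim[i, j])
            else:
                # different coarse classes: push similarity below 1 - margin_coarse
                total += max(0.0, sim[i, j] - (1.0 - margin_coarse))
            pairs += 1
    return total / pairs
```

The asymmetry is the point: a large margin between coarse classes separates the hierarchy's upper level strongly, while the smaller within-coarse margin leaves room for fine-grained structure to emerge.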
Related papers
- Alignment and Outer Shell Isotropy for Hyperbolic Graph Contrastive
Learning [69.6810940330906]
We propose a novel contrastive learning framework to learn high-quality graph embedding.
Specifically, we design the alignment metric that effectively captures the hierarchical data-invariant information.
We show that in the hyperbolic space one has to address the leaf- and height-level uniformity which are related to properties of trees.
arXiv Detail & Related papers (2023-10-27T15:31:42Z)
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Learning Deep Optimal Embeddings with Sinkhorn Divergences [33.496926214655666]
Deep Metric Learning algorithms aim to learn an efficient embedding space to preserve the similarity relationships among the input data.
These algorithms have achieved significant performance gains across a wide range of tasks, but often fail to enforce comprehensive similarity constraints.
Here, we address the concern of learning a discriminative deep embedding space by designing a novel, yet effective Deep Class-wise Discrepancy Loss function.
arXiv Detail & Related papers (2022-09-14T07:54:16Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce Hyperbolic Regularization powered Collaborative Filtering (HRCF) and design a geometry-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal is able to tackle the over-smoothing problem caused by hyperbolic aggregation and also brings the models a better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- Rank-Consistency Deep Hashing for Scalable Multi-Label Image Search [90.30623718137244]
We propose a novel deep hashing method for scalable multi-label image search.
A new rank-consistency objective is applied to align the similarity orders from two spaces.
A powerful loss function is designed to penalize the samples whose semantic similarity and Hamming distance are mismatched.
arXiv Detail & Related papers (2021-02-02T13:46:58Z)
- Towards Cross-Granularity Few-Shot Learning: Coarse-to-Fine Pseudo-Labeling with Visual-Semantic Meta-Embedding [13.063136901934865]
Few-shot learning aims at rapidly adapting to novel categories with only a handful of samples at test time.
In this paper, we advance the few-shot classification paradigm towards a more challenging scenario, i.e., cross-granularity few-shot classification.
We approximate the fine-grained data distribution by greedily clustering each coarse class into pseudo-fine classes according to the similarity of image embeddings.
arXiv Detail & Related papers (2020-07-11T03:44:21Z)
- Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z)
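The hyperbolic methods listed above all build on the geodesic distance of the Poincaré ball. For reference, a minimal self-contained sketch of that standard formula (not taken from any one of the listed papers):

```python
import numpy as np

def poincare_distance(x, y, eps=1e-9):
    """Geodesic distance between two points strictly inside the unit ball:

        d(x, y) = arccosh(1 + 2 * ||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))

    Distances grow without bound as points approach the boundary, which is
    what lets the ball embed tree-like hierarchies with low distortion.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    sq_dist = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x * x)) * (1.0 - np.sum(y * y))
    # eps guards against division by zero for points numerically on the boundary
    return np.arccosh(1.0 + 2.0 * sq_dist / max(denom, eps))
```

For a point at radius r from the origin this reduces to the closed form 2·artanh(r), which makes the boundary blow-up easy to see.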
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.