Hyperbolic Hierarchical Contrastive Hashing
- URL: http://arxiv.org/abs/2212.08904v1
- Date: Sat, 17 Dec 2022 15:34:55 GMT
- Title: Hyperbolic Hierarchical Contrastive Hashing
- Authors: Rukai Wei, Yu Liu, Jingkuan Song, Yanzhao Xie and Ke Zhou
- Abstract summary: We propose a novel unsupervised hashing method named Hyperbolic Hierarchical Contrastive Hashing (HHCH).
We embed continuous hash codes into hyperbolic space for accurate semantic expression.
In addition, we extend the K-Means algorithm to hyperbolic space and perform the proposed hierarchical hyperbolic K-Means algorithm to construct hierarchical semantic structures adaptively.
- Score: 41.06974763117755
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hierarchical semantic structures, naturally existing in real-world datasets,
can assist in capturing the latent distribution of data to learn robust hash
codes for retrieval systems. Although hierarchical semantic structures can be
simply expressed by integrating semantically relevant data into a high-level
taxon with coarser-grained semantics, the construction, embedding, and
exploitation of the structures remain tricky for unsupervised hash learning. To
tackle these problems, we propose a novel unsupervised hashing method named
Hyperbolic Hierarchical Contrastive Hashing (HHCH). We propose to embed
continuous hash codes into hyperbolic space for accurate semantic expression
since embedding hierarchies in hyperbolic space generates less distortion than
in hyper-sphere space and Euclidean space. In addition, we extend the K-Means
algorithm to hyperbolic space and perform the proposed hierarchical hyperbolic
K-Means algorithm to construct hierarchical semantic structures adaptively. To
exploit the hierarchical semantic structures in hyperbolic space, we design
the hierarchical contrastive learning algorithm, including hierarchical
instance-wise and hierarchical prototype-wise contrastive learning. Extensive
experiments on four benchmark datasets demonstrate that the proposed method
outperforms the state-of-the-art unsupervised hashing methods. Codes will be
released.
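The hierarchical hyperbolic K-Means described above alternates cluster assignment and centroid aggregation directly in hyperbolic space. Below is a minimal one-level sketch, assuming the Poincaré ball model with assignment by geodesic distance and centroid updates via the Einstein midpoint (computed in the Klein model); the paper's exact aggregation rule may differ, and all function names here are illustrative:

```python
import numpy as np

def poincare_distance(x, y, eps=1e-9):
    """Geodesic distance between two points of the Poincare ball."""
    sq = np.sum((x - y) ** 2)
    denom = (1 - np.sum(x ** 2)) * (1 - np.sum(y ** 2))
    return np.arccosh(1 + 2 * sq / (denom + eps))

def einstein_midpoint(points):
    """Aggregate ball points via the Einstein midpoint in the Klein model."""
    # Poincare ball -> Klein model
    k = np.array([2 * p / (1 + np.sum(p ** 2)) for p in points])
    gamma = 1.0 / np.sqrt(1 - np.sum(k ** 2, axis=1))  # Lorentz factors
    m = (gamma[:, None] * k).sum(axis=0) / gamma.sum()
    # Klein model -> Poincare ball
    return m / (1 + np.sqrt(1 - np.sum(m ** 2)))

def hyperbolic_kmeans(points, n_clusters, n_iter=20, seed=0):
    """One level of hyperbolic K-Means: assign each point to the nearest
    centroid by Poincare distance, then recompute centroids as Einstein
    midpoints of their members."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), n_clusters, replace=False)]
    for _ in range(n_iter):
        labels = np.array([
            np.argmin([poincare_distance(p, c) for c in centroids])
            for p in points
        ])
        for j in range(n_clusters):
            members = points[labels == j]
            if len(members):
                centroids[j] = einstein_midpoint(members)
    return labels, centroids
```

A hierarchical variant would re-run this on the centroids themselves to form coarser-grained taxa, yielding the adaptive multi-level structure the abstract refers to.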
Related papers
- Exploiting Data Hierarchy as a New Modality for Contrastive Learning [0.0]
This work investigates how hierarchically structured data can help neural networks learn conceptual representations of cathedrals.
The underlying WikiScenes dataset provides a spatially organized hierarchical structure of cathedral components.
We propose a novel hierarchical contrastive training approach that leverages a triplet margin loss to represent the data's spatial hierarchy in the encoder's latent space.
arXiv Detail & Related papers (2024-01-06T21:47:49Z) - HIER: Metric Learning Beyond Class Labels via Hierarchical Regularization [17.575016642108253]
We propose a new regularization method, dubbed HIER, to discover the latent semantic hierarchy of training data.
HIER achieves this goal with no annotation for the semantic hierarchy but by learning hierarchical proxies in hyperbolic spaces.
arXiv Detail & Related papers (2022-12-29T11:05:47Z) - HyperMiner: Topic Taxonomy Mining with Hyperbolic Embedding [54.52651110749165]
We present a novel framework that introduces hyperbolic embeddings to represent words and topics.
With the tree-likeness property of hyperbolic space, the underlying semantic hierarchy can be better exploited to mine more interpretable topics.
arXiv Detail & Related papers (2022-10-16T02:54:17Z) - Weakly-supervised Action Localization via Hierarchical Mining [76.00021423700497]
Weakly-supervised action localization aims to temporally localize and classify action instances in given videos using only video-level categorical labels.
We propose a hierarchical mining strategy at both the video level and the snippet level, i.e., hierarchical supervision and hierarchical consistency mining.
We show that HiM-Net outperforms existing methods on the THUMOS14 and ActivityNet1.3 datasets by large margins through hierarchically mining supervision and consistency.
arXiv Detail & Related papers (2022-06-22T12:19:09Z) - Self-supervised asymmetric deep hashing with margin-scalable constraint for image retrieval [3.611160663701664]
We propose SADH, a novel self-supervised asymmetric deep hashing method with a margin-scalable constraint, for image retrieval.
SADH implements a self-supervised network to preserve semantic information in a semantic feature map and a semantic code map for the semantics of the given dataset.
For the feature learning part, a new margin-scalable constraint is employed for both highly accurate construction of pairwise correlations in the Hamming space and a more discriminative hash code representation.
arXiv Detail & Related papers (2020-12-07T16:09:37Z) - Unsupervised Embedding of Hierarchical Structure in Euclidean Space [30.507049058838025]
We consider learning a non-linear embedding of data into Euclidean space as a way to improve the hierarchical clustering produced by agglomerative algorithms.
We show that rescaling the latent space embedding leads to improved results for both dendrogram purity and the Moseley-Wang cost function.
arXiv Detail & Related papers (2020-10-30T03:57:09Z) - CIMON: Towards High-quality Hash Codes [63.37321228830102]
We propose a new method named Comprehensive Similarity Mining and Consistency Learning (CIMON).
First, we use global refinement and similarity statistical distribution to obtain reliable and smooth guidance. Second, both semantic and contrastive consistency learning are introduced to derive both disturb-invariant and discriminative hash codes.
arXiv Detail & Related papers (2020-10-15T14:47:14Z) - Deep Hashing with Hash-Consistent Large Margin Proxy Embeddings [65.36757931982469]
Image hash codes are produced by binarizing embeddings of convolutional neural networks (CNN) trained for either classification or retrieval.
The use of a fixed set of proxies (weights of the CNN classification layer) is proposed to eliminate this ambiguity.
The resulting hash-consistent large margin (HCLM) proxies are shown to encourage saturation of hashing units, thus guaranteeing a small binarization error.
arXiv Detail & Related papers (2020-07-27T23:47:43Z) - Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z) - Data Structures & Algorithms for Exact Inference in Hierarchical Clustering [41.24805506595378]
We present dynamic-programming algorithms for exact inference in hierarchical clustering, based on a novel trellis data structure.
Our algorithms scale in time and space proportionally to the size of the powerset of $N$ elements, which is super-exponentially more efficient than explicitly considering each of the $(2N-3)!!$ possible hierarchies.
arXiv Detail & Related papers (2020-02-26T17:43:53Z)
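As a quick sanity check (not taken from the paper above), the $(2N-3)!!$ count of rooted binary hierarchies over $N$ labeled leaves can be computed directly and compared with the $2^N$ powerset size; the function names are illustrative:

```python
def double_factorial(n):
    """n!! = n * (n-2) * (n-4) * ...; defined as 1 for n <= 0."""
    result = 1
    while n > 1:
        result *= n
        n -= 2
    return result

def num_hierarchies(n_leaves):
    """Number of rooted binary trees over n labeled leaves: (2n-3)!!"""
    return double_factorial(2 * n_leaves - 3)

# The hierarchy count quickly dwarfs the powerset size 2**n,
# which is the sense in which a powerset-sized trellis is
# "super-exponentially more efficient" than full enumeration.
for n in (4, 8, 16):
    print(n, 2 ** n, num_hierarchies(n))
```

For example, 4 leaves already admit $5!! = 15$ distinct hierarchies against a powerset of only 16 subsets, and the gap widens super-exponentially from there.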
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.