Binary Representation via Jointly Personalized Sparse Hashing
- URL: http://arxiv.org/abs/2208.14883v1
- Date: Wed, 31 Aug 2022 14:18:37 GMT
- Title: Binary Representation via Jointly Personalized Sparse Hashing
- Authors: Xiaoqin Wang, Chen Chen, Rushi Lan, Licheng Liu, Zhenbing Liu, Huiyu
Zhou and Xiaonan Luo
- Abstract summary: We propose an effective unsupervised method, namely Jointly Personalized Sparse Hashing (JPSH) for binary representation learning.
Different personalized subspaces are constructed to reflect category-specific attributes for different clusters.
To simultaneously preserve semantic and pairwise similarities in our JPSH, we incorporate PSH and manifold-based hash learning into a seamless formulation.
- Score: 22.296464665032588
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised hashing has attracted much attention for binary representation
learning due to the requirement of economical storage and efficiency of binary
codes. It aims to encode high-dimensional features in the Hamming space with
similarity preservation between instances. However, most existing methods learn
hash functions via manifold-based approaches. Such methods capture the local
geometric structure (i.e., pairwise relationships) of the data, but perform
unsatisfactorily in real-world scenarios where instances with different
semantic information share similar features (e.g., color and shape). To
address this challenge, in this work, we propose an effective unsupervised
method, namely Jointly Personalized Sparse Hashing (JPSH), for binary
representation learning. To be specific, firstly, we propose a novel
personalized hashing module, i.e., Personalized Sparse Hashing (PSH). Different
personalized subspaces are constructed to reflect category-specific attributes
for different clusters, adaptively mapping instances within the same cluster to
the same Hamming space. In addition, we impose sparse constraints on the
different personalized subspaces to select important features. We also draw on
the strengths of the other clusters when building the PSH module, to avoid
over-fitting. Then, to simultaneously preserve semantic and pairwise
similarities in our JPSH, we incorporate PSH and manifold-based hash
learning into a seamless formulation. As such, JPSH not only distinguishes
the instances from different clusters, but also preserves local neighborhood
the instances from different clusters, but also preserves local neighborhood
structures within the cluster. Finally, an alternating optimization algorithm
is adopted to iteratively obtain analytical solutions of the JPSH model.
Extensive experiments on four benchmark datasets verify that the JPSH
outperforms several hashing algorithms on the similarity search task.
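The pipeline described in the abstract (cluster the data, map each cluster through its own sparse subspace, binarize) can be sketched roughly as follows. This is a minimal illustration only, not the authors' implementation: the random projection stands in for the learned personalized subspace, and the function names (`personalized_sparse_hash`, `soft_threshold`) are illustrative.

```python
# Minimal sketch of personalized sparse hashing (illustrative only):
# cluster-specific sparse projections followed by sign binarization.
import numpy as np

def soft_threshold(W, lam):
    """Soft-thresholding operator; promotes sparsity in the projection."""
    return np.sign(W) * np.maximum(np.abs(W) - lam, 0.0)

def personalized_sparse_hash(X, labels, n_bits=16, lam=0.01, seed=0):
    """Map each cluster through its own sparse projection, then take signs.
    X: (n, d) features; labels: (n,) cluster assignments."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    codes = np.empty((n, n_bits), dtype=np.int8)
    for c in np.unique(labels):
        idx = labels == c
        Xc = X[idx] - X[idx].mean(axis=0)          # center within the cluster
        W = rng.standard_normal((d, n_bits))       # stand-in for the learned subspace
        W = soft_threshold(W, lam)                 # sparse constraint on the projection
        codes[idx] = np.where(Xc @ W >= 0, 1, -1)  # sign binarization
    return codes

def hamming_distance(a, b):
    """Hamming distance between two +/-1 binary codes."""
    return int(np.sum(a != b))

X = np.random.default_rng(1).standard_normal((6, 8))
labels = np.array([0, 0, 0, 1, 1, 1])
codes = personalized_sparse_hash(X, labels, n_bits=8)
print(codes.shape)  # (6, 8)
```

In the actual JPSH model the projections are learned jointly with a manifold-based term via alternating optimization, rather than drawn at random as here.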
Related papers
- Prototypical Hash Encoding for On-the-Fly Fine-Grained Category Discovery [65.16724941038052]
Category-aware Prototype Generation (CPG) and Discriminative Category Encoding (DCE) are proposed.
CPG enables the model to fully capture the intra-category diversity by representing each category with multiple prototypes.
DCE boosts the discrimination ability of hash code with the guidance of the generated category prototypes.
arXiv Detail & Related papers (2024-10-24T23:51:40Z)
- Cluster-Aware Similarity Diffusion for Instance Retrieval [64.40171728912702]
Diffusion-based re-ranking is a common method used for retrieving instances by performing similarity propagation in a nearest neighbor graph.
We propose a novel Cluster-Aware Similarity (CAS) diffusion for instance retrieval.
arXiv Detail & Related papers (2024-06-04T14:19:50Z)
- Attributes Grouping and Mining Hashing for Fine-Grained Image Retrieval [24.8065557159198]
We propose an Attributes Grouping and Mining Hashing (AGMH) for fine-grained image retrieval.
AGMH groups and embeds the category-specific visual attributes in multiple descriptors to generate a comprehensive feature representation.
AGMH consistently yields the best performance against state-of-the-art methods on fine-grained benchmark datasets.
arXiv Detail & Related papers (2023-11-10T14:01:56Z)
- Sparse-Inductive Generative Adversarial Hashing for Nearest Neighbor Search [8.020530603813416]
We propose a novel unsupervised hashing method, termed Sparsity-Induced Generative Adversarial Hashing (SiGAH).
SiGAH encodes large-scale, high-dimensional features into binary codes, addressing these problems through a generative adversarial training framework.
Experimental results on four benchmarks, i.e. Tiny100K, GIST1M, Deep1M, and MNIST, have shown that the proposed SiGAH has superior performance over state-of-the-art approaches.
arXiv Detail & Related papers (2023-06-12T08:07:23Z)
- Unsupervised Hashing with Similarity Distribution Calibration [127.34239817201549]
Unsupervised hashing methods aim to preserve the similarity between data points in a feature space by mapping them to binary hash codes.
These methods often overlook the fact that the similarity between data points in the continuous feature space may not be preserved in the discrete hash code space.
The similarity range is bounded by the code length and can lead to a problem known as similarity collapse.
This paper introduces a novel Similarity Distribution Calibration (SDC) method to alleviate this problem.
arXiv Detail & Related papers (2023-02-15T14:06:39Z)
- Index $t$-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings [1.7188280334580195]
This paper presents a methodology to reuse an embedding to create a new one, where cluster positions are preserved.
The proposed algorithm has the same complexity as the original $t$-SNE to embed new items, and a lower one when considering the embedding of a dataset sliced into sub-pieces.
arXiv Detail & Related papers (2021-09-22T06:45:37Z)
- CIMON: Towards High-quality Hash Codes [63.37321228830102]
We propose a new method named Comprehensive sImilarity Mining and cOnsistency learNing (CIMON).
First, we use global refinement and similarity statistical distribution to obtain reliable and smooth guidance. Second, both semantic and contrastive consistency learning are introduced to derive both disturb-invariant and discriminative hash codes.
arXiv Detail & Related papers (2020-10-15T14:47:14Z)
- Deep Hashing with Hash-Consistent Large Margin Proxy Embeddings [65.36757931982469]
Image hash codes are produced by binarizing embeddings of convolutional neural networks (CNN) trained for either classification or retrieval.
The use of a fixed set of proxies (weights of the CNN classification layer) is proposed to eliminate this ambiguity.
The resulting hash-consistent large margin (HCLM) proxies are shown to encourage saturation of hashing units, thus guaranteeing a small binarization error.
arXiv Detail & Related papers (2020-07-27T23:47:43Z)
- Procrustean Orthogonal Sparse Hashing [3.302605292858623]
We show that insect olfaction is structurally and functionally analogous to sparse hashing.
We present a novel method, Procrustean Orthogonal Sparse Hashing (POSH), that unifies these findings.
We propose two new methods, Binary OSL and SphericalHash, to address these deficiencies.
arXiv Detail & Related papers (2020-06-08T18:09:33Z)
- Reinforcing Short-Length Hashing [61.75883795807109]
Existing methods perform poorly in retrieval with extremely short hash codes.
In this study, we propose a novel reinforcing short-length hashing (RSLH) method.
In this proposed RSLH, mutual reconstruction between the hash representation and semantic labels is performed to preserve the semantic information.
Experiments on three large-scale image benchmarks demonstrate the superior performance of RSLH under various short-length hashing scenarios.
arXiv Detail & Related papers (2020-04-24T02:23:52Z)
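The similarity-collapse issue noted in the Similarity Distribution Calibration entry above (similarity range bounded by the code length) can be seen in a toy example: an n-bit Hamming distance can only take n+1 distinct values, so a continuum of feature-space similarities collapses onto a handful of discrete levels. This is an illustration written for this summary, not code from any of the papers.

```python
# Toy illustration of similarity collapse: for 4-bit codes, the Hamming
# distance between any two distinct codes can only be 1, 2, 3, or 4, so
# many different continuous similarities map to the same discrete level.
import itertools

n_bits = 4
codes = list(itertools.product([0, 1], repeat=n_bits))  # all 4-bit codes

def hamming(a, b):
    """Number of bit positions where the two codes differ."""
    return sum(x != y for x, y in zip(a, b))

distances = {hamming(a, b) for a, b in itertools.combinations(codes, 2)}
print(sorted(distances))  # [1, 2, 3, 4]
```

Counting distance 0 for identical codes, there are only n_bits + 1 possible similarity levels regardless of how finely the original feature space distinguishes instances.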
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.