Deep Unsupervised Hashing by Distilled Smooth Guidance
- URL: http://arxiv.org/abs/2105.06125v1
- Date: Thu, 13 May 2021 07:59:57 GMT
- Title: Deep Unsupervised Hashing by Distilled Smooth Guidance
- Authors: Xiao Luo, Zeyu Ma, Daqing Wu, Huasong Zhong, Chong Chen, Jinwen Ma,
Minghua Deng
- Abstract summary: We propose a novel deep unsupervised hashing method, namely Distilled Smooth Guidance (DSG)
To be specific, we obtain the similarity confidence weights based on the initial noisy similarity signals learned from local structures.
Extensive experiments on three widely used benchmark datasets show that the proposed DSG consistently outperforms the state-of-the-art search methods.
- Score: 13.101031440853843
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hashing has been widely used in approximate nearest neighbor search for its
storage and computational efficiency. Deep supervised hashing methods are not
widely used because of the lack of labeled data, especially when the domain is
transferred. Meanwhile, unsupervised deep hashing models can hardly achieve
satisfactory performance due to the lack of reliable similarity signals. To
tackle this problem, we propose a novel deep unsupervised hashing method,
namely Distilled Smooth Guidance (DSG), which can learn a distilled dataset
consisting of similarity signals as well as smooth confidence signals. To be
specific, we obtain the similarity confidence weights based on the initial
noisy similarity signals learned from local structures and construct a priority
loss function for smooth similarity-preserving learning. Besides, global
information based on clustering is utilized to distill the image pairs by
removing contradictory similarity signals. Extensive experiments on three
widely used benchmark datasets show that the proposed DSG consistently
outperforms the state-of-the-art search methods.
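The abstract describes two ingredients: noisy pairwise similarity signals derived from local structure, and per-pair confidence weights that down-weight unreliable pairs in a similarity-preserving loss. The following is a minimal illustrative sketch of that general idea, not the authors' implementation; the cosine threshold, the distance-from-threshold confidence heuristic, and the squared-error loss form are all assumptions made for the example.

```python
# Sketch of confidence-weighted similarity-preserving learning for hashing.
# NOT the DSG reference code: threshold, confidence heuristic, and loss form
# are illustrative assumptions.
import numpy as np

def pseudo_similarity(features, threshold=0.8):
    """Derive noisy pairwise similarity signals from local structure:
    pairs whose cosine similarity exceeds `threshold` are marked similar (+1),
    the rest dissimilar (-1). Also return a smooth confidence weight per pair,
    taken here as the (normalized) distance from the decision boundary, so
    that ambiguous pairs near the threshold contribute less to the loss."""
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    cos = normed @ normed.T
    signals = np.where(cos > threshold, 1.0, -1.0)
    confidence = np.abs(cos - threshold)
    confidence /= confidence.max()
    return signals, confidence

def weighted_similarity_loss(codes, signals, confidence):
    """Confidence-weighted similarity-preserving loss: the inner product of
    +/-1 hash codes, scaled by the code length into [-1, 1], should match the
    pseudo similarity signal; low-confidence pairs are down-weighted."""
    k = codes.shape[1]                 # code length
    inner = (codes @ codes.T) / k      # in [-1, 1] for +/-1 codes
    return float(np.mean(confidence * (inner - signals) ** 2))
```

For instance, with two nearly identical feature vectors and one orthogonal outlier, hash codes that agree on the first two points and flip on the third drive this loss to zero, while pairs whose cosine similarity sits near the threshold would barely influence training.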
Related papers
- Bit-mask Robust Contrastive Knowledge Distillation for Unsupervised
Semantic Hashing [71.47723696190184]
We propose an innovative Bit-mask Robust Contrastive knowledge Distillation (BRCD) method for semantic hashing.
BRCD is specifically devised for the distillation of semantic hashing models.
arXiv Detail & Related papers (2024-03-10T03:33:59Z) - Unsupervised Hashing with Similarity Distribution Calibration [127.34239817201549]
Unsupervised hashing methods aim to preserve the similarity between data points in a feature space by mapping them to binary hash codes.
These methods often overlook the fact that the similarity between data points in the continuous feature space may not be preserved in the discrete hash code space.
Because the similarity range is bounded by the code length, this can lead to a problem known as similarity collapse.
This paper introduces a novel Similarity Distribution Calibration (SDC) method to alleviate this problem.
arXiv Detail & Related papers (2023-02-15T14:06:39Z) - Deep Asymmetric Hashing with Dual Semantic Regression and Class
Structure Quantization [9.539842235137376]
We propose a dual semantic asymmetric hashing (DSAH) method, which generates discriminative hash codes under three-fold constraints.
With these three main components, high-quality hash codes can be generated through the network.
arXiv Detail & Related papers (2021-10-24T16:14:36Z) - STRONG: Synchronous and asynchronous RObust Network localization, under
Non-Gaussian noise [0.0]
Real-world network applications must cope with failing nodes, malicious attacks and data classified as outliers.
Our work addresses these concerns in the scope of the sensor network localization algorithms.
A major highlight of our contribution is that we pay no price for provable distributed operation, neither in accuracy, nor in communication cost or speed.
arXiv Detail & Related papers (2021-10-01T18:01:28Z) - Pseudo-supervised Deep Subspace Clustering [27.139553299302754]
Auto-Encoder (AE)-based deep subspace clustering (DSC) methods have achieved impressive performance.
However, self-reconstruction loss of an AE ignores rich useful relation information.
It is also challenging to learn high-level similarity without feeding semantic labels.
arXiv Detail & Related papers (2021-04-08T06:25:47Z) - Comprehensive Graph-conditional Similarity Preserving Network for
Unsupervised Cross-modal Hashing [97.44152794234405]
Unsupervised cross-modal hashing (UCMH) has become a hot topic recently.
In this paper, we devise a deep graph-neighbor coherence preserving network (DGCPN)
DGCPN regulates comprehensive similarity preserving losses by exploiting three types of data similarities.
arXiv Detail & Related papers (2020-12-25T07:40:59Z) - CIMON: Towards High-quality Hash Codes [63.37321228830102]
We propose a new method named Comprehensive sImilarity Mining and cOnsistency learNing (CIMON).
First, we use global refinement and similarity statistical distribution to obtain reliable and smooth guidance. Second, both semantic and contrastive consistency learning are introduced to derive both disturb-invariant and discriminative hash codes.
arXiv Detail & Related papers (2020-10-15T14:47:14Z) - Pairwise Supervised Hashing with Bernoulli Variational Auto-Encoder and
Self-Control Gradient Estimator [62.26981903551382]
Variational auto-encoders (VAEs) with binary latent variables provide state-of-the-art performance in terms of precision for document retrieval.
We propose a pairwise loss function with discrete latent VAE to reward within-class similarity and between-class dissimilarity for supervised hashing.
This new semantic hashing framework achieves superior performance compared to the state of the art.
arXiv Detail & Related papers (2020-05-21T06:11:33Z) - Reinforcing Short-Length Hashing [61.75883795807109]
Existing methods have poor performance in retrieval using an extremely short-length hash code.
In this study, we propose a novel reinforcing short-length hashing (RSLH) method.
In this proposed RSLH, mutual reconstruction between the hash representation and semantic labels is performed to preserve the semantic information.
Experiments on three large-scale image benchmarks demonstrate the superior performance of RSLH under various short-length hashing scenarios.
arXiv Detail & Related papers (2020-04-24T02:23:52Z) - Deep Robust Multilevel Semantic Cross-Modal Hashing [25.895586911858857]
Hashing based cross-modal retrieval has recently made significant progress.
But straightforward embedding data from different modalities into a joint Hamming space will inevitably produce false codes.
We present a novel Robust Multilevel Semantic Hashing (RMSH) for more accurate cross-modal retrieval.
arXiv Detail & Related papers (2020-02-07T10:08:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.