Self-Distilled Hashing for Deep Image Retrieval
- URL: http://arxiv.org/abs/2112.08816v1
- Date: Thu, 16 Dec 2021 12:01:50 GMT
- Title: Self-Distilled Hashing for Deep Image Retrieval
- Authors: Young Kyun Jang, Geonmo Gu, Byungsoo Ko, and Nam Ik Cho
- Abstract summary: In hash-based image retrieval systems, a transformed version of the original input usually generates a different code, deteriorating retrieval accuracy.
We propose a novel self-distilled hashing scheme to minimize this discrepancy while exploiting the potential of augmented data.
We also introduce hash proxy-based similarity learning and a binary cross entropy-based quantization loss to produce fine-quality hash codes.
- Score: 25.645550298697938
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In hash-based image retrieval systems, the transformed input from the
original usually generates different codes, deteriorating the retrieval
accuracy. To mitigate this issue, data augmentation can be applied during
training. However, even if the augmented samples of one content are similar in
real space, the quantization can scatter them far away in Hamming space. This
results in representation discrepancies that can impede training and degrade
performance. In this work, we propose a novel self-distilled hashing scheme to
minimize the discrepancy while exploiting the potential of augmented data. By
transferring the hash knowledge of the weakly-transformed samples to the strong
ones, we make the hash code insensitive to various transformations. We also
introduce hash proxy-based similarity learning and binary cross entropy-based
quantization loss to provide fine quality hash codes. Ultimately, we construct
a deep hashing framework that generates discriminative hash codes. Extensive
experiments on benchmarks verify that our self-distillation improves the
existing deep hashing approaches, and our framework achieves state-of-the-art
retrieval results. The code will be released soon.
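A minimal sketch of two of the ingredients named above, assuming tanh-activated continuous codes; the paper's exact loss forms (and the hash proxy-based similarity learning) are not specified in this abstract, so the versions below are assumptions:

```python
import torch
import torch.nn.functional as F

def self_distill_loss(h_weak, h_strong):
    # Transfer hash knowledge from the weakly-transformed view (teacher,
    # gradient detached) to the strongly-transformed view (student), making
    # the code insensitive to the transformation. Cosine form is assumed.
    return 1.0 - F.cosine_similarity(h_strong, h_weak.detach(), dim=1).mean()

def bce_quantization_loss(h):
    # Assumed BCE-style quantization: map tanh outputs in [-1, 1] to
    # probabilities and penalize deviation from the hard code sign(h).
    p = ((h + 1.0) / 2.0).clamp(1e-6, 1.0 - 1e-6)
    target = (torch.sign(h.detach()) + 1.0) / 2.0
    return F.binary_cross_entropy(p, target)

# Toy usage: 8 images, 64-bit continuous codes from a hypothetical encoder.
h_weak = torch.tanh(torch.randn(8, 64))
h_strong = torch.tanh(torch.randn(8, 64))
loss = self_distill_loss(h_weak, h_strong) + bce_quantization_loss(h_strong)
```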
Related papers
- A Lower Bound of Hash Codes' Performance [122.88252443695492]
In this paper, we prove that inter-class distinctiveness and intra-class compactness among hash codes determine the lower bound of hash codes' performance.
We then propose a surrogate model to fully exploit the above objective by estimating the posterior of hash codes and controlling it, which results in a low-bias optimization.
By testing on a series of hash models, we obtain performance improvements for all of them, with up to a 26.5% increase in mean Average Precision and up to a 20.5% increase in accuracy.
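The two quantities the bound is built on can be measured directly on a set of codes; a minimal sketch (the paper's surrogate model and posterior estimation are not reproduced here):

```python
import torch

def hamming_stats(codes, labels):
    # codes: (N, L) tensor in {-1, +1}; labels: (N,) integer class ids.
    # Assumes every class has at least two samples.
    # For +-1 codes, Hamming distance = (L - dot product) / 2.
    L = codes.shape[1]
    dist = (L - codes @ codes.t()) / 2.0
    same = labels[:, None] == labels[None, :]
    off_diag = ~torch.eye(len(labels), dtype=torch.bool)
    intra = dist[same & off_diag].mean()  # intra-class compactness: small is good
    inter = dist[~same].mean()            # inter-class distinctiveness: large is good
    return intra, inter
```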
arXiv Detail & Related papers (2022-10-12T03:30:56Z)
- Learning to Hash Naturally Sorts [84.90210592082829]
We introduce Naturally-Sorted Hashing (NSH) to train a deep hashing model with sorted results end-to-end.
NSH sorts the Hamming distances of samples' hash codes and accordingly gathers their latent representations for self-supervised training.
We describe a novel Sorted Noise-Contrastive Estimation (SortedNCE) loss that selectively picks positive and negative samples for contrastive learning.
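A minimal sketch of the sorting step, assuming +-1 codes; the actual SortedNCE loss and the gathering of latent representations are more involved:

```python
import torch

def sorted_neighbors(codes, k=1):
    # Sort each sample's Hamming distances to all other samples and return
    # the indices of the k nearest codes, e.g. as positives for contrast.
    L = codes.shape[1]
    dist = (L - codes @ codes.t()) / 2.0   # pairwise Hamming for +-1 codes
    dist.fill_diagonal_(float('inf'))      # exclude self-matches
    return dist.topk(k, largest=False).indices
```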
arXiv Detail & Related papers (2022-01-31T16:19:02Z)
- Hard Example Guided Hashing for Image Retrieval [3.606866431185676]
Two main factors limit the ability to learn hard examples: weak key-feature extraction and a shortage of hard examples.
In this paper, we give a novel end-to-end model that extracts the key features from hard examples and obtains hash codes with accurate semantic information.
Experimental results on CIFAR-10 and NUS-WIDE demonstrate that our model outperforms mainstream hashing-based image retrieval methods.
arXiv Detail & Related papers (2021-12-27T08:24:10Z)
- Unsupervised Multi-Index Semantic Hashing [23.169142004594434]
We propose Multi-Index Semantic Hashing (MISH), an unsupervised hashing model that learns hash codes that are both effective and highly efficient by being optimized for multi-index hashing.
We experimentally compare MISH to state-of-the-art semantic hashing baselines in the task of document similarity search.
We find that even though multi-index hashing also improves the efficiency of the baselines compared to a linear scan, they are still upwards of 33% slower than MISH.
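For context, multi-index hashing itself splits each code into m disjoint substrings and indexes each in its own table; by the pigeonhole principle, any code within Hamming radius r < m of the query matches it exactly in at least one substring. A toy sketch with codes as bit strings:

```python
from collections import defaultdict

def build_multi_index(codes, m):
    # One hash table per substring position; keys are exact substrings.
    chunk = len(codes[0]) // m
    tables = [defaultdict(list) for _ in range(m)]
    for i, code in enumerate(codes):
        for t in range(m):
            tables[t][code[t * chunk:(t + 1) * chunk]].append(i)
    return tables, chunk

def candidates(query, tables, chunk):
    # Union of all database codes sharing at least one substring with the query.
    found = set()
    for t, table in enumerate(tables):
        found.update(table.get(query[t * chunk:(t + 1) * chunk], []))
    return found

db = ["00001111", "00001010", "11110000", "01010101"]
tables, chunk = build_multi_index(db, m=2)
print(candidates("00001110", tables, chunk))  # {0, 1}: both share substring "0000"
```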
arXiv Detail & Related papers (2021-03-26T13:33:48Z)
- CIMON: Towards High-quality Hash Codes [63.37321228830102]
We propose a new method named Comprehensive sImilarity Mining and cOnsistency learNing (CIMON).
First, we use global refinement and similarity statistical distribution to obtain reliable and smooth guidance. Second, both semantic and contrastive consistency learning are introduced to derive both disturb-invariant and discriminative hash codes.
arXiv Detail & Related papers (2020-10-15T14:47:14Z)
- Deep Reinforcement Learning with Label Embedding Reward for Supervised Image Hashing [85.84690941656528]
We introduce a novel decision-making approach for deep supervised hashing.
We learn a deep Q-network with a novel label embedding reward defined by Bose-Chaudhuri-Hocquenghem (BCH) codes.
Our approach outperforms state-of-the-art supervised hashing methods under various code lengths.
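A hedged sketch of how such a reward could look: each class is assigned a fixed binary codeword (the paper derives these from BCH codes; here the codebook is simply an input), and the agent's emitted bits are rewarded by agreement with the codeword of the true label.

```python
import torch

def label_embedding_reward(code_bits, label_codewords, labels):
    # code_bits: (B, L) in {0, 1}; label_codewords: (C, L) in {0, 1};
    # labels: (B,) integer class ids. Higher reward for codes closer
    # (in Hamming distance) to the codeword of the ground-truth class.
    target = label_codewords[labels]
    hamming = (code_bits != target).sum(dim=1).float()
    return -hamming
```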
arXiv Detail & Related papers (2020-08-10T09:17:20Z)
- Dual-level Semantic Transfer Deep Hashing for Efficient Social Image Retrieval [35.78137004253608]
Social networks store and disseminate a tremendous amount of user-shared images.
Deep hashing is an efficient indexing technique to support large-scale social image retrieval.
Existing methods suffer from severe semantic shortage when optimizing a large number of deep neural network parameters.
We propose a Dual-level Semantic Transfer Deep Hashing (DSTDH) method to alleviate this problem.
arXiv Detail & Related papers (2020-06-10T01:03:09Z)
- Pairwise Supervised Hashing with Bernoulli Variational Auto-Encoder and Self-Control Gradient Estimator [62.26981903551382]
Variational auto-encoders (VAEs) with binary latent variables provide state-of-the-art performance in terms of precision for document retrieval.
We propose a pairwise loss function with discrete latent VAE to reward within-class similarity and between-class dissimilarity for supervised hashing.
This new semantic hashing framework achieves superior performance compared to the state of the art.
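A generic form of such a pairwise objective, assuming +-1 codes and a hinge margin (the Bernoulli VAE and the self-control gradient estimator are not reproduced here):

```python
import torch
import torch.nn.functional as F

def pairwise_hash_loss(h_i, h_j, same_class, margin=6.0):
    # Reward within-class similarity (pull distances toward 0) and
    # between-class dissimilarity (push distances beyond the margin).
    L = h_i.shape[1]
    dist = (L - (h_i * h_j).sum(dim=1)) / 2.0   # Hamming distance for +-1 codes
    pull = same_class.float() * dist
    push = (1.0 - same_class.float()) * F.relu(margin - dist)
    return (pull + push).mean()
```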
arXiv Detail & Related papers (2020-05-21T06:11:33Z)
- Reinforcing Short-Length Hashing [61.75883795807109]
Existing methods perform poorly when retrieving with extremely short hash codes.
In this study, we propose a novel reinforcing short-length hashing (RSLH) method.
In RSLH, mutual reconstruction between the hash representation and semantic labels is performed to preserve the semantic information.
Experiments on three large-scale image benchmarks demonstrate the superior performance of RSLH under various short-length hashing scenarios.
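One plausible reading of the mutual-reconstruction idea, sketched with two linear maps; the module name and the MSE form are assumptions, not the paper's exact formulation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MutualReconstruction(nn.Module):
    # Map short hash representations to semantic labels and back, so the
    # few available bits are forced to retain semantic information.
    def __init__(self, code_len, num_classes):
        super().__init__()
        self.code_to_label = nn.Linear(code_len, num_classes)
        self.label_to_code = nn.Linear(num_classes, code_len)

    def forward(self, h, y_onehot):
        loss_label = F.mse_loss(self.code_to_label(h), y_onehot)
        loss_code = F.mse_loss(self.label_to_code(y_onehot), h)
        return loss_label + loss_code
```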
arXiv Detail & Related papers (2020-04-24T02:23:52Z)
- Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance [12.968141477410597]
Adversarial autoencoders are shown to be able to implicitly learn a robust, locality-preserving hash function that generates balanced and high-quality hash codes.
Existing adversarial hashing methods are too inefficient to be employed for large-scale image retrieval applications.
We propose a new adversarial-autoencoder hashing approach that has a much lower sample requirement and computational cost.
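In one dimension, the Wasserstein distance between two equal-size empirical samples reduces to sorting; applied independently per hash component, a minimal sketch (the target distribution is assumed uniform over {-1, +1}):

```python
import torch

def componentwise_wasserstein(h, target):
    # h, target: (N, L). Sort each column and average absolute differences;
    # this is the exact 1-D Wasserstein-1 distance per hash component.
    return (h.sort(dim=0).values - target.sort(dim=0).values).abs().mean()

# Toy usage: push 32-dimensional continuous codes toward +-1 values.
h = torch.tanh(torch.randn(128, 32))
target = torch.randint(0, 2, h.shape).float() * 2.0 - 1.0
loss = componentwise_wasserstein(h, target)
```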
arXiv Detail & Related papers (2020-02-29T00:22:53Z)