Asymmetric Hash Code Learning for Remote Sensing Image Retrieval
- URL: http://arxiv.org/abs/2201.05772v1
- Date: Sat, 15 Jan 2022 07:00:38 GMT
- Title: Asymmetric Hash Code Learning for Remote Sensing Image Retrieval
- Authors: Weiwei Song, Zhi Gao, Renwei Dian, Pedram Ghamisi, Yongjun Zhang, and Jón Atli Benediktsson
- Abstract summary: We propose a novel deep hashing method, named asymmetric hash code learning (AHCL), for remote sensing image retrieval.
The AHCL generates the hash codes of query and database images in an asymmetric way.
The experimental results on three public datasets demonstrate that the proposed method outperforms symmetric methods in terms of retrieval accuracy and efficiency.
- Score: 22.91678927865952
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Remote sensing image retrieval (RSIR), aiming at searching for a set of
similar items to a given query image, is a very important task in remote
sensing applications. Deep hashing, the current mainstream approach, has
achieved satisfactory retrieval performance. On one hand, various deep neural
networks are used to extract semantic features of remote sensing images. On the
other hand, hashing techniques are then adopted to map the high-dimensional
deep features to low-dimensional binary codes. These methods attempt to learn
one hash function for both query and database samples in a symmetric way.
However, as the number of database samples grows, generating the hash codes of
large-scale database images becomes time-consuming. In this paper, we propose a novel deep hashing
method, named asymmetric hash code learning (AHCL), for RSIR. The proposed AHCL
generates the hash codes of query and database images in an asymmetric way. In
more detail, the hash codes of query images are obtained by binarizing the
output of the network, while the hash codes of database images are directly
learned by solving the designed objective function. In addition, we combine the
semantic information of each image and the similarity information of pairs of
images as supervised information to train a deep hashing network, which
improves the representation ability of deep features and hash codes. The
experimental results on three public datasets demonstrate that the proposed
method outperforms symmetric methods in terms of retrieval accuracy and
efficiency. The source code is available at
https://github.com/weiweisong415/Demo_AHCL_for_TGRS2022.
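As a minimal sketch of the asymmetric scheme the abstract describes, assuming a PyTorch backbone: query codes come from binarizing the network output, while database codes are solved as free variables from a pairwise objective. The architecture and the discrete update below (modeled on solvers such as ADSH) are illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn

class HashNet(nn.Module):
    """Illustrative stand-in for a deep hashing network: a feature
    extractor followed by a k-bit hash layer with tanh relaxation."""
    def __init__(self, in_dim=3 * 64 * 64, feat_dim=512, n_bits=64):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.hash_layer = nn.Linear(feat_dim, n_bits)

    def forward(self, x):
        return torch.tanh(self.hash_layer(self.features(x)))  # relaxed codes in (-1, 1)

net = HashNet()
n_bits, n_query, n_db = 64, 8, 1000

# Query side: hash codes are obtained by binarizing the network output on the fly.
query_codes = torch.sign(net(torch.randn(n_query, 3 * 64 * 64)))  # {-1, +1}^k

# Database side: codes B are free variables obtained by solving an objective,
# not by forward passes over the whole database. With pairwise supervision
# min_B ||B U^T - k S||_F^2  (U: query-side outputs, S in {-1,+1}: similarity),
# one discrete update is B = sign(k S U). This update rule is an assumption
# modeled on discrete solvers such as ADSH, not necessarily the paper's solver.
S = torch.randint(0, 2, (n_db, n_query)).float() * 2 - 1  # +1 similar / -1 dissimilar
U = net(torch.randn(n_query, 3 * 64 * 64)).detach()
B = torch.sign(n_bits * S @ U)  # (n_db, n_bits) database codes, no per-image forward pass
```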
Related papers
- HybridHash: Hybrid Convolutional and Self-Attention Deep Hashing for Image Retrieval [0.3880517371454968]
We propose a hybrid convolutional and self-attention deep hashing method known as HybridHash.
We have conducted comprehensive experiments on three widely used datasets: CIFAR-10, NUS-WIDE, and ImageNet.
The experimental results demonstrate that the method proposed in this paper has superior performance with respect to state-of-the-art deep hashing methods.
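For orientation, a hybrid convolutional/self-attention hashing backbone might look like the sketch below: a convolution stem for local features, a self-attention block for global context, and a tanh hash head. The summary does not specify HybridHash's architecture, so every layer choice here is an assumption.

```python
import torch
import torch.nn as nn

class HybridHashSketch(nn.Module):
    """Conv stem -> self-attention -> hash head; a generic hybrid, not the paper's exact design."""
    def __init__(self, n_bits=64, dim=128):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, dim, kernel_size=7, stride=4, padding=3),  # local features via convolution
            nn.ReLU(),
        )
        self.attn = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)  # global context
        self.hash_head = nn.Linear(dim, n_bits)

    def forward(self, x):
        f = self.stem(x)                       # (B, dim, H, W)
        tokens = f.flatten(2).transpose(1, 2)  # (B, H*W, dim) token sequence
        tokens = self.attn(tokens)
        return torch.tanh(self.hash_head(tokens.mean(dim=1)))  # relaxed k-bit code

codes = torch.sign(HybridHashSketch()(torch.randn(2, 3, 64, 64)))  # binary codes for retrieval
```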
arXiv Detail & Related papers (2024-05-13T07:45:20Z)
- ElasticHash: Semantic Image Similarity Search by Deep Hashing with Elasticsearch [0.9167082845109439]
ElasticHash is a novel approach for high-quality, efficient, and large-scale semantic image similarity search.
It is based on a deep hashing model to learn hash codes for fine-grained image similarity search in natural images.
arXiv Detail & Related papers (2023-05-08T13:50:47Z)
- PHPQ: Pyramid Hybrid Pooling Quantization for Efficient Fine-Grained Image Retrieval [68.05570413133462]
We propose a Pyramid Hybrid Pooling Quantization (PHPQ) module to capture and preserve fine-grained semantic information from multi-level features.
Experiments on two widely-used public benchmarks, CUB-200-2011 and Stanford Dogs, demonstrate that PHPQ outperforms state-of-the-art methods.
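A generic reading of "pyramid hybrid pooling" is max plus average pooling at several pyramid scales over a feature map, concatenated; the sketch below implements that reading and omits the quantization step, so treat both choices as assumptions rather than the paper's module.

```python
import torch
import torch.nn as nn

class PyramidHybridPooling(nn.Module):
    """Sketch: max + average pooling at several pyramid scales, concatenated.
    The real PHPQ module also quantizes the output; details here are assumptions."""
    def __init__(self, levels=(1, 2, 4)):
        super().__init__()
        self.levels = levels

    def forward(self, fmap):  # fmap: (B, C, H, W)
        feats = []
        for s in self.levels:
            mx = nn.functional.adaptive_max_pool2d(fmap, s).flatten(1)
            av = nn.functional.adaptive_avg_pool2d(fmap, s).flatten(1)
            feats.append(mx + av)  # hybrid pooling: combine both statistics
        return torch.cat(feats, dim=1)

pooled = PyramidHybridPooling()(torch.randn(2, 256, 14, 14))  # (2, 256 * (1 + 4 + 16))
```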
arXiv Detail & Related papers (2021-09-11T07:21:02Z)
- Instance-weighted Central Similarity for Multi-label Image Retrieval [66.23348499938278]
We propose Instance-weighted Central Similarity (ICS) to automatically learn the center weight corresponding to a hash code.
Our method achieves the state-of-the-art performance on the image retrieval benchmarks, and especially improves the mAP by 1.6%-6.4% on the MS COCO dataset.
arXiv Detail & Related papers (2021-08-11T15:18:18Z)
- Deep Reinforcement Learning with Label Embedding Reward for Supervised Image Hashing [85.84690941656528]
We introduce a novel decision-making approach for deep supervised hashing.
We learn a deep Q-network with a novel label embedding reward defined by Bose-Chaudhuri-Hocquenghem codes.
Our approach outperforms state-of-the-art supervised hashing methods under various code lengths.
arXiv Detail & Related papers (2020-08-10T09:17:20Z)
- Multiple Code Hashing for Efficient Image Retrieval [16.750400008178293]
We propose a novel hashing framework, called multiple code hashing (MCH), to improve the performance of hash bucket search.
MCH learns multiple hash codes for each image, with each code representing a different region of the image.
To the best of our knowledge, this is the first work that proposes to learn multiple hash codes for each image in image retrieval.
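A toy version of the multiple-codes idea, assuming a fixed 2x2 grid of region crops fed through one shared hash network (MCH's actual region selection is not described in the summary):

```python
import torch

def region_codes(image, hash_net):
    """Sketch: one hash code per image region. The fixed 2x2 grid of crops is an
    illustrative assumption; MCH's region mechanism may differ."""
    B, C, H, W = image.shape
    crops = [image[:, :, i * H // 2:(i + 1) * H // 2, j * W // 2:(j + 1) * W // 2]
             for i in range(2) for j in range(2)]
    return [torch.sign(hash_net(c)) for c in crops]  # several binary codes per image

# Placeholder hash network for the sketch: pool each crop and map to 32 bits.
hash_net = torch.nn.Sequential(torch.nn.AdaptiveAvgPool2d(1), torch.nn.Flatten(), torch.nn.Linear(3, 32))
codes = region_codes(torch.randn(2, 3, 64, 64), hash_net)

# At query time an image can hit several hash buckets (one per region code),
# which raises the recall of bucket search compared with a single global code.
```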
arXiv Detail & Related papers (2020-08-04T13:18:19Z)
- A survey on deep hashing for image retrieval [7.156209824590489]
I propose a Shadow Recurrent Hashing (SRH) method in an attempt to break through the bottleneck of existing hashing methods.
Specifically, I devise a CNN architecture to extract the semantic features of images and design a loss function to encourage similar images to be projected close to each other.
Several experiments on the CIFAR-10 dataset show the satisfactory performance of SRH.
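Losses that encourage similar images to be projected close typically regress the inner product of relaxed codes onto a pairwise similarity label; the function below is that generic form, not necessarily SRH's exact loss.

```python
import torch

def pairwise_hash_loss(u, v, s, n_bits=64):
    """Generic pairwise hashing loss: <u, v>/k should approach +1 for similar
    pairs (s = 1) and -1 for dissimilar pairs (s = -1)."""
    inner = (u * v).sum(dim=1) / n_bits
    return ((inner - s) ** 2).mean()

u = torch.tanh(torch.randn(16, 64, requires_grad=True))  # relaxed codes of images i
v = torch.tanh(torch.randn(16, 64))                      # relaxed codes of images j
s = torch.randint(0, 2, (16,)).float() * 2 - 1           # +1 similar / -1 dissimilar
loss = pairwise_hash_loss(u, v, s)
```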
arXiv Detail & Related papers (2020-06-10T03:01:59Z)
- Dual-level Semantic Transfer Deep Hashing for Efficient Social Image Retrieval [35.78137004253608]
Social networks store and disseminate a tremendous amount of user-shared images.
Deep hashing is an efficient indexing technique to support large-scale social image retrieval.
Existing methods suffer from a severe semantic shortage when optimizing a large number of deep neural network parameters.
We propose a Dual-level Semantic Transfer Deep Hashing (DSTDH) method to alleviate this problem.
arXiv Detail & Related papers (2020-06-10T01:03:09Z)
- Reinforcing Short-Length Hashing [61.75883795807109]
Existing methods perform poorly when retrieval relies on extremely short hash codes.
In this study, we propose a novel reinforcing short-length hashing (RSLH) method.
In RSLH, mutual reconstruction between the hash representation and semantic labels is performed to preserve the semantic information.
Experiments on three large-scale image benchmarks demonstrate the superior performance of RSLH under various short-length hashing scenarios.
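A compact sketch of mutual reconstruction, assuming two linear maps trained so that labels are recoverable from codes and vice versa; the concrete RSLH formulation may differ.

```python
import torch
import torch.nn as nn

n_bits, n_classes = 16, 10                     # deliberately short code, matching the paper's setting
code_to_label = nn.Linear(n_bits, n_classes)   # reconstruct labels from the hash representation
label_to_code = nn.Linear(n_classes, n_bits)   # reconstruct the hash representation from labels

b = torch.tanh(torch.randn(32, n_bits, requires_grad=True))   # relaxed hash representations
y = torch.eye(n_classes)[torch.randint(0, n_classes, (32,))]  # one-hot semantic labels

# Mutual reconstruction: each side must be recoverable from the other, so the
# short code is forced to retain the semantic information carried by the labels.
recon_label = ((code_to_label(b) - y) ** 2).mean()
recon_code = ((label_to_code(y) - b) ** 2).mean()
loss = recon_label + recon_code
loss.backward()
```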
arXiv Detail & Related papers (2020-04-24T02:23:52Z)
- Targeted Attack for Deep Hashing based Retrieval [57.582221494035856]
We propose a novel method, dubbed deep hashing targeted attack (DHTA), to study the targeted attack on such retrieval.
We first formulate the targeted attack as a point-to-set optimization, which minimizes the average distance between the hash code of an adversarial example and those of a set of objects with the target label.
To balance performance and perceptibility, we propose to minimize the Hamming distance between the hash code of the adversarial example and the anchor code under an $\ell_\infty$ restriction on the perturbation.
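Spelled out, the point-to-set objective reads roughly as follows, with notation assumed from the summary: $h$ is the hash network, $x'$ the adversarial example built from $x$, $\{b_1,\dots,b_m\}$ the hash codes of $m$ objects carrying the target label, and $d_H$ the Hamming distance.

```latex
\min_{x'} \; \frac{1}{m} \sum_{i=1}^{m} d_H\!\left(h(x'),\, b_i\right)
\quad \text{s.t.} \quad \|x' - x\|_{\infty} \le \epsilon
```

As the summary notes, the attack in practice replaces the average with the distance to a single anchor code representing the target set.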
arXiv Detail & Related papers (2020-04-15T08:36:58Z)
- A Survey on Deep Hashing Methods [52.326472103233854]
Nearest neighbor search aims to find the database samples closest to a given query.
With the development of deep learning, deep hashing methods show more advantages than traditional methods.
Deep supervised hashing is categorized into pairwise methods, ranking-based methods, pointwise methods, and quantization-based methods.
Deep unsupervised hashing is categorized into similarity reconstruction-based methods, pseudo-label-based methods and prediction-free self-supervised learning-based methods.
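The efficiency argument behind all of these methods is that Hamming distance between binary codes reduces to an XOR plus a popcount, so even a linear scan over a large database is cheap. A minimal NumPy illustration (np.bitwise_count requires NumPy >= 2.0):

```python
import numpy as np

# Pack 64-bit codes into uint64 words; Hamming distance is then XOR + popcount.
db_codes = np.random.randint(0, 2 ** 63, size=100_000, dtype=np.uint64)  # database codes
query = np.random.randint(0, 2 ** 63, dtype=np.uint64)                   # one query code

dists = np.bitwise_count(db_codes ^ query)  # XOR marks differing bits; count them
top10 = np.argsort(dists)[:10]              # nearest database items in Hamming space
```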
arXiv Detail & Related papers (2020-03-04T08:25:15Z)