Supervised Auto-Encoding Twin-Bottleneck Hashing
- URL: http://arxiv.org/abs/2306.11122v1
- Date: Mon, 19 Jun 2023 18:50:02 GMT
- Title: Supervised Auto-Encoding Twin-Bottleneck Hashing
- Authors: Yuan Chen, Stéphane Marchand-Maillet
- Abstract summary: Auto-encoding Twin-Bottleneck Hashing is a deep hashing method that dynamically builds the similarity graph from the learned binary codes.
In this work, we generalize the original model into a supervised deep hashing network by incorporating the label information.
- Score: 5.653113092257149
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep hashing has been shown to be a complexity-efficient solution for the
Approximate Nearest Neighbor search problem in high-dimensional spaces. Many
methods build the loss function from pairwise or triplet data points to
capture the local similarity structure. Other existing methods construct a
similarity graph and consider all points simultaneously. Auto-encoding
Twin-Bottleneck Hashing is one such method that dynamically builds the graph.
Specifically, each input data point is encoded into a binary code and a
continuous variable, the so-called twin bottlenecks. The similarity graph is
then computed from these binary codes, which are updated consistently during
training. In this work, we generalize the original model into a supervised deep
hashing network by incorporating label information. In addition, we examine
the differences in code structure between the two networks and consider the
class imbalance problem, especially in multi-labeled datasets. Experiments on
three datasets yield statistically significant improvements over the original
model, and results are comparable and competitive with other supervised methods.
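Neither the abstract nor this page includes code, but the twin-bottleneck idea translates naturally into a short sketch. The following minimal PyTorch example (all module names, layer sizes, and the straight-through binarization are illustrative assumptions, not the authors' exact design) shows an encoder emitting a binary code and a continuous variable, a decoder reconstructing from both, and a classification head on the codes standing in for the supervised label term:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SupervisedTwinBottleneck(nn.Module):
    """Sketch: encode each input into a binary code and a continuous
    latent (the 'twin bottlenecks'); a label head adds supervision.
    All sizes and names here are hypothetical."""

    def __init__(self, in_dim=4096, code_bits=32, cont_dim=64, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 1024), nn.ReLU())
        self.to_binary = nn.Linear(1024, code_bits)  # binary bottleneck logits
        self.to_cont = nn.Linear(1024, cont_dim)     # continuous bottleneck
        self.classifier = nn.Linear(code_bits, num_classes)
        self.decoder = nn.Linear(code_bits + cont_dim, in_dim)

    def forward(self, x):
        h = self.backbone(x)
        soft = torch.tanh(self.to_binary(h))
        # Straight-through estimator: hard {-1, +1} codes in the forward
        # pass, tanh gradients in the backward pass (a common trick, not
        # necessarily the paper's exact relaxation).
        b = torch.sign(soft).detach() + soft - soft.detach()
        z = self.to_cont(h)
        x_rec = self.decoder(torch.cat([b, z], dim=1))
        y_pred = self.classifier(b)                  # supervised label term
        return b, z, x_rec, y_pred

model = SupervisedTwinBottleneck()
x = torch.randn(8, 4096)
y = torch.randint(0, 10, (8,))
b, z, x_rec, y_pred = model(x)
loss = F.mse_loss(x_rec, x) + F.cross_entropy(y_pred, y)
loss.backward()
```

In the full method, the similarity graph would additionally be recomputed from the codes `b` during training; a sketch of that graph construction follows the Auto-Encoding Twin-Bottleneck Hashing entry at the end of the related-papers list.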
Related papers
- Ensemble Quadratic Assignment Network for Graph Matching [52.20001802006391]
Graph matching is a commonly used technique in computer vision and pattern recognition.
Recent data-driven approaches have improved the graph matching accuracy remarkably.
We propose a graph neural network (GNN) based approach to combine the advantages of data-driven and traditional methods.
arXiv Detail & Related papers (2024-03-11T06:34:05Z)
- Graph-Collaborated Auto-Encoder Hashing for Multi-view Binary Clustering [11.082316688429641]
We propose a hashing algorithm based on auto-encoders for multi-view binary clustering.
Specifically, we propose a multi-view affinity graphs learning model with low-rank constraint, which can mine the underlying geometric information from multi-view data.
We also design an encoder-decoder paradigm that coordinates the multiple affinity graphs, so that a unified binary code can be learned effectively.
arXiv Detail & Related papers (2023-01-06T12:43:13Z)
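As a rough illustration of the affinity-graph ingredient in Graph-Collaborated Auto-Encoder Hashing (the low-rank constraint and the collaborating encoder-decoder are omitted), the sketch below builds a k-nearest-neighbor Gaussian affinity graph per view and fuses them by averaging; the kernel choice, `k`, and all names are assumptions rather than the paper's specification:

```python
import torch

def knn_affinity(x, k=5, sigma=1.0):
    """Gaussian-kernel affinity, kept only for each point's k nearest
    neighbors, then symmetrized."""
    d2 = torch.cdist(x, x).pow(2)            # pairwise squared distances
    w = torch.exp(-d2 / (2 * sigma ** 2))
    topk = w.topk(k + 1, dim=1).indices      # +1 keeps the self-similarity
    mask = torch.zeros_like(w).scatter_(1, topk, 1.0)
    w = w * mask
    return 0.5 * (w + w.t())

# Two hypothetical views of the same 100 samples.
views = [torch.randn(100, 64), torch.randn(100, 32)]
fused = torch.stack([knn_affinity(v) for v in views]).mean(dim=0)
```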
- One Loss for All: Deep Hashing with a Single Cosine Similarity based Learning Objective [86.48094395282546]
A deep hashing model typically has two main learning objectives: to make the learned binary hash codes discriminative and to minimize the quantization error.
We propose a novel deep hashing model with only a single learning objective.
Our model is highly effective, outperforming the state-of-the-art multi-loss hashing models on three large-scale instance retrieval benchmarks.
arXiv Detail & Related papers (2021-09-29T14:27:51Z)
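A hedged sketch of the single-objective idea described in the entry above: score continuous codes against fixed binary class targets by cosine similarity and train with one cross-entropy loss. The random targets and the `scale` temperature are hypothetical choices, not the authors' formulation:

```python
import torch
import torch.nn.functional as F

code_bits, num_classes = 32, 10
# One fixed binary target vector per class (random here; an assumption).
targets = torch.sign(torch.randn(num_classes, code_bits))

def single_cosine_loss(codes, labels, scale=8.0):
    """Single objective: cross-entropy over cosine similarities between
    continuous codes and each class's binary target."""
    sims = F.cosine_similarity(codes.unsqueeze(1), targets.unsqueeze(0), dim=2)
    return F.cross_entropy(scale * sims, labels)

codes = torch.randn(16, code_bits, requires_grad=True)
labels = torch.randint(0, num_classes, (16,))
single_cosine_loss(codes, labels).backward()
```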
- Rank-Consistency Deep Hashing for Scalable Multi-Label Image Search [90.30623718137244]
We propose a novel deep hashing method for scalable multi-label image search.
A new rank-consistency objective aligns the similarity orders of the semantic space and the Hamming space.
A loss function penalizes samples whose semantic similarity and Hamming distance are mismatched.
arXiv Detail & Related papers (2021-02-02T13:46:58Z)
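The rank-consistency penalty can be illustrated with a deliberately unvectorized sketch: whenever sample j is semantically closer to anchor i than sample k is, its Hamming distance should be smaller by a margin. The margin value and the construction of `sem_sim` below are assumptions:

```python
import torch
import torch.nn.functional as F

def rank_consistency_loss(codes, sem_sim, margin=0.5):
    """Hypothetical pairwise ranking penalty, written with explicit loops
    for clarity. `sem_sim` is an (n, n) semantic-similarity matrix,
    e.g. label overlap; `codes` are in {-1, +1}."""
    n, bits = codes.shape
    hamming = 0.5 * (bits - codes @ codes.t())
    loss = codes.new_zeros(())
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if sem_sim[i, j] > sem_sim[i, k]:
                    loss = loss + F.relu(hamming[i, j] - hamming[i, k] + margin)
    return loss / n ** 3

codes = torch.sign(torch.randn(8, 16))
labels = F.one_hot(torch.randint(0, 3, (8,)), num_classes=3).float()
loss = rank_consistency_loss(codes, labels @ labels.t())
```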
- Comprehensive Graph-conditional Similarity Preserving Network for Unsupervised Cross-modal Hashing [97.44152794234405]
Unsupervised cross-modal hashing (UCMH) has become a hot topic recently.
In this paper, we devise a deep graph-neighbor coherence preserving network (DGCPN).
DGCPN regulates comprehensive similarity preserving losses by exploiting three types of data similarities.
arXiv Detail & Related papers (2020-12-25T07:40:59Z)
- CIMON: Towards High-quality Hash Codes [63.37321228830102]
We propose a new method named Comprehensive sImilarity Mining and cOnsistency learNing (CIMON).
First, we use global refinement and similarity statistical distribution to obtain reliable and smooth guidance. Second, both semantic and contrastive consistency learning are introduced to derive both disturb-invariant and discriminative hash codes.
arXiv Detail & Related papers (2020-10-15T14:47:14Z)
- Self-Supervised Bernoulli Autoencoders for Semi-Supervised Hashing [1.8899300124593648]
This paper investigates the robustness of hashing methods based on variational autoencoders to the lack of supervision.
We propose a novel supervision method in which the model uses its label distribution predictions to implement the pairwise objective.
Our experiments show that the proposed methods can significantly increase the quality of the hash codes.
arXiv Detail & Related papers (2020-07-17T07:47:10Z)
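One plausible reading of the self-supervised pairwise objective above, sketched under stated assumptions: the model's own predicted label distributions define soft pairwise similarity targets, and the expected bit agreement of Bernoulli code probabilities is pushed toward them. The squared-error form is an illustrative choice, not necessarily the paper's:

```python
import torch

def self_supervised_pairwise_loss(code_probs, class_probs):
    """Hypothetical sketch of a pairwise objective driven by the model's
    own label predictions.
    code_probs:  (n, bits) Bernoulli activation probabilities
    class_probs: (n, classes) predicted label distributions"""
    # Probability two samples share a label, under the model's predictions.
    s = class_probs @ class_probs.t()
    # Expected fraction of agreeing bits, also in [0, 1].
    bits = code_probs.shape[1]
    a = (code_probs @ code_probs.t()
         + (1 - code_probs) @ (1 - code_probs).t()) / bits
    return ((a - s) ** 2).mean()

code_probs = torch.rand(8, 32, requires_grad=True)
class_probs = torch.softmax(torch.randn(8, 5), dim=1)
self_supervised_pairwise_loss(code_probs, class_probs).backward()
```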
- Learning to hash with semantic similarity metrics and empirical KL divergence [3.04585143845864]
Learning to hash is an efficient paradigm for exact and approximate nearest neighbor search from massive databases.
Binary hash codes are typically extracted from an image by rounding output features from a CNN trained on a supervised binary similar/dissimilar task, which raises two issues: (i) the codes do not necessarily reflect semantic similarity, and (ii) rounding discards information.
We overcome (i) via a novel loss function encouraging the relative hash code distances of learned features to match those derived from their targets.
We address (ii) via a differentiable estimate of the KL divergence between network outputs and a binary target distribution, resulting in minimal information loss when the features are rounded to binary.
arXiv Detail & Related papers (2020-05-11T08:20:26Z)
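The KL term in the entry above can be approximated with a simple differentiable proxy (not the paper's exact empirical estimator): drive each activation toward {0, 1} so rounding loses little information, while keeping every bit balanced across the batch via a KL divergence to Bernoulli(0.5):

```python
import torch

def binary_target_kl(p, eps=1e-6):
    """Hedged proxy for matching a binary target distribution:
    - per-element binary entropy -> 0 pushes each output toward 0/1;
    - KL(batch mean || Bernoulli(0.5)) -> 0 keeps each bit balanced."""
    p = p.clamp(eps, 1 - eps)
    ent = -(p * p.log() + (1 - p) * (1 - p).log()).mean()
    m = p.mean(dim=0).clamp(eps, 1 - eps)
    balance = (m * (m / 0.5).log() + (1 - m) * ((1 - m) / 0.5).log()).mean()
    return ent + balance

p = torch.sigmoid(torch.randn(16, 32, requires_grad=True))
binary_target_kl(p).backward()
```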
- Auto-Encoding Twin-Bottleneck Hashing [141.5378966676885]
This paper proposes an efficient and adaptive code-driven graph.
It is updated by decoding in the context of an auto-encoder.
Experiments on benchmarked datasets clearly show the superiority of our framework over the state-of-the-art hashing methods.
arXiv Detail & Related papers (2020-02-27T05:58:12Z)
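For reference, the code-driven graph at the heart of both the original Auto-Encoding Twin-Bottleneck Hashing and the supervised extension described at the top of this page can be sketched as a similarity matrix recomputed from the current binary codes; the normalization below is one common convention, assumed rather than taken from the papers:

```python
import torch

def code_driven_graph(codes):
    """Sketch: similarity graph recomputed from the current {-1, +1}
    binary codes as normalized Hamming similarity in [0, 1]. Because the
    codes change during training, the graph (and thus the neighborhood
    structure) adapts alongside them."""
    bits = codes.shape[1]
    hamming = 0.5 * (bits - codes @ codes.t())
    return 1.0 - hamming / bits

codes = torch.sign(torch.randn(6, 16))
adj = code_driven_graph(codes)   # (6, 6) matrix, 1.0 on the diagonal
```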