Unsupervised Domain-adaptive Hash for Networks
- URL: http://arxiv.org/abs/2108.09136v1
- Date: Fri, 20 Aug 2021 12:09:38 GMT
- Title: Unsupervised Domain-adaptive Hash for Networks
- Authors: Tao He, Lianli Gao, Jingkuan Song, Yuan-Fang Li
- Abstract summary: Domain-adaptive hash learning has enjoyed considerable success in the computer vision community.
We develop an unsupervised domain-adaptive hash learning method for networks, dubbed UDAH.
- Score: 81.49184987430333
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Abundant real-world data can be naturally represented by large-scale
networks, which demands efficient and effective learning algorithms. At the
same time, labels may only be available for some networks, which demands these
algorithms to be able to adapt to unlabeled networks. Domain-adaptive hash
learning has enjoyed considerable success in the computer vision community in
many practical tasks due to its lower cost in both retrieval time and storage
footprint. However, it has not been applied to multiple-domain networks. In
this work, we bridge this gap by developing an unsupervised domain-adaptive
hash learning method for networks, dubbed UDAH. Specifically, we develop four
task-specific yet correlated components: (1) network structure preservation
via a hard groupwise contrastive loss, (2) relaxation-free supervised hashing,
(3) cross-domain intersected discriminators, and (4) semantic center alignment.
We conduct a wide range of experiments to evaluate the effectiveness and
efficiency of our method on a range of tasks including link prediction, node
classification, and neighbor recommendation. Our evaluation results demonstrate
that our model outperforms state-of-the-art conventional discrete embedding
methods on all tasks.
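The abstract names its four components without giving their formulations here. The sketch below is only one plausible reading of components (1) and (2): a hard groupwise contrastive loss over node codes that are binarized relaxation-free with a sign function and a straight-through estimator. The class and function names, the temperature, and the hardest-positive choice are assumptions, not details taken from the paper.

```python
# Illustrative sketch only (not the authors' released code): one plausible
# reading of (1) a hard groupwise contrastive loss and (2) relaxation-free
# hashing via a sign function with a straight-through estimator.
import torch
import torch.nn.functional as F


class SignSTE(torch.autograd.Function):
    """Binarize to {-1, +1} in the forward pass; pass gradients straight through."""

    @staticmethod
    def forward(ctx, x):
        return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output


def hard_group_contrastive_loss(anchor, positives, negatives, temperature=0.5):
    """InfoNCE-style loss that keeps only the hardest (least similar) positive.

    anchor:    (d,)   code of the anchor node
    positives: (p, d) codes of structurally related nodes (e.g. its neighbors)
    negatives: (n, d) codes of unrelated nodes
    """
    anchor = F.normalize(anchor, dim=-1)
    pos_sim = F.normalize(positives, dim=-1) @ anchor / temperature   # (p,)
    neg_sim = F.normalize(negatives, dim=-1) @ anchor / temperature   # (n,)
    hard_pos = pos_sim.min()                      # hardest positive in the group
    logits = torch.cat([hard_pos.unsqueeze(0), neg_sim])
    return torch.logsumexp(logits, dim=0) - hard_pos  # = -log softmax(hard_pos)


# Usage: continuous node embeddings -> binary codes -> loss -> gradients
embeddings = torch.randn(10, 64, requires_grad=True)  # toy embeddings, 10 nodes
codes = SignSTE.apply(embeddings)                      # relaxation-free binarization
loss = hard_group_contrastive_loss(codes[0], codes[1:4], codes[4:])
loss.backward()
```

The cross-domain components (3) and (4) would operate on top of codes learned this way; the sketch is only meant to make the abstract's vocabulary concrete.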
Related papers
- Densely Decoded Networks with Adaptive Deep Supervision for Medical Image Segmentation [19.302294715542175]
We propose densely decoded networks (ddn) by selectively introducing 'crutch' network connections.
Such 'crutch' connections in each upsampling stage of the network decoder enhance target localization.
We also present a training strategy based on adaptive deep supervision (ads), which exploits and adapts specific attributes of the input dataset.
arXiv Detail & Related papers (2024-02-05T00:44:57Z)
- Evaluating the Label Efficiency of Contrastive Self-Supervised Learning for Multi-Resolution Satellite Imagery [0.0]
Self-supervised learning has been applied in the remote sensing domain to exploit readily-available unlabeled data.
In this paper, we study self-supervised visual representation learning through the lens of label efficiency.
arXiv Detail & Related papers (2022-10-13T06:54:13Z)
- Learning Consistency from High-quality Pseudo-labels for Weakly Supervised Object Localization [7.602783618330373]
We propose a two-stage approach to learn more consistent localization.
In the first stage, we propose a mask-based pseudo label generator algorithm, and use the pseudo-supervised learning method to initialize an object localization network.
In the second stage, we propose a simple and effective method for evaluating the confidence of pseudo-labels based on classification discrimination.
arXiv Detail & Related papers (2022-03-18T09:05:51Z)
- Self-Ensembling GAN for Cross-Domain Semantic Segmentation [107.27377745720243]
This paper proposes a self-ensembling generative adversarial network (SE-GAN) exploiting cross-domain data for semantic segmentation.
In SE-GAN, a teacher network and a student network constitute a self-ensembling model for generating semantic segmentation maps, which, together with a discriminator, forms a GAN (a generic mean-teacher update is sketched after this list).
Despite its simplicity, we find SE-GAN can significantly boost the performance of adversarial training and enhance the stability of the model.
arXiv Detail & Related papers (2021-12-15T09:50:25Z)
- Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of SSL and DA.
arXiv Detail & Related papers (2021-12-12T06:11:16Z)
- Community detection using low-dimensional network embedding algorithms [1.052782170493037]
We rigorously analyze the performance of two major algorithms, DeepWalk and node2vec, in recovering communities for canonical network models.
We prove that, given some fixed co-occurrence window, node2vec using random walks with a low non-backtracking probability can succeed for much sparser networks.
arXiv Detail & Related papers (2021-11-04T14:57:43Z)
- Coarse to Fine: Domain Adaptive Crowd Counting via Adversarial Scoring Network [58.05473757538834]
This paper proposes a novel adversarial scoring network (ASNet) to bridge the gap across domains from coarse to fine granularity.
Three sets of migration experiments show that the proposed methods achieve state-of-the-art counting performance.
arXiv Detail & Related papers (2021-07-27T14:47:24Z)
- Joint Learning of Neural Transfer and Architecture Adaptation for Image Recognition [77.95361323613147]
Current state-of-the-art visual recognition systems rely on pretraining a neural network on a large-scale dataset and finetuning the network weights on a smaller dataset.
In this work, we prove that dynamically adapting network architectures tailored to each domain task, along with weight finetuning, benefits both efficiency and effectiveness.
Our method can be easily generalized to an unsupervised paradigm by replacing supernet training with self-supervised learning in the source domain tasks and performing linear evaluation in the downstream tasks.
arXiv Detail & Related papers (2021-03-31T08:15:17Z)
- Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z)
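The SE-GAN entry above builds on teacher-student self-ensembling. As a point of reference only, the sketch below shows the generic mean-teacher update that such self-ensembling methods typically rely on; it is not the SE-GAN implementation, and the toy network, decay value, and function names are assumptions.

```python
# Generic mean-teacher (self-ensembling) update, sketched only to illustrate the
# teacher/student idea in the SE-GAN entry; not the SE-GAN code.
import copy
import torch
import torch.nn as nn

student = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(16, 3, 3, padding=1))   # toy segmentation head
teacher = copy.deepcopy(student)
for p in teacher.parameters():
    p.requires_grad_(False)            # the teacher receives no gradient updates


@torch.no_grad()
def ema_update(teacher, student, decay=0.999):
    """Make the teacher an exponential moving average of the student's weights."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(decay).add_(s_param, alpha=1.0 - decay)


# Typical loop fragment: after each optimizer step on the student, refresh the
# teacher, whose more stable predictions then supervise the student.
# optimizer.step(); ema_update(teacher, student)
```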