Improving Unsupervised Domain Adaptation by Reducing Bi-level Feature
Redundancy
- URL: http://arxiv.org/abs/2012.15732v1
- Date: Mon, 28 Dec 2020 08:00:56 GMT
- Title: Improving Unsupervised Domain Adaptation by Reducing Bi-level Feature
Redundancy
- Authors: Mengzhu Wang, Xiang Zhang, Long Lan, Wei Wang, Huibin Tan, Zhigang Luo
- Abstract summary: In this paper, we emphasize the significance of reducing feature redundancy for improving UDA in a bi-level way.
For the first level, we try to ensure compact domain-specific features with a transferable decorrelated normalization module.
In the second level, domain-invariant feature redundancy caused by the domain-shared representation is further mitigated via an alternative brand orthogonality.
- Score: 14.94720207222332
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reducing feature redundancy has shown beneficial effects for improving the
accuracy of deep learning models, thus it is also indispensable for the models
of unsupervised domain adaptation (UDA). Nevertheless, most recent efforts in
the field of UDA ignore this point. Moreover, the main schemes that realize it
are generally developed independently of UDA and involve only a single domain,
and thus might not be effective for cross-domain tasks. In this paper, we
emphasize the significance
of reducing feature redundancy for improving UDA in a bi-level way. For the
first level, we try to ensure compact domain-specific features with a
transferable decorrelated normalization module, which preserves specific domain
information whilst easing the side effect of feature redundancy on the
subsequent domain-invariance learning. In the second level, domain-invariant feature redundancy
caused by domain-shared representation is further mitigated via an alternative
brand orthogonality for better generalization. These two novel aspects can be
easily plugged into any BN-based backbone neural network. Specifically, simply
applying them to ResNet50 achieves performance competitive with the
state of the art on five popular benchmarks. Our code will be available at
https://github.com/dreamkily/gUDA.
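The two levels described in the abstract, decorrelating features within each domain and constraining the domain-shared representation towards orthogonality, can be illustrated with a minimal numpy sketch. The ZCA-style whitening and the Frobenius-norm penalty below are common instances of these ideas chosen for illustration; they are assumptions, not the paper's exact transferable decorrelated normalization module or its orthogonality constraint.

```python
import numpy as np

def decorrelate_features(x, eps=1e-5):
    """ZCA-style whitening of a feature batch x of shape (n, d):
    removes cross-feature correlation, one common way to reduce
    feature redundancy within a single domain."""
    xc = x - x.mean(axis=0, keepdims=True)
    cov = xc.T @ xc / x.shape[0] + eps * np.eye(x.shape[1])
    vals, vecs = np.linalg.eigh(cov)
    whitening = vecs @ np.diag(vals ** -0.5) @ vecs.T  # cov^(-1/2)
    return xc @ whitening

def soft_orthogonality_penalty(w):
    """Frobenius penalty ||W^T W - I||_F^2 on a weight matrix of the
    domain-shared representation; adding it to the training loss
    pushes the learned feature dimensions towards mutual decorrelation."""
    gram = w.T @ w
    return float(np.sum((gram - np.eye(w.shape[1])) ** 2))
```

After whitening, the batch covariance is (numerically) the identity, and the penalty is zero exactly when the columns of W are orthonormal.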
Related papers
- Contrastive Adversarial Training for Unsupervised Domain Adaptation [2.432037584128226]
Domain adversarial training has been successfully adopted for various domain adaptation tasks.
Large models make adversarial training easily biased towards the source domain and hardly adapted to the target domain.
We propose a contrastive adversarial training (CAT) approach that leverages labeled source-domain samples to reinforce and regulate feature generation for the target domain.
arXiv Detail & Related papers (2024-07-17T17:59:21Z)
- Gradually Vanishing Gap in Prototypical Network for Unsupervised Domain Adaptation [32.58201185195226]
We propose an efficient UDA framework named Gradually Vanishing Gap in Prototypical Network (GVG-PN)
Our model achieves transfer learning from both global and local perspectives.
Experiments on several UDA benchmarks validated that the proposed GVG-PN can clearly outperform the SOTA models.
arXiv Detail & Related papers (2024-05-28T03:03:32Z)
- Adversarial Bi-Regressor Network for Domain Adaptive Regression [52.5168835502987]
It is essential to learn a cross-domain regressor to mitigate the domain shift.
This paper proposes a novel method, the Adversarial Bi-Regressor Network (ABRNet), to seek a more effective cross-domain regression model.
arXiv Detail & Related papers (2022-09-20T18:38:28Z)
- FRIDA -- Generative Feature Replay for Incremental Domain Adaptation [34.00059350161178]
We propose a novel framework called Feature Replay based Incremental Domain Adaptation (FRIDA)
For domain alignment, we propose a simple extension of the popular domain adversarial neural network (DANN) called DANN-IB.
Experiment results on the Office-Home, Office-CalTech, and DomainNet datasets confirm that FRIDA maintains a better stability-plasticity trade-off than existing methods.
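The DANN that DANN-IB extends trains a domain classifier through a gradient reversal layer: the classifier's gradient is computed as usual, then flipped before it reaches the feature extractor. A minimal numpy sketch, assuming a linear logistic domain classifier for illustration (this is generic DANN, not the paper's DANN-IB):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dann_feature_gradient(features, domain_labels, w, lam=1.0):
    """Gradient reversal sketch: compute the domain classifier's
    gradient w.r.t. the features, then flip its sign (scaled by lam)
    before backpropagating into the feature extractor, so training
    pushes the features towards domain confusion, not separability."""
    p = sigmoid(features @ w)                      # P(domain = target)
    grad = np.outer(p - domain_labels, w) / len(domain_labels)
    return -lam * grad                             # reversed gradient
```

The domain classifier itself is still updated with the unflipped gradient; only the signal flowing into the shared feature extractor is reversed.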
arXiv Detail & Related papers (2021-12-28T22:24:32Z)
- Stagewise Unsupervised Domain Adaptation with Adversarial Self-Training for Road Segmentation of Remote Sensing Images [93.50240389540252]
Road segmentation from remote sensing images is a challenging task with wide ranges of application potentials.
We propose a novel stagewise domain adaptation model called RoadDA to address the domain shift (DS) issue in this field.
Experiment results on two benchmarks demonstrate that RoadDA can efficiently reduce the domain gap and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2021-08-28T09:29:14Z)
- A New Bidirectional Unsupervised Domain Adaptation Segmentation Framework [27.13101555533594]
Unsupervised domain adaptation (UDA) techniques are proposed to bridge the gap between different domains.
In this paper, we propose a bidirectional UDA framework based on disentangled representation learning for equally competent two-way UDA performances.
arXiv Detail & Related papers (2021-08-18T05:25:11Z)
- Feature Alignment and Restoration for Domain Generalization and Adaptation [93.39253443415392]
Cross-domain feature alignment has been widely explored to pull the feature distributions of different domains closer in order to learn domain-invariant representations.
We propose a unified framework termed Feature Alignment and Restoration (FAR) to simultaneously ensure high generalization and discrimination power of the networks.
Experiments on multiple classification benchmarks demonstrate the high performance and strong generalization of our FAR framework for both domain generalization and unsupervised domain adaptation.
arXiv Detail & Related papers (2020-06-22T05:08:13Z)
- Domain Conditioned Adaptation Network [90.63261870610211]
We propose a Domain Conditioned Adaptation Network (DCAN) to excite distinct convolutional channels with a domain conditioned channel attention mechanism.
This is the first work to explore the domain-wise convolutional channel activation for deep DA networks.
arXiv Detail & Related papers (2020-05-14T04:23:24Z)
- Deep Residual Correction Network for Partial Domain Adaptation [79.27753273651747]
Deep domain adaptation methods have achieved appealing performance by learning transferable representations from a well-labeled source domain to a different but related unlabeled target domain.
This paper proposes an efficiently-implemented Deep Residual Correction Network (DRCN)
Comprehensive experiments on partial, traditional and fine-grained cross-domain visual recognition demonstrate that DRCN is superior to the competitive deep domain adaptation approaches.
arXiv Detail & Related papers (2020-04-10T06:07:16Z)
- Supervised Domain Adaptation using Graph Embedding [86.3361797111839]
Domain adaptation methods assume that distributions between the two domains are shifted and attempt to realign them.
We propose a generic framework based on graph embedding.
We show that the proposed approach leads to a powerful Domain Adaptation framework.
arXiv Detail & Related papers (2020-03-09T12:25:13Z)