Learning a Domain Classifier Bank for Unsupervised Adaptive Object
Detection
- URL: http://arxiv.org/abs/2007.02595v1
- Date: Mon, 6 Jul 2020 09:12:46 GMT
- Title: Learning a Domain Classifier Bank for Unsupervised Adaptive Object
Detection
- Authors: Sanli Tang, Zhanzhan Cheng, Shiliang Pu, Dashan Guo, Yi Niu and Fei Wu
- Abstract summary: In this paper, we propose a fine-grained domain alignment approach for object detectors based on deep networks.
We assemble a bare object detector with the proposed fine-grained domain alignment mechanism to form the adaptive detector.
Experiments on three popular transfer benchmarks demonstrate the effectiveness of our method.
- Score: 48.19258721979389
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In real applications, object detectors based on deep networks still face the challenge of a large domain gap between labeled training data and unlabeled testing data. To reduce this gap, recent techniques align image- and instance-level features between the source domain and the unlabeled target domain. However, these methods remain suboptimal, mainly because they ignore the category information of object instances. To tackle this issue, we develop a fine-grained domain alignment approach built on a well-designed domain classifier bank, which aligns instance-level features with respect to their categories. Specifically, we first employ the mean teacher paradigm to generate pseudo labels for unlabeled samples. We then implement class-level domain classifiers and group them into a domain classifier bank, in which each domain classifier is responsible for aligning the features of a specific class. We assemble a bare object detector with the proposed fine-grained domain alignment mechanism to form the adaptive detector, and optimize it with a crossed adaptive weighting mechanism. Extensive experiments on three popular transfer benchmarks demonstrate the effectiveness of our method, which achieves new state-of-the-art results.
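No implementation accompanies this listing, so the PyTorch-style sketch below only illustrates one plausible reading of the fine-grained alignment step: a bank of per-class domain classifiers placed behind a gradient reversal layer, with each instance feature routed to the classifier of its ground-truth label on the source domain or its pseudo label on the target domain. The module names, feature dimension, and the simple per-instance confidence weighting (a crude stand-in for the paper's crossed adaptive weighting mechanism) are assumptions for illustration, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity in the forward pass,
    negated (and scaled) gradient in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class DomainClassifierBank(nn.Module):
    """Illustrative bank of small per-class domain classifiers.

    Each ROI feature is routed to the classifier of its category, so
    source/target alignment is performed class by class instead of with
    a single global domain classifier."""

    def __init__(self, num_classes, feat_dim=256, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        self.bank = nn.ModuleList([
            nn.Sequential(nn.Linear(feat_dim, 128), nn.ReLU(),
                          nn.Linear(128, 1))  # source-vs-target logit
            for _ in range(num_classes)
        ])

    def forward(self, inst_feats, class_ids, domain_labels, confidences=None):
        """inst_feats: (N, feat_dim) ROI features from both domains.
        class_ids: (N,) ground-truth labels (source) or pseudo labels (target).
        domain_labels: (N,) 0 for source, 1 for target.
        confidences: optional (N,) per-instance weights, e.g. pseudo-label
        scores, used here as a stand-in for crossed adaptive weighting."""
        feats = GradReverse.apply(inst_feats, self.lambd)
        losses = []
        for i in range(feats.size(0)):
            logit = self.bank[int(class_ids[i])](feats[i]).squeeze()
            loss = F.binary_cross_entropy_with_logits(
                logit, domain_labels[i].float())
            if confidences is not None:
                loss = loss * confidences[i]
            losses.append(loss)
        return torch.stack(losses).mean()
```

In a full training loop this adversarial loss would be added, with a trade-off weight, to the detector's supervised losses on source images and its pseudo-label losses on target images.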
Related papers
- DATR: Unsupervised Domain Adaptive Detection Transformer with Dataset-Level Adaptation and Prototypical Alignment [7.768332621617199]
We introduce a strong DETR-based detector named Domain Adaptive detection TRansformer (DATR) for unsupervised domain adaptation of object detection.
Our proposed DATR incorporates a mean-teacher based self-training framework, utilizing pseudo-labels generated by the teacher model to further mitigate domain bias.
Experiments demonstrate superior performance and generalization capabilities of our proposed DATR in multiple domain adaptation scenarios.
arXiv Detail & Related papers (2024-05-20T03:48:45Z)
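Both the main paper's mean teacher paradigm and DATR's mean-teacher self-training rely on a slowly updated teacher that produces pseudo labels for the unlabeled target domain. The sketch below shows the two ingredients such frameworks usually share; the momentum, the score threshold, and the torchvision-style detector output format (a list of dicts with `boxes`, `labels`, `scores`) are assumptions, not details taken from either paper.

```python
import torch


@torch.no_grad()
def ema_update(teacher, student, momentum=0.999):
    """Mean-teacher rule: the teacher is an exponential moving average of
    the student's weights (the momentum value is illustrative)."""
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(momentum).add_(p_s.detach(), alpha=1.0 - momentum)


@torch.no_grad()
def generate_pseudo_labels(teacher, target_images, score_thresh=0.8):
    """Run the teacher on target-domain images and keep only confident
    detections as pseudo ground truth for the student."""
    teacher.eval()
    outputs = teacher(target_images)  # assumed: one dict per image
    pseudo = []
    for out in outputs:
        keep = out["scores"] >= score_thresh
        pseudo.append({"boxes": out["boxes"][keep],
                       "labels": out["labels"][keep]})
    return pseudo
```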
- Domain Adaptation Using Pseudo Labels [16.79672078512152]
In the absence of labeled target data, unsupervised domain adaptation approaches seek to align the marginal distributions of the source and target domains.
We deploy a pretrained network to determine accurate labels for the target domain using a multi-stage pseudo-label refinement procedure.
Our results on multiple datasets demonstrate the effectiveness of our simple procedure in comparison with complex state-of-the-art techniques.
arXiv Detail & Related papers (2024-02-09T22:15:11Z)
- Joint Distribution Alignment via Adversarial Learning for Domain Adaptive Object Detection [11.262560426527818]
Unsupervised domain adaptive object detection aims to adapt a well-trained detector from its original source domain with rich labeled data to a new target domain with unlabeled data.
Recently, mainstream approaches have performed this task through adversarial learning, yet they still suffer from two limitations.
We propose a joint adaptive detection framework (JADF) to address the above challenges.
arXiv Detail & Related papers (2021-09-19T00:27:08Z)
- AFAN: Augmented Feature Alignment Network for Cross-Domain Object Detection [90.18752912204778]
Unsupervised domain adaptation for object detection is a challenging problem with many real-world applications.
We propose a novel augmented feature alignment network (AFAN) which integrates intermediate domain image generation and domain-adversarial training.
Our approach significantly outperforms the state-of-the-art methods on standard benchmarks for both similar and dissimilar domain adaptations.
arXiv Detail & Related papers (2021-06-10T05:01:20Z)
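AFAN combines intermediate-domain image generation with domain-adversarial training. Its generator is learned, so the snippet below is only a generic stand-in for the idea: a mixup-style pixel blend of a source and a target batch that yields in-between images on which a domain discriminator can additionally be trained. The blending scheme and its parameter are assumptions, not AFAN's method.

```python
import torch


def blend_intermediate(source_imgs, target_imgs, alpha=0.5):
    """Mixup-style stand-in for intermediate-domain image generation:
    a random convex combination of a source batch and a target batch
    (AFAN's actual generator is learned, not a fixed pixel blend)."""
    assert source_imgs.shape == target_imgs.shape
    lam = torch.distributions.Beta(alpha, alpha).sample()
    return lam * source_imgs + (1.0 - lam) * target_imgs
```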
- Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation [85.6961770631173]
In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them.
We propose a novel approach called Cross-domain Adaptive Clustering to address this problem.
arXiv Detail & Related papers (2021-04-19T16:07:32Z)
- Instance Level Affinity-Based Transfer for Unsupervised Domain Adaptation [74.71931918541748]
We propose an instance-affinity-based criterion for source-to-target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
arXiv Detail & Related papers (2021-04-03T01:33:14Z)
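ILA-DA drives alignment with a multi-sample contrastive loss over similar and dissimilar source/target instances. The exact criterion is specific to that paper; the sketch below is a generic multi-positive (InfoNCE-style) contrastive loss over instance features and their (pseudo-)labels, included only to make the idea concrete. The temperature and the masking scheme are assumptions.

```python
import torch
import torch.nn.functional as F


def multi_positive_contrastive_loss(feats, labels, temperature=0.1):
    """Generic multi-sample contrastive loss: for each anchor, every other
    instance with the same (pseudo-)label is a positive and the rest are
    negatives. feats: (N, D), labels: (N,)."""
    feats = F.normalize(feats, dim=1)
    sim = feats @ feats.t() / temperature              # (N, N) similarities
    n = feats.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=feats.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    # log-softmax over all non-self pairs, then average over the positives
    sim = sim.masked_fill(self_mask, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    mean_pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_count
    loss = -mean_pos_log_prob
    return loss[pos_mask.any(dim=1)].mean()  # anchors with at least one positive
```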
- Your Classifier can Secretly Suffice Multi-Source Domain Adaptation [72.47706604261992]
Multi-Source Domain Adaptation (MSDA) deals with the transfer of task knowledge from multiple labeled source domains to an unlabeled target domain.
We present a different perspective on MSDA, wherein deep models are observed to implicitly align the domains under label supervision.
arXiv Detail & Related papers (2021-03-20T12:44:13Z)
- Cross-domain Detection via Graph-induced Prototype Alignment [114.8952035552862]
We propose a Graph-induced Prototype Alignment (GPA) framework to seek category-level domain alignment.
In addition, to alleviate the negative effect of class imbalance on domain adaptation, we design a Class-reweighted Contrastive Loss.
Our approach outperforms existing methods by a remarkable margin.
arXiv Detail & Related papers (2020-03-28T17:46:55Z)
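GPA seeks category-level alignment by aligning graph-induced class prototypes across domains, with a class-reweighted contrastive loss against class imbalance. The graph construction is paper-specific, so the sketch below shows only the basic prototype-alignment idea under simplifying assumptions: per-class mean features in each domain pulled together with an L2 penalty (target labels would be pseudo labels in practice).

```python
import torch


def prototype_alignment_loss(src_feats, src_labels, tgt_feats, tgt_labels,
                             num_classes):
    """Pull per-class mean features (prototypes) of the two domains together.
    src_feats/tgt_feats: (Ns, D)/(Nt, D); labels are class indices.
    Illustrative only; no graph-based prototype aggregation is modeled."""
    loss, matched = src_feats.new_zeros(()), 0
    for c in range(num_classes):
        src_c = src_feats[src_labels == c]
        tgt_c = tgt_feats[tgt_labels == c]
        if len(src_c) == 0 or len(tgt_c) == 0:
            continue  # class absent in one of the mini-batches
        loss = loss + (src_c.mean(dim=0) - tgt_c.mean(dim=0)).pow(2).sum()
        matched += 1
    return loss / max(matched, 1)
```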
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.