Enhancing cross-domain detection: adaptive class-aware contrastive transformer
- URL: http://arxiv.org/abs/2401.13264v1
- Date: Wed, 24 Jan 2024 07:11:05 GMT
- Title: Enhancing cross-domain detection: adaptive class-aware contrastive transformer
- Authors: Ziru Zeng, Yue Ding, Hongtao Lu
- Abstract summary: Insufficient labels in the target domain exacerbate issues of class imbalance and model performance degradation.
We propose a class-aware cross-domain detection transformer based on adversarial learning and the mean-teacher framework.
- Score: 15.666766743738531
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, the detection transformer has gained substantial attention for its
inherent minimal post-processing requirement. However, this paradigm relies on
abundant training data, yet in the context of cross-domain adaptation,
insufficient labels in the target domain exacerbate issues of class imbalance
and model performance degradation. To address these challenges, we propose a
novel class-aware cross-domain detection transformer based on adversarial
learning and the mean-teacher framework. First, considering the inconsistencies
between the classification and regression tasks, we introduce an IoU-aware
prediction branch and exploit the consistency of classification and
localization scores to filter and reweight pseudo labels. Second, we devise a
dynamic category threshold refinement to adaptively manage model confidence.
Third, to alleviate class imbalance, an instance-level class-aware contrastive
learning module is presented to encourage the generation of discriminative
features for each class, particularly benefiting minority classes.
Experimental results across diverse domain-adaptive scenarios validate our
method's effectiveness in improving performance and alleviating class
imbalance, outperforming state-of-the-art transformer-based methods.
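The abstract's first two components can be illustrated with a minimal sketch. This is not the authors' code: the consistency measure (geometric mean of classification and IoU scores), the momentum-based threshold update, and all function names are assumptions made for illustration only.

```python
import numpy as np

def consistency_scores(cls_scores, iou_scores):
    # Hypothetical consistency measure: the geometric mean of the
    # classification confidence and the predicted IoU, so boxes that are
    # confident but poorly localized (or vice versa) score low.
    return np.sqrt(cls_scores * iou_scores)

def refine_class_thresholds(thresholds, scores, labels, num_classes,
                            momentum=0.9):
    # Dynamic category threshold refinement (sketch): move each class's
    # threshold toward the mean consistency score of its current
    # predictions, so well-learned classes get a stricter filter while
    # uncertain (often minority) classes keep a lower bar.
    new_thresholds = thresholds.copy()
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            new_thresholds[c] = (momentum * thresholds[c]
                                 + (1 - momentum) * scores[mask].mean())
    return new_thresholds

def filter_and_reweight(cls_scores, iou_scores, labels, thresholds):
    # Keep pseudo boxes whose consistency score exceeds their class's
    # threshold; reuse the score itself as a soft loss weight.
    scores = consistency_scores(cls_scores, iou_scores)
    keep = scores >= thresholds[labels]
    return keep, scores * keep
```

For example, a box with classification score 0.8 but predicted IoU 0.3 yields a consistency score of about 0.49 and is rejected under a class threshold of 0.5, even though its raw classification confidence would have passed.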
Related papers
- Adaptive Cascading Network for Continual Test-Time Adaptation [12.718826132518577]
We study the problem of continual test-time adaptation, where the goal is to adapt a source pre-trained model to a sequence of unlabelled target domains at test time.
Existing methods on test-time training suffer from several limitations.
arXiv Detail & Related papers (2024-07-17T01:12:57Z)
- Toward Multi-class Anomaly Detection: Exploring Class-aware Unified Model against Inter-class Interference [67.36605226797887]
We introduce a Multi-class Implicit Neural representation Transformer for unified Anomaly Detection (MINT-AD)
By learning the multi-class distributions, the model generates class-aware query embeddings for the transformer decoder.
MINT-AD can project category and position information into a feature embedding space, further supervised by classification and prior probability loss functions.
arXiv Detail & Related papers (2024-03-21T08:08:31Z)
- Robust Class-Conditional Distribution Alignment for Partial Domain Adaptation [0.7892577704654171]
Unwanted samples from private source categories in the learning objective of a partial domain adaptation setup can lead to negative transfer and reduce classification performance.
Existing methods, such as re-weighting or aggregating target predictions, are vulnerable to this issue.
Our proposed approach seeks to overcome these limitations by delving deeper than just the first-order moments to derive distinct and compact categorical distributions.
arXiv Detail & Related papers (2023-10-18T15:49:46Z)
- Cluster-Guided Semi-Supervised Domain Adaptation for Imbalanced Medical Image Classification [10.92984910426756]
We develop a semi-supervised domain adaptation method that is robust to class-imbalanced situations.
For robustness, we propose a weakly-supervised clustering pipeline to obtain high-purity clusters.
The proposed method showed state-of-the-art performance in the experiment using severely class-imbalanced pathological image patches.
arXiv Detail & Related papers (2023-03-02T14:07:36Z)
- MemSAC: Memory Augmented Sample Consistency for Large Scale Unsupervised Domain Adaptation [71.4942277262067]
We propose MemSAC, which exploits sample level similarity across source and target domains to achieve discriminative transfer.
We provide in-depth analysis and insights into the effectiveness of MemSAC.
arXiv Detail & Related papers (2022-07-25T17:55:28Z)
- General Incremental Learning with Domain-aware Categorical Representations [37.68376996568006]
We develop a novel domain-aware continual learning method based on the EM framework.
Specifically, we introduce a flexible class representation based on the von Mises-Fisher mixture model to capture the intra-class structure.
We design a bi-level balanced memory to cope with data imbalances within and across classes, combined with a distillation loss to achieve a better inter- and intra-class stability-plasticity trade-off.
arXiv Detail & Related papers (2022-04-08T13:57:33Z)
- AFAN: Augmented Feature Alignment Network for Cross-Domain Object Detection [90.18752912204778]
Unsupervised domain adaptation for object detection is a challenging problem with many real-world applications.
We propose a novel augmented feature alignment network (AFAN) which integrates intermediate domain image generation and domain-adversarial training.
Our approach significantly outperforms the state-of-the-art methods on standard benchmarks for both similar and dissimilar domain adaptations.
arXiv Detail & Related papers (2021-06-10T05:01:20Z)
- Gradient Regularized Contrastive Learning for Continual Domain Adaptation [86.02012896014095]
We study the problem of continual domain adaptation, where the model is presented with a labelled source domain and a sequence of unlabelled target domains.
We propose Gradient Regularized Contrastive Learning (GRCL) to overcome these obstacles.
Experiments on Digits, DomainNet and Office-Caltech benchmarks demonstrate the strong performance of our approach.
arXiv Detail & Related papers (2021-03-23T04:10:42Z)
- Selective Pseudo-Labeling with Reinforcement Learning for Semi-Supervised Domain Adaptation [116.48885692054724]
We propose a reinforcement learning based selective pseudo-labeling method for semi-supervised domain adaptation.
We develop a deep Q-learning model to select both accurate and representative pseudo-labeled instances.
Our proposed method is evaluated on several benchmark datasets for SSDA and demonstrates superior performance to all comparison methods.
arXiv Detail & Related papers (2020-12-07T03:37:38Z)
- Target Consistency for Domain Adaptation: when Robustness meets Transferability [8.189696720657247]
Learning invariant representations has been successfully applied to reconcile a source and a target domain in unsupervised domain adaptation.
We show that the cluster assumption is violated in the target domain despite being maintained in the source domain.
Our new approach results in a significant improvement, on both image classification and segmentation benchmarks.
arXiv Detail & Related papers (2020-06-25T09:13:00Z)
- Contradictory Structure Learning for Semi-supervised Domain Adaptation [67.89665267469053]
Current adversarial adaptation methods attempt to align the cross-domain features.
Two challenges remain unsolved: 1) the conditional distribution mismatch and 2) the bias of the decision boundary towards the source domain.
We propose a novel framework for semi-supervised domain adaptation by unifying the learning of opposite structures.
arXiv Detail & Related papers (2020-02-06T22:58:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.