OurDB: Ouroboric Domain Bridging for Multi-Target Domain Adaptive Semantic Segmentation
- URL: http://arxiv.org/abs/2403.11582v1
- Date: Mon, 18 Mar 2024 08:55:48 GMT
- Title: OurDB: Ouroboric Domain Bridging for Multi-Target Domain Adaptive Semantic Segmentation
- Authors: Seungbeom Woo, Geonwoo Baek, Taehoon Kim, Jaemin Na, Joong-won Hwang, Wonjun Hwang
- Abstract summary: Multi-target domain adaptation (MTDA) for semantic segmentation poses a significant challenge, as it involves multiple target domains with varying distributions.
Previous MTDA approaches typically employ multiple teacher architectures, where each teacher specializes in one target domain to simplify the task.
We propose an ouroboric domain bridging (OurDB) framework, offering an efficient solution to the MTDA problem using a single teacher architecture.
- Score: 8.450397069717727
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-target domain adaptation (MTDA) for semantic segmentation poses a significant challenge, as it involves multiple target domains with varying distributions. The goal of MTDA is to minimize the domain discrepancies among a single source and multi-target domains, aiming to train a single model that excels across all target domains. Previous MTDA approaches typically employ multiple teacher architectures, where each teacher specializes in one target domain to simplify the task. However, these architectures hinder the student model from fully assimilating comprehensive knowledge from all target-specific teachers and escalate training costs with increasing target domains. In this paper, we propose an ouroboric domain bridging (OurDB) framework, offering an efficient solution to the MTDA problem using a single teacher architecture. This framework dynamically cycles through multiple target domains, aligning each domain individually to restrain the biased alignment problem, and utilizes Fisher information to minimize the forgetting of knowledge from previous target domains. We also propose a context-guided class-wise mixup (CGMix) that leverages contextual information tailored to diverse target contexts in MTDA. Experimental evaluations conducted on four urban driving datasets (i.e., GTA5, Cityscapes, IDD, and Mapillary) demonstrate the superiority of our method over existing state-of-the-art approaches.
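Two ingredients named in the abstract, the Fisher-information anti-forgetting term and the class-wise mixup, can be sketched in miniature. The following is an illustrative numpy sketch, not the authors' implementation: the forgetting penalty follows the elastic-weight-consolidation style that the abstract's Fisher term resembles, and the mixup pastes one class's pixels between images in the spirit of ClassMix (CGMix's context guidance is omitted). All function names and toy shapes are assumptions for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def per_sample_grads(w, xs, ys):
    # gradients of a logistic negative log-likelihood, one row per sample
    p = sigmoid(xs @ w)
    return (p - ys)[:, None] * xs

def diagonal_fisher(w, xs, ys):
    # empirical diagonal Fisher: mean squared per-sample gradient
    return (per_sample_grads(w, xs, ys) ** 2).mean(axis=0)

def forgetting_penalty(w, w_prev, fisher, lam=1.0):
    # quadratic penalty anchoring parameters that were important
    # for the previously visited target domain
    return 0.5 * lam * float(np.sum(fisher * (w - w_prev) ** 2))

def classwise_mixup(img_a, lbl_a, img_b, lbl_b, cls):
    # paste pixels of class `cls` from image A onto image B
    mask = lbl_a == cls
    return np.where(mask, img_a, img_b), np.where(mask, lbl_a, lbl_b)

rng = np.random.default_rng(0)
xs = rng.normal(size=(16, 3))
ys = (xs[:, 0] > 0).astype(float)
w_prev = rng.normal(size=3)
F = diagonal_fisher(w_prev, xs, ys)
print(forgetting_penalty(w_prev + 0.1, w_prev, F))  # grows as w drifts from w_prev
```

In a cyclic schedule over target domains, the penalty would be recomputed with the Fisher estimate of the most recently aligned domain, so that aligning the next domain does not erase what was just learned.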
Related papers
- Decoupled Training: Return of Frustratingly Easy Multi-Domain Learning [20.17925272562433]
Multi-domain learning aims to train a model with minimal average risk across multiple overlapping but non-identical domains.
We propose Decoupled Training (D-Train) as a frustratingly easy and hyperparameter-free multi-domain learning method.
D-Train is a tri-phase general-to-specific training strategy that first pre-trains on all domains to warm up a root model, then post-trains on each domain by splitting into multi-heads, and finally fine-tunes the heads by fixing the backbone.
arXiv Detail & Related papers (2023-09-41T04:06:41Z)
- ML-BPM: Multi-teacher Learning with Bidirectional Photometric Mixing for Open Compound Domain Adaptation in Semantic Segmentation [78.19743899703052]
Open compound domain adaptation (OCDA) considers the target domain as the compound of multiple unknown homogeneous subdomains.
We introduce a multi-teacher framework with bidirectional photometric mixing to adapt to every target subdomain.
We conduct an adaptive distillation to learn a student model and apply consistency regularization to improve the student generalization.
arXiv Detail & Related papers (2022-07-19T03:30:48Z)
- Multi-Head Distillation for Continual Unsupervised Domain Adaptation in Semantic Segmentation [38.10483890861357]
This work focuses on a novel framework for learning UDA, continuous UDA, in which models operate on multiple target domains discovered sequentially.
We propose MuHDi, for Multi-Head Distillation, a method that solves the catastrophic forgetting problem, inherent in continual learning tasks.
arXiv Detail & Related papers (2022-04-25T14:03:09Z)
- Multi-Source Unsupervised Domain Adaptation via Pseudo Target Domain [0.0]
Multi-source domain adaptation (MDA) aims to transfer knowledge from multiple source domains to an unlabeled target domain.
We propose a novel MDA approach, termed Pseudo Target for MDA (PTMDA).
PTMDA maps each group of source and target domains into a group-specific subspace using adversarial learning with a metric constraint.
We show that PTMDA as a whole can reduce the target error bound and leads to a better approximation of the target risk in MDA settings.
arXiv Detail & Related papers (2022-02-22T08:37:16Z)
- META: Mimicking Embedding via oThers' Aggregation for Generalizable Person Re-identification [68.39849081353704]
Domain generalizable (DG) person re-identification (ReID) aims to test across unseen domains without access to the target domain data at training time.
This paper presents a new approach called Mimicking Embedding via oThers' Aggregation (META) for DG ReID.
arXiv Detail & Related papers (2021-12-16T08:06:50Z)
- Multi-Target Adversarial Frameworks for Domain Adaptation in Semantic Segmentation [32.39557675340562]
We address the task of unsupervised domain adaptation (UDA) for semantic segmentation in the presence of multiple target domains.
We introduce two adversarial frameworks: (i) multi-discriminator, which explicitly aligns each target domain to its counterparts, and (ii) multi-target knowledge transfer, which learns a target-agnostic model.
In all tested scenarios, our approaches consistently outperform baselines, setting competitive standards for the novel task.
arXiv Detail & Related papers (2021-08-16T08:36:10Z)
- Multi-Target Domain Adaptation with Collaborative Consistency Learning [105.7615147382486]
We propose a collaborative learning framework to achieve unsupervised multi-target domain adaptation.
The proposed method can effectively exploit the rich structured information contained in both the labeled source domain and the multiple unlabeled target domains.
arXiv Detail & Related papers (2021-06-07T08:36:20Z)
- Curriculum Graph Co-Teaching for Multi-Target Domain Adaptation [78.28390172958643]
We identify two key aspects that can help alleviate multiple domain shifts in multi-target domain adaptation (MTDA).
We propose Curriculum Graph Co-Teaching (CGCT) that uses a dual classifier head, with one of them being a graph convolutional network (GCN) which aggregates features from similar samples across the domains.
When the domain labels are available, we propose Domain-aware Curriculum Learning (DCL), a sequential adaptation strategy that first adapts on the easier target domains, followed by the harder ones.
arXiv Detail & Related papers (2021-04-01T23:41:41Z)
- Cluster, Split, Fuse, and Update: Meta-Learning for Open Compound Domain Adaptive Semantic Segmentation [102.42638795864178]
We propose a principled meta-learning based approach to OCDA for semantic segmentation.
We cluster the target domain into multiple sub-target domains by image styles, extracted in an unsupervised manner.
A meta-learner is thereafter deployed to learn to fuse sub-target domain-specific predictions, conditioned upon the style code.
We learn to update the model online via the model-agnostic meta-learning (MAML) algorithm, further improving generalization.
arXiv Detail & Related papers (2020-12-15T13:21:54Z)
- Unsupervised Multi-Target Domain Adaptation Through Knowledge Distillation [14.088776449829345]
Unsupervised domain adaptation (UDA) seeks to alleviate the problem of domain shift between the distributions of labeled source data and unlabeled target data.
In this paper, we propose a novel unsupervised MTDA approach to train a CNN that can generalize well across multiple target domains.
arXiv Detail & Related papers (2020-07-14T14:59:45Z)
- MADAN: Multi-source Adversarial Domain Aggregation Network for Domain Adaptation [58.38749495295393]
Domain adaptation aims to learn a transferable model to bridge the domain shift between one labeled source domain and another sparsely labeled or unlabeled target domain.
Recent multi-source domain adaptation (MDA) methods do not consider the pixel-level alignment between sources and target.
We propose a novel MDA framework to address these challenges.
arXiv Detail & Related papers (2020-02-19T21:22:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.