Bi-level Unbalanced Optimal Transport for Partial Domain Adaptation
- URL: http://arxiv.org/abs/2506.08020v1
- Date: Mon, 19 May 2025 13:40:40 GMT
- Title: Bi-level Unbalanced Optimal Transport for Partial Domain Adaptation
- Authors: Zi-Ying Chen, Chuan-Xian Ren, Hong Yan
- Abstract summary: The partial domain adaptation problem requires aligning cross-domain samples while distinguishing the outlier classes for accurate knowledge transfer. We propose a Bi-level Unbalanced Optimal Transport model to simultaneously characterize the sample-wise and class-wise relations.
- Score: 15.32873543470471
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The partial domain adaptation (PDA) problem requires aligning cross-domain samples while distinguishing the outlier classes for accurate knowledge transfer. The widely used weighting framework tries to address the outlier classes by introducing a reweighted source domain with a label distribution similar to that of the target domain. However, the empirical modeling of weights can only characterize the sample-wise relations, which leads to insufficient exploration of cluster structures, and the weights can be sensitive to inaccurate predictions, causing confusion among the outlier classes. To tackle these issues, we propose a Bi-level Unbalanced Optimal Transport (BUOT) model to simultaneously characterize the sample-wise and class-wise relations in a unified transport framework. Specifically, a cooperation mechanism between sample-level and class-level transport is introduced, where the sample-level transport provides essential structure information for the class-level knowledge transfer, while the class-level transport supplies discriminative information for the outlier identification. The bi-level transport plan provides guidance for the alignment process. By incorporating the label-aware transport cost, the local transport structure is ensured and a fast computation formulation is derived to improve the efficiency. Extensive experiments on benchmark datasets validate the competitiveness of BUOT.
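The abstract builds on unbalanced optimal transport, where the hard marginal constraints of classical OT are relaxed by KL penalties so that mass can be created or destroyed, which is what lets outlier source classes be down-weighted. The following is a minimal sketch of that building block only, not of the paper's bi-level BUOT model: an entropic unbalanced Sinkhorn loop with KL-relaxed marginals. The function name, parameters, and toy cost matrix are illustrative choices, not taken from the paper.

```python
import numpy as np

def unbalanced_sinkhorn(a, b, C, eps=0.1, rho=1.0, n_iter=500):
    """Entropic unbalanced OT with KL-relaxed marginals.

    Approximately minimizes  <P, C> + eps * H(P)
      + rho * KL(P @ 1 || a) + rho * KL(P.T @ 1 || b)
    via damped Sinkhorn iterations; returns the transport plan P.
    """
    K = np.exp(-C / eps)            # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    tau = rho / (rho + eps)         # damping exponent from the KL relaxation
    for _ in range(n_iter):
        u = (a / (K @ v)) ** tau    # relaxed row-marginal update
        v = (b / (K.T @ u)) ** tau  # relaxed column-marginal update
    return u[:, None] * K * v[None, :]

# Toy example: 3 source points, 2 target points, uniform masses.
C = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [0.5, 0.5]])
a = np.ones(3) / 3
b = np.ones(2) / 2
P = unbalanced_sinkhorn(a, b, C)
```

Because the marginal constraints are soft, `P`'s row sums only approximately match `a`; increasing `rho` tightens them toward the balanced OT solution, while lowering it lets mass on costly (outlier-like) rows shrink.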
Related papers
- DART: Dual Adaptive Refinement Transfer for Open-Vocabulary Multi-Label Recognition [59.203152078315235]
Open-Vocabulary Multi-Label Recognition (OV-MLR) aims to identify multiple seen and unseen object categories within an image. Vision-Language Pre-training models offer a strong open-vocabulary foundation, but struggle with fine-grained localization under weak supervision. We propose the Dual Adaptive Refinement Transfer (DART) framework to overcome these limitations.
arXiv Detail & Related papers (2025-08-07T17:22:33Z) - A Class-aware Optimal Transport Approach with Higher-Order Moment Matching for Unsupervised Domain Adaptation [33.712557972990034]
Unsupervised domain adaptation (UDA) aims to transfer knowledge from a labeled source domain to an unlabeled target domain.
We introduce a novel approach called class-aware optimal transport (OT), which measures the OT distance between the target distribution and a distribution over the source class-conditional distributions.
arXiv Detail & Related papers (2024-01-29T08:27:31Z) - Chaos to Order: A Label Propagation Perspective on Source-Free Domain Adaptation [8.27771856472078]
We present Chaos to Order (CtO), a novel approach for source-free domain adaptation (SFDA).
CtO strives to constrain semantic credibility and propagate label information among target subpopulations.
Empirical evidence demonstrates that CtO outperforms the state of the art on three public benchmarks.
arXiv Detail & Related papers (2023-01-20T03:39:35Z) - Adaptive Distribution Calibration for Few-Shot Learning with Hierarchical Optimal Transport [78.9167477093745]
We propose a novel distribution calibration method by learning the adaptive weight matrix between novel samples and base classes.
Experimental results on standard benchmarks demonstrate that our proposed plug-and-play model outperforms competing approaches.
arXiv Detail & Related papers (2022-10-09T02:32:57Z) - InfoOT: Information Maximizing Optimal Transport [58.72713603244467]
InfoOT is an information-theoretic extension of optimal transport.
It maximizes the mutual information between domains while minimizing geometric distances.
This formulation yields a new projection method that is robust to outliers and generalizes to unseen samples.
arXiv Detail & Related papers (2022-10-06T18:55:41Z) - MemSAC: Memory Augmented Sample Consistency for Large Scale Unsupervised Domain Adaptation [71.4942277262067]
We propose MemSAC, which exploits sample level similarity across source and target domains to achieve discriminative transfer.
We provide in-depth analysis and insights into the effectiveness of MemSAC.
arXiv Detail & Related papers (2022-07-25T17:55:28Z) - Coupling Adversarial Learning with Selective Voting Strategy for Distribution Alignment in Partial Domain Adaptation [6.5991141403378215]
The partial domain adaptation setup caters to a realistic scenario by relaxing the identical label set assumption.
We devise a mechanism for strategic selection of highly-confident target samples essential for the estimation of class-native weights.
arXiv Detail & Related papers (2022-07-17T11:34:56Z) - Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of SSL and DA.
arXiv Detail & Related papers (2021-12-12T06:11:16Z) - Hierarchical Optimal Transport for Unsupervised Domain Adaptation [0.0]
We propose a novel approach for unsupervised domain adaptation that relates optimal transport, learning of probability measures, and unsupervised learning.
The proposed approach, HOT-DA, is based on a hierarchical formulation of optimal transport.
Experiments on a toy dataset with controllable complexity and two challenging visual adaptation datasets show the superiority of the proposed approach over the state-of-the-art.
arXiv Detail & Related papers (2021-12-03T18:37:23Z) - Instance Level Affinity-Based Transfer for Unsupervised Domain Adaptation [74.71931918541748]
We propose an instance affinity based criterion for source to target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
arXiv Detail & Related papers (2021-04-03T01:33:14Z) - CSCL: Critical Semantic-Consistent Learning for Unsupervised Domain Adaptation [42.226842513334184]
We develop a new Critical Semantic-Consistent Learning model, which mitigates the discrepancy of both domain-wise and category-wise distributions.
Specifically, a critical transfer based adversarial framework is designed to highlight transferable domain-wise knowledge while neglecting untransferable knowledge.
arXiv Detail & Related papers (2020-08-24T14:12:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.