Tackling Long-Tailed Category Distribution Under Domain Shifts
- URL: http://arxiv.org/abs/2207.10150v1
- Date: Wed, 20 Jul 2022 19:07:46 GMT
- Title: Tackling Long-Tailed Category Distribution Under Domain Shifts
- Authors: Xiao Gu, Yao Guo, Zeju Li, Jianing Qiu, Qi Dou, Yuxuan Liu, Benny Lo,
Guang-Zhong Yang
- Abstract summary: Existing approaches cannot handle the scenario where both issues exist.
We designed three novel core functional blocks including Distribution Calibrated Classification Loss, Visual-Semantic Mapping and Semantic-Similarity Guided Augmentation.
Two new datasets were proposed for this problem, named AWA2-LTS and ImageNet-LTS.
- Score: 50.21255304847395
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning models fail to perform well in real-world applications
when 1) the category distribution P(Y) of the training dataset suffers from a
long-tailed distribution and 2) the test data is drawn from different
conditional distributions P(X|Y). Existing approaches cannot handle the
scenario where both issues coexist, which is nonetheless common in real-world
applications. In this study, we took a step forward and investigated the problem
of long-tailed classification under domain shifts. We designed three novel core
functional blocks: Distribution Calibrated Classification Loss,
Visual-Semantic Mapping, and Semantic-Similarity Guided Augmentation.
Furthermore, we adopted a meta-learning framework that integrates these three
blocks to improve domain generalization on unseen target domains. We also
constructed two new datasets for this problem, AWA2-LTS and ImageNet-LTS, and
evaluated our method on both. Extensive experimental results demonstrate that
our proposed method achieves superior performance over state-of-the-art
long-tailed and domain generalization approaches as well as their combinations.
Source code and datasets can be found at our project page:
https://xiaogu.site/LTDS.
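The abstract names the Distribution Calibrated Classification Loss without detailing it. As a minimal, hedged sketch of what calibrating a classification loss to a long-tailed P(Y) can look like, the snippet below adds a log-class-prior (logit-adjustment) offset to the logits before cross-entropy. The class `PriorCalibratedLoss`, the example class counts, and the temperature `tau` are illustrative assumptions, not the authors' released implementation (see the project page above for that).

```python
# Illustrative sketch only: prior-calibrated cross-entropy for long-tailed P(Y).
import torch
import torch.nn as nn
import torch.nn.functional as F


class PriorCalibratedLoss(nn.Module):
    """Cross-entropy whose logits are offset by scaled log class priors."""

    def __init__(self, class_counts: torch.Tensor, tau: float = 1.0):
        super().__init__()
        priors = class_counts / class_counts.sum()          # empirical P(Y)
        # Adding tau * log P(Y) to the logits penalizes head classes and
        # compensates tail classes during training.
        self.register_buffer("log_prior", tau * priors.log())

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        return F.cross_entropy(logits + self.log_prior, targets)


# Usage with hypothetical counts for a 5-class long-tailed training set.
counts = torch.tensor([1000.0, 300.0, 80.0, 20.0, 5.0])
criterion = PriorCalibratedLoss(counts, tau=1.0)
logits = torch.randn(8, 5)
labels = torch.randint(0, 5, (8,))
loss = criterion(logits, labels)
```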
Related papers
- SALUDA: Surface-based Automotive Lidar Unsupervised Domain Adaptation [62.889835139583965]
We introduce an unsupervised auxiliary task of learning an implicit underlying surface representation simultaneously on source and target data.
As both domains share the same latent representation, the model is forced to accommodate discrepancies between the two sources of data.
Our experiments demonstrate that our method achieves better performance than the current state of the art, in both real-to-real and synthetic-to-real scenarios.
arXiv Detail & Related papers (2023-04-06T17:36:23Z) - Semi-supervised Domain Adaptive Structure Learning [72.01544419893628]
Semi-supervised domain adaptation (SSDA) is a challenging problem requiring methods to overcome both 1) overfitting towards poorly annotated data and 2) distribution shift across domains.
We introduce an adaptive structure learning method to regularize the cooperation of semi-supervised learning (SSL) and domain adaptation (DA).
arXiv Detail & Related papers (2021-12-12T06:11:16Z) - Adaptive Hierarchical Dual Consistency for Semi-Supervised Left Atrium
Segmentation on Cross-Domain Data [8.645556125521246]
Generalising semi-supervised learning to cross-domain data is of high importance for improving model robustness.
The AHDC consists of a Bidirectional Adversarial Inference module (BAI) and a Hierarchical Dual Consistency learning module (HDC).
We demonstrate the performance of our proposed AHDC on four 3D late gadolinium enhancement cardiac MR (LGE-CMR) datasets from different centres and a 3D CT dataset.
arXiv Detail & Related papers (2021-09-17T02:15:10Z) - Source-Free Open Compound Domain Adaptation in Semantic Segmentation [99.82890571842603]
In SF-OCDA, only the source pre-trained model and the target data are available to learn the target model.
We propose the Cross-Patch Style Swap (CPSS) to diversify samples with various patch styles at the feature level.
Our method produces state-of-the-art results on the C-Driving dataset.
arXiv Detail & Related papers (2021-06-07T08:38:41Z) - Semi-supervised Domain Adaptation based on Dual-level Domain Mixing for
Semantic Segmentation [34.790169990156684]
We focus on a more practical setting of semi-supervised domain adaptation (SSDA) where both a small set of labeled target data and large amounts of labeled source data are available.
Two kinds of data mixing methods are proposed to reduce the domain gap at the region level and the sample level, respectively.
We obtain two complementary domain-mixed teachers based on dual-level mixed data from holistic and partial views, respectively.
arXiv Detail & Related papers (2021-03-08T12:33:17Z) - Domain Adaptation in LiDAR Semantic Segmentation by Aligning Class
Distributions [9.581605678437032]
This work addresses the problem of unsupervised domain adaptation for LiDAR semantic segmentation models.
Our approach combines novel ideas on top of the current state-of-the-art approaches and yields new state-of-the-art results.
arXiv Detail & Related papers (2020-10-23T08:52:15Z) - Cross-Domain Facial Expression Recognition: A Unified Evaluation
Benchmark and Adversarial Graph Learning [85.6386289476598]
We develop a novel adversarial graph representation adaptation (AGRA) framework for cross-domain holistic-local feature co-adaptation.
We conduct extensive and fair evaluations on several popular benchmarks and show that the proposed AGRA framework outperforms previous state-of-the-art methods.
arXiv Detail & Related papers (2020-08-03T15:00:31Z) - Learning to Match Distributions for Domain Adaptation [116.14838935146004]
This paper proposes Learning to Match (L2M) to automatically learn the cross-domain distribution matching.
L2M reduces the inductive bias by using a meta-network to learn the distribution matching loss in a data-driven way.
Experiments on public datasets substantiate the superiority of L2M over state-of-the-art methods.
arXiv Detail & Related papers (2020-07-17T03:26:13Z)