Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization
- URL: http://arxiv.org/abs/2106.04923v2
- Date: Mon, 21 Aug 2023 09:56:25 GMT
- Title: Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization
- Authors: Léo Andeol, Yusei Kawakami, Yuichiro Wada, Takafumi Kanamori,
Klaus-Robert Müller, Grégoire Montavon
- Abstract summary: Domain shifts in the training data are common in practical applications of machine learning.
Ideally, an ML model should work well independently of these shifts, for example, by learning a domain-invariant representation.
Common ML losses do not give strong guarantees on how consistently the ML model performs for different domains.
- Score: 3.382067152367334
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Domain shifts in the training data are common in practical
applications of machine learning; they occur, for instance, when the data
comes from different sources. Ideally, an ML model should work well
independently of these shifts, for example, by learning a domain-invariant
representation. However, common ML losses do not give strong guarantees on
how consistently the ML model performs across domains, in particular,
whether the model performs well on one domain at the expense of another. In
this paper, we build new theoretical foundations for this problem by
contributing a set of mathematical relations between classical losses for
supervised ML and the Wasserstein distance in joint space (i.e.,
representation and output space). We show that classification or regression
losses, when combined with a GAN-type discriminator between domains, form an
upper bound on the true Wasserstein distance between domains. This implies a
more invariant representation and more stable prediction performance across
domains. The theoretical results are corroborated empirically on several
image datasets. Our proposed approach systematically produces the highest
minimum classification accuracy across domains and the most invariant
representation.
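As a concrete illustration of this objective, the sketch below (PyTorch; the
modules enc/clf/dis, the input shapes, and the trade-off weight lam are all
illustrative assumptions) combines a supervised classification loss on two
domains with a GAN-type discriminator that compares the domains in the joint
representation/output space. It shows the type of objective the abstract
describes, not the paper's exact construction.

    import torch
    import torch.nn as nn

    # Shared encoder, task head, and a domain critic acting on the *joint*
    # (representation, output) space; all shapes are placeholder assumptions.
    enc = nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU())
    clf = nn.Linear(128, 10)
    dis = nn.Sequential(nn.Linear(128 + 10, 64), nn.ReLU(), nn.Linear(64, 1))

    ce, bce = nn.CrossEntropyLoss(), nn.BCEWithLogitsLoss()
    opt = torch.optim.Adam(list(enc.parameters()) + list(clf.parameters()), lr=1e-3)
    opt_d = torch.optim.Adam(dis.parameters(), lr=1e-3)
    lam = 1.0  # task/invariance trade-off (assumption, not from the paper)

    def joint(x):
        z = enc(x)
        logits = clf(z)
        return logits, torch.cat([z, torch.softmax(logits, dim=1)], dim=1)

    def train_step(xa, ya, xb, yb):
        # 1) the critic learns to tell domain A from domain B in joint space
        _, ja = joint(xa)
        _, jb = joint(xb)
        d_loss = bce(dis(ja.detach()), torch.ones(len(xa), 1)) \
               + bce(dis(jb.detach()), torch.zeros(len(xb), 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # 2) encoder + classifier minimize the supervised loss on both domains
        #    while pushing the joint distributions together (fooling the critic)
        pa, ja = joint(xa)
        pb, jb = joint(xb)
        loss = ce(pa, ya) + ce(pb, yb) \
             + lam * (bce(dis(ja), torch.zeros(len(xa), 1))
                      + bce(dis(jb), torch.ones(len(xb), 1)))
        opt.zero_grad(); loss.backward(); opt.step()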
Related papers
- Cross-Domain Policy Adaptation by Capturing Representation Mismatch [53.087413751430255]
In reinforcement learning (RL), it is vital to learn effective policies that can be transferred to different domains with dynamics discrepancies.
In this paper, we consider dynamics adaptation settings where there exists dynamics mismatch between the source domain and the target domain.
We perform representation learning only in the target domain and measure the representation deviations on the transitions from the source domain.
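A hedged sketch of one plausible reading of this recipe, in PyTorch: a
transition representation is fit on target-domain data only, and
source-domain rewards are penalized by how far source transitions deviate
under it. The dimensions, the modules phi/phi_pred, and the weight beta are
illustrative assumptions, not the paper's notation.

    import torch
    import torch.nn as nn

    # phi encodes a full transition (s, a, s'); phi_pred predicts that encoding
    # from (s, a) alone. Both are fit on TARGET-domain transitions only, so the
    # prediction error measures how "target-like" a transition is.
    phi = nn.Sequential(nn.Linear(4 + 1 + 4, 64), nn.ReLU(), nn.Linear(64, 32))
    phi_pred = nn.Sequential(nn.Linear(4 + 1, 64), nn.ReLU(), nn.Linear(64, 32))
    beta = 0.1  # penalty weight (assumption)

    def repr_loss(s, a, s_next):  # trained on target-domain batches only
        return ((phi_pred(torch.cat([s, a], -1))
                 - phi(torch.cat([s, a, s_next], -1))) ** 2).mean()

    def adjusted_reward(r, s, a, s_next):
        # source-domain transitions that the target representation cannot
        # explain get their reward reduced before policy learning uses them
        with torch.no_grad():
            dev = ((phi_pred(torch.cat([s, a], -1))
                    - phi(torch.cat([s, a, s_next], -1))) ** 2).sum(-1)
        return r - beta * dev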
arXiv Detail & Related papers (2024-05-24T09:06:12Z)
- SALUDA: Surface-based Automotive Lidar Unsupervised Domain Adaptation [62.889835139583965]
We introduce an unsupervised auxiliary task of learning an implicit underlying surface representation simultaneously on source and target data.
As both domains share the same latent representation, the model is forced to accommodate discrepancies between the two sources of data.
Our experiments demonstrate that our method achieves better performance than the current state of the art in both real-to-real and synthetic-to-real scenarios.
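A rough sketch of this shared-latent setup, under stated assumptions: one
encoder feeds a supervised head (labels available on source only) and an
implicit-surface head trained on both domains, so the latent must serve
both. The per-point encoder stands in for a real lidar backbone, and casting
the surface task as distance regression at query offsets is an assumption.

    import torch
    import torch.nn as nn

    enc = nn.Sequential(nn.Linear(3, 128), nn.ReLU())  # per-point encoder (stand-in)
    seg = nn.Linear(128, 20)                           # semantic head, source labels only
    sdf = nn.Sequential(nn.Linear(128 + 3, 64), nn.ReLU(), nn.Linear(64, 1))

    def total_loss(src_pts, src_lbl, tgt_pts, q_src, d_src, q_tgt, d_tgt):
        zs, zt = enc(src_pts), enc(tgt_pts)
        task = nn.functional.cross_entropy(seg(zs), src_lbl)
        # auxiliary surface task on BOTH domains: regress an (approximate)
        # distance-to-surface at query offsets from the shared latent
        aux = ((sdf(torch.cat([zs, q_src], -1)) - d_src) ** 2).mean() \
            + ((sdf(torch.cat([zt, q_tgt], -1)) - d_tgt) ** 2).mean()
        return task + aux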
arXiv Detail & Related papers (2023-04-06T17:36:23Z)
- Improving Domain Generalization with Domain Relations [77.63345406973097]
This paper focuses on domain shifts, which occur when the model is applied to new domains that are different from the ones it was trained on.
We propose a new approach called D^3G to learn domain-specific models.
Our results show that D^3G consistently outperforms state-of-the-art methods.
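A minimal sketch of the high-level idea, with assumed names: keep one
predictor per training domain and weight their outputs by a relation score
between the test domain and each training domain. How those relations are
obtained is the substance of the paper and is not modeled here.

    import torch
    import torch.nn as nn

    experts = nn.ModuleList(nn.Linear(16, 3) for _ in range(4))  # one per training domain

    def predict(x, relation):
        # relation: similarity scores between the test domain and each of
        # the 4 training domains (assumed given, e.g. from domain meta-data)
        w = torch.softmax(relation, dim=0)
        return sum(wi * m(x) for wi, m in zip(w, experts))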
arXiv Detail & Related papers (2023-02-06T08:11:16Z)
- Multi-Domain Long-Tailed Learning by Augmenting Disentangled Representations [80.76164484820818]
There is an inescapable long-tailed class-imbalance issue in many real-world classification problems.
We study this multi-domain long-tailed learning problem and aim to produce a model that generalizes well across all classes and domains.
Built on a selective balanced sampling strategy, the proposed method, TALLY, achieves this by mixing the semantic representation of one example with the domain-associated nuisances of another.
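The mixing step can be pictured with the sketch below (the modules
sem/nui/dec are assumptions): each representation is split into a semantic
factor and a domain-nuisance factor, and an augmented example combines the
semantic factor of one input with the nuisances of another, keeping the
semantic donor's label.

    import torch
    import torch.nn as nn

    sem = nn.Linear(128, 64)   # semantic factor of a representation
    nui = nn.Linear(128, 64)   # domain-associated nuisance factor
    dec = nn.Linear(128, 128)  # recombines the two factors

    def mix(z1, y1, z2):
        # the augmented example keeps the label of the semantic donor z1
        z_new = dec(torch.cat([sem(z1), nui(z2)], dim=-1))
        return z_new, y1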
arXiv Detail & Related papers (2022-10-25T21:54:26Z)
- Domain Adaptation for Time-Series Classification to Mitigate Covariate Shift [3.071136270246468]
This paper proposes a novel supervised domain adaptation method based on two steps.
First, we search for an optimal class-dependent transformation from the source to the target domain using a few samples.
Second, we use embedding similarity techniques to select the corresponding transformation at inference.
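A toy sketch of the two steps, under strong simplifying assumptions: the
class-dependent transformation is reduced to a mean shift in embedding
space, and class prototypes plus cosine similarity stand in for the
embedding-similarity selection; all names are illustrative.

    import torch

    def fit_offsets(src_by_class, tgt_by_class):
        # step 1: one transformation per class, fit from a few samples
        return {c: tgt_by_class[c].mean(0) - src_by_class[c].mean(0)
                for c in src_by_class}

    def classify(x_emb, offsets, src_protos):
        # step 2: pick the class whose transformed source prototype is
        # most similar (cosine) to the incoming embedding
        sims = {c: torch.cosine_similarity(src_protos[c] + off, x_emb, dim=0)
                for c, off in offsets.items()}
        return max(sims, key=sims.get)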
arXiv Detail & Related papers (2022-04-07T10:27:14Z)
- Domain-Class Correlation Decomposition for Generalizable Person Re-Identification [34.813965300584776]
In person re-identification, the domain and class are correlated.
We show that domain adversarial learning loses certain class-related information due to this domain-class correlation.
Our model outperforms the state-of-the-art methods on the large-scale domain generalization Re-ID benchmark.
arXiv Detail & Related papers (2021-06-29T09:45:03Z)
- Heuristic Domain Adaptation [105.59792285047536]
The Heuristic Domain Adaptation Network (HDAN) explicitly learns domain-invariant and domain-specific representations.
HDAN exceeds the state of the art on unsupervised DA, multi-source DA, and semi-supervised DA.
arXiv Detail & Related papers (2020-11-30T04:21:35Z)
- Batch Normalization Embeddings for Deep Domain Generalization [50.51405390150066]
Domain generalization aims at training machine learning models to perform robustly across different and unseen domains.
We show a significant increase in classification accuracy over current state-of-the-art techniques on popular domain generalization benchmarks.
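The mechanism suggested by the title can be sketched as follows, purely as
an assumption-laden illustration: all weights are shared across domains
except one BatchNorm branch per domain, whose accumulated statistics act as
a lightweight embedding of that domain.

    import torch
    import torch.nn as nn

    class DomainBN(nn.Module):
        # shared weights elsewhere; one normalization branch per domain
        def __init__(self, num_features, num_domains):
            super().__init__()
            self.bns = nn.ModuleList(nn.BatchNorm1d(num_features)
                                     for _ in range(num_domains))

        def forward(self, x, domain_id):
            # route each batch through its own domain's BatchNorm
            return self.bns[domain_id](x)

    # After training, each branch's running mean/variance can serve as an
    # embedding of its domain; an unseen domain is positioned by comparing
    # its batch statistics to those per-domain embeddings.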
arXiv Detail & Related papers (2020-11-25T12:02:57Z)
- DIRL: Domain-Invariant Representation Learning for Sim-to-Real Transfer [2.119586259941664]
We present a domain-invariant representation learning (DIRL) algorithm to adapt deep models to the physical environment with a small amount of real data.
Experiments on digit domains yield state-of-the-art performance on challenging benchmarks.
arXiv Detail & Related papers (2020-11-15T17:39:01Z)