Implicit Bayes Adaptation: A Collaborative Transport Approach
- URL: http://arxiv.org/abs/2304.08298v1
- Date: Mon, 17 Apr 2023 14:13:40 GMT
- Title: Implicit Bayes Adaptation: A Collaborative Transport Approach
- Authors: Bo Jiang, Hamid Krim, Tianfu Wu, Derya Cansever
- Abstract summary: We show that domain adaptation is rooted in the intrinsic representations of the respective data, which inherently lie in a non-linear submanifold embedded in a higher-dimensional Euclidean space.
We show that this is tantamount to an implicit Bayesian framework, which we demonstrate to be viable for a more robust and better-performing approach to domain adaptation.
- Score: 25.96406219707398
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The power and flexibility of Optimal Transport (OT) have pervaded a wide
spectrum of problems, including recent Machine Learning challenges such as
unsupervised domain adaptation. Its essence of quantitatively relating two
probability distributions by some optimal metric has been creatively exploited
and shown to hold promise for many real-world data challenges. In a related
theme in the present work, we posit that domain adaptation robustness is rooted
in the intrinsic (latent) representations of the respective data, which
inherently lie in a non-linear submanifold embedded in a higher-dimensional
Euclidean space. We account for the geometric properties by refining the $l^2$
Euclidean metric to better reflect the geodesic distance between two distinct
representations. We integrate a metric correction term as well as a prior
cluster structure in the source data of the OT-driven adaptation. We show that
this is tantamount to an implicit Bayesian framework, which we demonstrate to
be viable for a more robust and better-performing approach to domain
adaptation. Substantiating experiments are also included for validation
purposes.
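The OT-driven adaptation with a refined cost can be sketched as entropic (Sinkhorn) transport over a corrected ground metric. This is a minimal illustration, not the paper's construction: `sinkhorn_plan`, `geodesic_corrected_cost`, and the rank-based surrogate for the geodesic correction are all assumptions introduced here for clarity.

```python
import numpy as np

def sinkhorn_plan(a, b, C, eps=0.5, n_iters=500):
    """Entropic OT via Sinkhorn iterations; returns the transport plan.

    a, b : source/target marginals (sum to 1); C : ground-cost matrix.
    """
    C = C / (C.max() + 1e-12)        # normalize cost for numerical stability
    K = np.exp(-C / eps)             # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):         # alternating marginal scaling
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

def geodesic_corrected_cost(Xs, Xt, lam=0.5):
    """Squared l2 cost plus a crude, hypothetical geodesic surrogate.

    The correction inflates the cost of pairs that are distant in rank
    among each source point's candidates; the paper's actual metric
    correction term is different and not reproduced here.
    """
    D2 = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)
    ranks = D2.argsort(axis=1).argsort(axis=1) / max(Xt.shape[0] - 1, 1)
    return D2 * (1.0 + lam * ranks)

# usage: transport 5 source points onto 6 shifted target points
rng = np.random.default_rng(0)
Xs = rng.normal(size=(5, 2))
Xt = rng.normal(size=(6, 2)) + 1.0
a = np.full(5, 1 / 5)
b = np.full(6, 1 / 6)
P = sinkhorn_plan(a, b, geodesic_corrected_cost(Xs, Xt))
```

The plan `P` is a coupling whose row and column sums recover the two marginals; the prior cluster structure mentioned in the abstract would enter as an additional structured regularizer on `P`, which is omitted here.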
Related papers
- Distributed Variational Inference for Online Supervised Learning [15.038649101409804]
This paper develops a scalable distributed probabilistic inference algorithm.
It applies to continuous variables, intractable posteriors and large-scale real-time data in sensor networks.
arXiv Detail & Related papers (2023-09-05T22:33:02Z) - Parameter Estimation in DAGs from Incomplete Data via Optimal Transport [24.740382124473975]
We develop a theoretical framework and support it with extensive empirical evidence demonstrating the robustness and versatility of our approach.
We show that not only can our method effectively recover the ground-truth parameters but it also performs comparably or better than competing baselines on downstream applications.
arXiv Detail & Related papers (2023-05-25T10:54:36Z) - SALUDA: Surface-based Automotive Lidar Unsupervised Domain Adaptation [62.889835139583965]
We introduce an unsupervised auxiliary task of learning an implicit underlying surface representation simultaneously on source and target data.
As both domains share the same latent representation, the model is forced to accommodate discrepancies between the two sources of data.
Our experiments demonstrate that our method achieves a better performance than the current state of the art, both in real-to-real and synthetic-to-real scenarios.
arXiv Detail & Related papers (2023-04-06T17:36:23Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - CAusal and collaborative proxy-tasKs lEarning for Semi-Supervised Domain
Adaptation [20.589323508870592]
Semi-supervised domain adaptation (SSDA) adapts a learner to a new domain by effectively utilizing source domain data and a few labeled target samples.
We show that the proposed model significantly outperforms SOTA methods in terms of effectiveness and generalisability on SSDA datasets.
arXiv Detail & Related papers (2023-03-30T16:48:28Z) - A Bit More Bayesian: Domain-Invariant Learning with Uncertainty [111.22588110362705]
Domain generalization is challenging due to the domain shift and the uncertainty caused by the inaccessibility of target domain data.
In this paper, we address both challenges with a probabilistic framework based on variational Bayesian inference.
We derive domain-invariant representations and classifiers, which are jointly established in a two-layer Bayesian neural network.
arXiv Detail & Related papers (2021-05-09T21:33:27Z) - GELATO: Geometrically Enriched Latent Model for Offline Reinforcement
Learning [54.291331971813364]
Offline reinforcement learning approaches can be divided into proximal and uncertainty-aware methods.
In this work, we demonstrate the benefit of combining the two in a latent variational model.
Our proposed metrics measure both the quality of out-of-distribution samples and the discrepancy of examples in the data.
arXiv Detail & Related papers (2021-02-22T19:42:40Z) - Robust Bayesian Inference for Discrete Outcomes with the Total Variation
Distance [5.139874302398955]
Models of discrete-valued outcomes are easily misspecified if the data exhibit zero-inflation, overdispersion or contamination.
Here, we introduce a robust discrepancy-based Bayesian approach using the Total Variation Distance (TVD).
We empirically demonstrate that our approach is robust and significantly improves predictive performance on a range of simulated and real world data.
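The TVD used as the discrepancy measure is half the l1 distance between two probability mass functions; a minimal sketch of just the distance (the full robust posterior construction is more involved and not shown):

```python
import numpy as np

def total_variation(p, q):
    """Total Variation Distance between two discrete distributions,
    i.e. half the l1 distance between their probability mass functions.
    Ranges from 0 (identical) to 1 (disjoint support)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return 0.5 * np.abs(p - q).sum()
```

Because TVD is bounded, a single heavily contaminated or zero-inflated cell can shift it by at most its own probability mass, which is the intuition behind its robustness relative to likelihood-based (KL-type) discrepancies.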
arXiv Detail & Related papers (2020-10-26T09:53:06Z) - Robust Optimal Transport with Applications in Generative Modeling and
Domain Adaptation [120.69747175899421]
Optimal Transport (OT) distances such as Wasserstein have been used in several areas such as GANs and domain adaptation.
We propose a computationally-efficient dual form of the robust OT optimization that is amenable to modern deep learning applications.
Our approach can train state-of-the-art GAN models on noisy datasets corrupted with outlier distributions.
arXiv Detail & Related papers (2020-10-12T17:13:40Z) - Learning Invariant Representations and Risks for Semi-supervised Domain
Adaptation [109.73983088432364]
We propose the first method that aims to simultaneously learn invariant representations and risks under the setting of semi-supervised domain adaptation (Semi-DA).
We introduce the LIRR algorithm for jointly Learning Invariant Representations and Risks.
arXiv Detail & Related papers (2020-10-09T15:42:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.