DAPDAG: Domain Adaptation via Perturbed DAG Reconstruction
- URL: http://arxiv.org/abs/2208.01373v1
- Date: Tue, 2 Aug 2022 11:43:03 GMT
- Title: DAPDAG: Domain Adaptation via Perturbed DAG Reconstruction
- Authors: Yanke Li, Hatt Tobias, Ioana Bica, Mihaela van der Schaar
- Abstract summary: We learn an auto-encoder that undertakes inference on population statistics given features and reconstructs a directed acyclic graph (DAG) as an auxiliary task.
The underlying DAG structure is assumed invariant among observed variables, whose conditional distributions are allowed to vary across domains, driven by a latent environmental variable $E$.
We train the encoder and decoder jointly in an end-to-end manner and conduct experiments on synthetic and real datasets with mixed variables.
- Score: 78.76115370275733
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Leveraging labelled data from multiple domains to enable prediction in
another domain without labels is a significant, yet challenging problem. To
address this problem, we introduce the framework DAPDAG (\textbf{D}omain
\textbf{A}daptation via \textbf{P}erturbed \textbf{DAG} Reconstruction) and
propose to learn an auto-encoder that undertakes inference on population
statistics given features and reconstructs a directed acyclic graph (DAG) as
an auxiliary task. The underlying DAG structure is assumed invariant among
observed variables whose conditional distributions are allowed to vary across
domains, driven by a latent environmental variable $E$. The encoder is designed to
serve as an inference device on $E$ while the decoder reconstructs each
observed variable conditioned on its graphical parents in the DAG and the
inferred $E$. We train the encoder and decoder jointly in an end-to-end manner
and conduct experiments on synthetic and real datasets with mixed variables.
Empirical results demonstrate that reconstructing the DAG benefits the
approximate inference. Furthermore, our approach can achieve competitive
performance against other benchmarks in prediction tasks, with better
adaptation ability, especially when the target domain differs significantly
from the source domains.
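The abstract's pipeline, an encoder that infers the latent environment $E$ and a decoder that rebuilds each observed variable from its graphical parents plus the inferred $E$, can be sketched in a deliberately simplified toy form. The sketch below assumes a known, fixed linear DAG and Gaussian noise whose mean is shifted by $E$; in DAPDAG itself the DAG is reconstructed jointly and the encoder/decoder are learned neural networks, so all names here (`sample_domain`, `infer_E`, `reconstruct`) are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 4, 500                      # observed variables, samples per domain

# Hypothetical known DAG (strictly upper-triangular => acyclic):
# X0 -> X1, X0 -> X2, X1 -> X3, X2 -> X3
A = np.array([[0., 1., 1., 0.],
              [0., 0., 0., 1.],
              [0., 0., 0., 1.],
              [0., 0., 0., 0.]])

def sample_domain(E, n):
    """Generate one domain: each X_j = sum(parents of X_j) + N(E, 1).

    The latent environment E shifts every conditional distribution,
    while the DAG structure stays invariant across domains."""
    X = np.zeros((n, d))
    noise = rng.normal(E, 1.0, size=(n, d))
    for j in range(d):             # columns are already in topological order
        X[:, j] = X @ A[:, j] + noise[:, j]
    return X

def infer_E(X):
    """Encoder sketch: pool the structural-equation residuals to a scalar."""
    return (X - X @ A).mean()

def reconstruct(X, E_hat):
    """Decoder sketch: rebuild each variable from its graphical parents
    plus the inferred environment."""
    return X @ A + E_hat

X_target = sample_domain(E=2.0, n=n)
E_hat = infer_E(X_target)
print(f"inferred E: {E_hat:.2f}")  # should be close to the true E = 2.0
```

With the true DAG in hand, the residuals of the structural equations are exactly the noise terms, so averaging them recovers $E$; this is the sense in which reconstructing the DAG benefits the approximate inference on population statistics.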
Related papers
- Protect Before Generate: Error Correcting Codes within Discrete Deep Generative Models [3.053842954605396]
We introduce a novel method that enhances variational inference in discrete latent variable models.
We leverage Error Correcting Codes (ECCs) to introduce redundancy in the latent representations.
This redundancy is then exploited by the variational posterior to yield more accurate estimates.
arXiv Detail & Related papers (2024-10-10T11:59:58Z) - Disentangling Masked Autoencoders for Unsupervised Domain Generalization [57.56744870106124]
Unsupervised domain generalization is fast gaining attention but is still far from well-studied.
Disentangled Masked Autoencoders (DisMAE) aims to discover the disentangled representations that faithfully reveal intrinsic features.
DisMAE co-trains the asymmetric dual-branch architecture with semantic and lightweight variation encoders.
arXiv Detail & Related papers (2024-07-10T11:11:36Z) - Geodesic Optimization for Predictive Shift Adaptation on EEG data [53.58711912565724]
Domain adaptation methods struggle when distribution shifts occur simultaneously in $X$ and $y$.
This paper proposes a novel method termed Geodesic Optimization for Predictive Shift Adaptation (GOPSA) to address test-time multi-source DA.
GOPSA has the potential to combine the advantages of mixed-effects modeling with machine learning for biomedical applications of EEG.
arXiv Detail & Related papers (2024-07-04T12:15:42Z) - Semi Supervised Heterogeneous Domain Adaptation via Disentanglement and Pseudo-Labelling [4.33404822906643]
Semi-supervised domain adaptation methods leverage information from a source labelled domain to generalize over a scarcely labelled target domain.
Such a setting is denoted as Semi-Supervised Heterogeneous Domain Adaptation (SSHDA).
We introduce SHeDD (Semi-supervised Heterogeneous Domain Adaptation via Disentanglement), an end-to-end neural framework tailored to learning on a target domain.
arXiv Detail & Related papers (2024-06-20T08:02:49Z) - Cross Domain Generative Augmentation: Domain Generalization with Latent Diffusion Models [11.309433257851122]
Cross Domain Generative Augmentation (CDGA) generates synthetic images to fill the gap between all domains.
We show that CDGA outperforms SOTA DG methods under the Domainbed benchmark.
arXiv Detail & Related papers (2023-12-08T21:52:00Z) - Structural transfer learning of non-Gaussian DAG [24.11895013147964]
Directed acyclic graph (DAG) has been widely employed to represent directional relationships among a set of collected nodes.
It remains an open question how to pool the heterogeneous data together for better DAG structure reconstruction in the target study.
We introduce a novel set of structural similarity measures for DAG and then present a transfer DAG learning framework.
arXiv Detail & Related papers (2023-10-16T10:01:27Z) - CAusal and collaborative proxy-tasKs lEarning for Semi-Supervised Domain Adaptation [20.589323508870592]
Semi-supervised domain adaptation (SSDA) adapts a learner to a new domain by effectively utilizing source domain data and a few labeled target samples.
We show that the proposed model significantly outperforms SOTA methods in terms of effectiveness and generalisability on SSDA datasets.
arXiv Detail & Related papers (2023-03-30T16:48:28Z) - Identifiable Latent Causal Content for Domain Adaptation under Latent Covariate Shift [82.14087963690561]
Multi-source domain adaptation (MSDA) addresses the challenge of learning a label prediction function for an unlabeled target domain.
We present an intricate causal generative model by introducing latent noises across domains, along with a latent content variable and a latent style variable.
The proposed approach showcases exceptional performance and efficacy on both simulated and real-world datasets.
arXiv Detail & Related papers (2022-08-30T11:25:15Z) - BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery [97.79015388276483]
A structural equation model (SEM) is an effective framework to reason over causal relationships represented via a directed acyclic graph (DAG).
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
arXiv Detail & Related papers (2021-12-06T03:35:21Z) - Self-Guided Adaptation: Progressive Representation Alignment for Domain Adaptive Object Detection [86.69077525494106]
Unsupervised domain adaptation (UDA) has achieved unprecedented success in improving the cross-domain robustness of object detection models.
Existing UDA methods largely ignore the instantaneous data distribution during model learning, which could deteriorate the feature representation given large domain shift.
We propose a Self-Guided Adaptation (SGA) model, targeted at aligning feature representations and transferring object detection models across domains.
arXiv Detail & Related papers (2020-03-19T13:30:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.