Learning Domain-invariant Graph for Adaptive Semi-supervised Domain
Adaptation with Few Labeled Source Samples
- URL: http://arxiv.org/abs/2008.09359v1
- Date: Fri, 21 Aug 2020 08:13:25 GMT
- Title: Learning Domain-invariant Graph for Adaptive Semi-supervised Domain
Adaptation with Few Labeled Source Samples
- Authors: Jinfeng Li, Weifeng Liu, Yicong Zhou, Jun Yu, Dapeng Tao
- Abstract summary: Domain adaptation aims to generalize a model from a source domain to tackle tasks in a related but different target domain.
Traditional domain adaptation algorithms assume that enough labeled data, which are treated as prior knowledge, are available in the source domain.
We propose a Domain-invariant Graph Learning (DGL) approach for domain adaptation with only a few labeled source samples.
- Score: 65.55521019202557
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Domain adaptation aims to generalize a model from a source domain to tackle
tasks in a related but different target domain. Traditional domain adaptation
algorithms assume that enough labeled data, which are treated as prior
knowledge, are available in the source domain. However, these algorithms
become infeasible when only a few labeled data exist in the source domain,
and their performance decreases significantly. To address this challenge, we
propose a Domain-invariant Graph Learning (DGL) approach for domain
adaptation with only a few labeled source samples. First, DGL introduces the
Nystrom method to construct a plastic graph that shares a similar geometric
property with the target domain. Then, DGL employs the Nystrom approximation
error to measure the divergence between the plastic graph and the source
graph, formalizing the distribution mismatch from a geometric perspective. By
minimizing the approximation error, DGL learns a domain-invariant geometric
graph that bridges the source and target domains. Finally, we integrate the
learned domain-invariant graph with semi-supervised learning and further
propose an adaptive semi-supervised model to handle cross-domain problems.
The results
of extensive experiments on popular datasets verify the superiority of DGL,
especially when only a few labeled source samples are available.
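The Nystrom approximation error that DGL uses as its divergence measure can be illustrated with a small sketch. Everything below is an illustrative assumption, not the paper's construction: the RBF affinity, the landmark choice, and the function names are hypothetical, and DGL minimizes this kind of error between a plastic graph and the source graph rather than within a single graph as shown here.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF (Gaussian) affinities between rows of X and rows of Y.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_error(X, landmark_idx, gamma=1.0):
    """Frobenius-norm error of the Nystrom approximation of the full
    affinity (graph) matrix of X, using the samples indexed by
    landmark_idx as landmarks."""
    K = rbf_kernel(X, X, gamma)            # full n x n affinity graph
    C = K[:, landmark_idx]                 # n x m sampled columns
    W = C[landmark_idx, :]                 # m x m landmark block
    K_hat = C @ np.linalg.pinv(W) @ C.T    # Nystrom reconstruction of K
    return np.linalg.norm(K - K_hat, "fro")
```

With all points as landmarks the reconstruction is exact (error near zero); with only a few landmarks the error measures how well the sampled sub-graph spans the geometry of the full graph, which is the primitive DGL turns into a cross-domain divergence.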
Related papers
- Multi-source Unsupervised Domain Adaptation on Graphs with Transferability Modeling [35.39202826643388]
We present the framework Selective Multi-source Adaptation for Graph (method), with a graph-modeling-based domain selector, a sub-graph node selector, and a bi-level alignment objective.
Results on five graph datasets show the effectiveness of the proposed method.
arXiv Detail & Related papers (2024-06-14T22:05:21Z)
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention, which tries to tackle domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z)
- Instance Relation Graph Guided Source-Free Domain Adaptive Object Detection [79.89082006155135]
Unsupervised Domain Adaptation (UDA) is an effective approach to tackle the issue of domain shift.
UDA methods try to align the source and target representations to improve the generalization on the target domain.
The Source-Free Domain Adaptation (SFDA) setting aims to alleviate these concerns by adapting a source-trained model to the target domain without requiring access to the source data.
arXiv Detail & Related papers (2022-03-29T17:50:43Z)
- Source Free Unsupervised Graph Domain Adaptation [60.901775859601685]
Unsupervised Graph Domain Adaptation (UGDA) shows its practical value of reducing the labeling cost for node classification.
Most existing UGDA methods heavily rely on the labeled graph in the source domain.
In some real-world scenarios, the source graph is inaccessible because of privacy issues.
We propose a novel scenario named Source Free Unsupervised Graph Domain Adaptation (SFUGDA)
arXiv Detail & Related papers (2021-12-02T03:18:18Z)
- Discrepancy Minimization in Domain Generalization with Generative Nearest Neighbors [13.047289562445242]
Domain generalization (DG) deals with the problem of domain shift, where a machine learning model trained on multiple source domains fails to generalize well on a target domain with different statistics.
Multiple approaches have been proposed that learn domain-invariant representations across the source domains, but these fail to guarantee generalization on the shifted target domain.
We propose a Generative Nearest Neighbor based Discrepancy Minimization (GNNDM) method which provides a theoretical guarantee that is upper bounded by the error in the labeling process of the target.
arXiv Detail & Related papers (2020-07-28T14:54:25Z)
- Supervised Domain Adaptation using Graph Embedding [86.3361797111839]
Domain adaptation methods assume that distributions between the two domains are shifted and attempt to realign them.
We propose a generic framework based on graph embedding.
We show that the proposed approach leads to a powerful Domain Adaptation framework.
arXiv Detail & Related papers (2020-03-09T12:25:13Z)
- Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation [102.67010690592011]
Unsupervised Domain Adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain.
Prior UDA methods typically require access to the source data when learning to adapt the model.
This work tackles a practical setting where only a trained source model is available, and studies how to effectively utilize such a model without source data to solve UDA problems.
arXiv Detail & Related papers (2020-02-20T03:13:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.