HoMM: Higher-order Moment Matching for Unsupervised Domain Adaptation
- URL: http://arxiv.org/abs/1912.11976v1
- Date: Fri, 27 Dec 2019 03:53:03 GMT
- Title: HoMM: Higher-order Moment Matching for Unsupervised Domain Adaptation
- Authors: Chao Chen, Zhihang Fu, Zhihong Chen, Sheng Jin, Zhaowei Cheng, Xinyu
Jin, Xian-Sheng Hua
- Abstract summary: Minimizing the discrepancy of feature distributions between different domains is one of the most promising directions in unsupervised domain adaptation.
We propose a Higher-order Moment Matching (HoMM) method, and further extend the HoMM into reproducing kernel Hilbert spaces (RKHS)
Our proposed HoMM consistently outperforms the existing moment matching methods by a large margin.
- Score: 40.04775844604928
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Minimizing the discrepancy of feature distributions between different domains
is one of the most promising directions in unsupervised domain adaptation. From
the perspective of distribution matching, most existing discrepancy-based
methods are designed to match the second-order or lower statistics, which
however, have limited expression of statistical characteristic for non-Gaussian
distributions. In this work, we explore the benefits of using higher-order
statistics (mainly refer to third-order and fourth-order statistics) for domain
matching. We propose a Higher-order Moment Matching (HoMM) method, and further
extend the HoMM into reproducing kernel Hilbert spaces (RKHS). In particular,
our proposed HoMM can perform arbitrary-order moment tensor matching; we show
that the first-order HoMM is equivalent to Maximum Mean Discrepancy (MMD) and
the second-order HoMM is equivalent to Correlation Alignment (CORAL). Moreover,
the third-order and the fourth-order moment tensor matching are expected to
perform comprehensive domain alignment as higher-order statistics can
approximate more complex, non-Gaussian distributions. Besides, we also exploit
the pseudo-labeled target samples to learn discriminative representations in
the target domain, which further improves the transfer performance. Extensive
experiments are conducted, showing that our proposed HoMM consistently
outperforms the existing moment matching methods by a large margin. Code is
available at https://github.com/chenchao666/HoMM-Master
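The core idea in the abstract can be illustrated with a small sketch: the empirical order-p moment tensor of a feature batch is the batch mean of p-fold outer products, and matching it between domains means minimizing the Frobenius distance between the two tensors. The NumPy sketch below is illustrative only (it is not the authors' implementation, and function names are our own); note that order 1 reduces to matching batch means, i.e. linear-kernel MMD.

```python
import numpy as np

def homm_loss(Hs, Ht, order=3):
    """Illustrative order-p moment tensor matching.

    Hs, Ht: (batch, dim) feature matrices from the source and
    target domains. Returns the squared Frobenius distance
    between the empirical order-p moment tensors of the batches.
    """
    def moment_tensor(H, p):
        # Batch mean of the p-fold outer product h (x) ... (x) h,
        # built iteratively and kept flattened: (batch, dim**k).
        T = H  # order 1: (batch, dim)
        for _ in range(p - 1):
            T = np.einsum('bi,bj->bij', T, H).reshape(H.shape[0], -1)
        return T.mean(axis=0)

    Ts = moment_tensor(Hs, order)
    Tt = moment_tensor(Ht, order)
    return float(np.sum((Ts - Tt) ** 2))
```

For realistic feature dimensions d, the order-p tensor has d**p entries, which is why the paper's RKHS extension and related approximations matter in practice; this sketch is only feasible for small d.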
Related papers
- Guidance Not Obstruction: A Conjugate Consistent Enhanced Strategy for Domain Generalization [50.04665252665413]
We argue that acquiring discriminative generalization between classes within domains is crucial.
In contrast to seeking distribution alignment, we endeavor to safeguard domain-related between-class discrimination.
We employ a novel distribution-level Universum strategy to generate supplementary diverse domain-related class-conditional distributions.
arXiv Detail & Related papers (2024-12-13T12:25:16Z)
- Clustering-Based Validation Splits for Model Selection under Domain Shift [0.0]
It is proposed that the training-validation split should maximise the distribution mismatch between the two sets.
A constrained clustering algorithm, which leverages linear programming to control the size, label, and (optionally) group distributions of the splits, is presented.
arXiv Detail & Related papers (2024-05-29T19:21:17Z)
- Batch and match: black-box variational inference with a score-based divergence [26.873037094654826]
We propose batch and match (BaM) as an alternative approach to blackbox variational inference (BBVI) based on a score-based divergence.
We show that BaM converges in fewer evaluations than leading implementations of BBVI based on ELBO.
arXiv Detail & Related papers (2024-02-22T18:20:22Z)
- SGMM: Stochastic Approximation to Generalized Method of Moments [8.48870560391056]
We introduce a new class of algorithms, Stochastic Generalized Method of Moments (SGMM), for estimation and inference on (overidentified) moment restriction models.
Our SGMM is a novel approximation to the popular Hansen (1982) (offline) GMM, and offers fast and scalable implementation with the ability to handle streaming datasets in real time.
arXiv Detail & Related papers (2023-08-25T00:22:45Z)
- Asymmetric Transfer Hashing with Adaptive Bipartite Graph Learning [95.54688542786863]
Existing hashing methods assume that the query and retrieval samples lie in a homogeneous feature space within the same domain.
We propose an Asymmetric Transfer Hashing (ATH) framework with its unsupervised/semi-supervised/supervised realizations.
By jointly optimizing asymmetric hash functions and the bipartite graph, not only can knowledge transfer be achieved but information loss caused by feature alignment can also be avoided.
arXiv Detail & Related papers (2022-06-25T08:24:34Z)
- Meta-Learning Adversarial Bandits [49.094361442409785]
We study online learning with bandit feedback across multiple tasks, with the goal of improving average performance across tasks if they are similar according to some natural task-similarity measure.
As the first to target the adversarial setting, we design a meta-algorithm that provides setting-specific guarantees for two important cases: multi-armed bandits (MAB) and bandit linear optimization (BLO).
Our guarantees rely on proving that unregularized follow-the-leader combined with multiplicative weights is enough to online learn a non-smooth and non-B sequence.
arXiv Detail & Related papers (2022-05-27T17:40:32Z)
- Exact Feature Distribution Matching for Arbitrary Style Transfer and Domain Generalization [43.19170120544387]
We propose to perform Exact Feature Distribution Matching (EFDM) by exactly matching the empirical Cumulative Distribution Functions (eCDFs) of image features.
A fast EHM algorithm, named Sort-Matching, is employed to perform EFDM in a plug-and-play manner with minimal cost.
The effectiveness of our proposed EFDM method is verified on a variety of AST and DG tasks, demonstrating new state-of-the-art results.
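The eCDF matching described in this entry can be sketched as a per-channel sort: assign the sorted values of the style batch to the positions given by the content batch's ranks, so the output exactly matches the style distribution per channel. This is an illustrative NumPy sketch (equal sample counts assumed, names are our own; not the paper's implementation).

```python
import numpy as np

def sort_matching(content, style):
    """Illustrative exact feature distribution matching via sorting.

    content, style: (n, dim) feature matrices with equal n.
    Returns features whose per-channel empirical distribution
    exactly equals the style batch's, while preserving the
    content batch's rank order within each channel.
    """
    idx = np.argsort(content, axis=0)       # content rank order
    sorted_style = np.sort(style, axis=0)   # style order statistics
    out = np.empty_like(content)
    # Place the k-th smallest style value where the k-th smallest
    # content value sits, channel by channel.
    np.put_along_axis(out, idx, sorted_style, axis=0)
    return out
```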
arXiv Detail & Related papers (2022-03-15T09:18:14Z)
- A Unified Joint Maximum Mean Discrepancy for Domain Adaptation [73.44809425486767]
This paper theoretically derives a unified form of JMMD that is easy to optimize.
From the revealed unified JMMD, we illustrate that JMMD degrades the feature-label dependence that benefits classification.
We propose a novel MMD matrix to promote the dependence, and devise a novel label kernel that is robust to label distribution shift.
arXiv Detail & Related papers (2021-01-25T09:46:14Z)
- DWMD: Dimensional Weighted Orderwise Moment Discrepancy for Domain-specific Hidden Representation Matching [21.651807102769954]
A key challenge in this field is establishing a metric that can measure the data distribution discrepancy between two homogeneous domains.
We propose a novel moment-based probability distribution metric termed dimensional weighted orderwise moment discrepancy (DWMD) for feature representation matching.
Our metric function takes advantage of a series of higher-order moment alignments, and we theoretically prove that our DWMD metric function is error-free.
arXiv Detail & Related papers (2020-07-18T02:37:32Z)
- Learning to Match Distributions for Domain Adaptation [116.14838935146004]
This paper proposes Learning to Match (L2M) to automatically learn the cross-domain distribution matching.
L2M reduces the inductive bias by using a meta-network to learn the distribution matching loss in a data-driven way.
Experiments on public datasets substantiate the superiority of L2M over SOTA methods.
arXiv Detail & Related papers (2020-07-17T03:26:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.