Joint Wasserstein Distribution Matching
- URL: http://arxiv.org/abs/2003.00389v1
- Date: Sun, 1 Mar 2020 03:39:00 GMT
- Title: Joint Wasserstein Distribution Matching
- Authors: Jiezhang Cao, Langyuan Mo, Qing Du, Yong Guo, Peilin Zhao, Junzhou Huang, Mingkui Tan
- Abstract summary: The joint distribution matching (JDM) problem, which aims to learn bidirectional mappings to match the joint distributions of two domains, occurs in many machine learning and computer vision applications.
We propose to address the JDM problem by minimizing the Wasserstein distance between the joint distributions of the two domains.
We then propose a theorem that reduces the intractable problem to a simple optimization problem, and develop a novel method to solve it.
- Score: 89.86721884036021
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The joint distribution matching (JDM) problem, which aims to learn
bidirectional mappings to match the joint distributions of two domains, occurs
in many machine learning and computer vision applications. This problem,
however, is very difficult due to two critical challenges: (i) it is often
difficult to exploit sufficient information from the joint distribution to
conduct the matching; (ii) the problem is hard to formulate and optimize. In
this paper, relying on optimal transport theory, we propose to address the JDM
problem by minimizing the Wasserstein distance between the joint distributions
of the two domains. However, the resulting optimization problem is still
intractable. We then propose a theorem that reduces the intractable problem to
a simple optimization problem, and develop a novel method, called Joint
Wasserstein Distribution Matching (JWDM), to solve it. In the experiments, we
apply our method to unsupervised image translation and cross-domain video
synthesis. Both qualitative and quantitative comparisons demonstrate the
superior performance of our method over several state-of-the-art approaches.
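As a rough, illustrative sketch of the core objective (not the authors' implementation), the snippet below estimates the Wasserstein distance between two empirical joint distributions formed from samples (x, G(x)) and (F(y), y); it assumes the POT (Python Optimal Transport) library, and the mappings G and F are hypothetical stand-ins for learned generators.

```python
# Illustrative only: empirical Wasserstein distance between two joint
# distributions, estimated from samples in the product space X x Y.
# Assumes POT (pip install pot); G and F are stand-in mappings, not
# the trained generators from the paper.
import numpy as np
import ot  # POT: Python Optimal Transport

rng = np.random.default_rng(0)
n = 256
x = rng.normal(size=(n, 2))             # samples from domain X
y = rng.normal(loc=3.0, size=(n, 2))    # samples from domain Y

G = lambda s: s + 3.0                   # stand-in mapping X -> Y
F = lambda t: t - 3.0                   # stand-in mapping Y -> X

# Joint samples (x, G(x)) and (F(y), y) live in the product space.
joint_x = np.hstack([x, G(x)])
joint_y = np.hstack([F(y), y])

# Cost matrix and exact discrete optimal-transport cost.
M = ot.dist(joint_x, joint_y, metric='sqeuclidean')
w2_sq = ot.emd2(ot.unif(n), ot.unif(n), M)
print(f"empirical squared 2-Wasserstein distance: {w2_sq:.4f}")
```

Driving this quantity toward zero encourages (x, G(x)) and (F(y), y) to follow the same joint distribution, which is the sense in which the bidirectional mappings are matched.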
Related papers
- Partial Distribution Matching via Partial Wasserstein Adversarial Networks [35.48994933353969]
This paper studies the problem of distribution matching (DM), a fundamental machine learning problem that seeks to robustly align two probability distributions.
The approach is built on a relaxed formulation, called partial distribution matching (PDM), which matches only a fraction of the distributions rather than matching them completely.
Experimental results confirm that the proposed PWAN produces highly robust matching results, performing on par with or better than state-of-the-art methods.
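To make the relaxation concrete, here is a minimal sketch of partial optimal transport, the objective underlying PDM (assuming POT's ot.partial module; this is not the PWAN network itself):

```python
# Illustrative only: transporting a fraction m of the mass lets the
# matching ignore outliers. Assumes POT's ot.partial module.
import numpy as np
import ot

rng = np.random.default_rng(1)
n = 200
src = rng.normal(size=(n, 2))
tgt = np.vstack([rng.normal(size=(n - 20, 2)),
                 rng.normal(loc=8.0, size=(20, 2))])  # 10% outliers

M = ot.dist(src, tgt)
a, b = ot.unif(n), ot.unif(n)

full_cost = ot.emd2(a, b, M)                                  # match all mass
part_cost = ot.partial.partial_wasserstein2(a, b, M, m=0.9)   # match 90%
print(f"full OT cost:    {full_cost:.3f}")
print(f"partial OT cost: {part_cost:.3f}  (outliers left unmatched)")
```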
arXiv Detail & Related papers (2024-09-16T17:41:45Z)
- Efficient Alternating Minimization Solvers for Wyner Multi-View Unsupervised Learning [0.0]
We propose two novel formulations that enable the development of computationally efficient solvers based on the alternating principle.
The proposed solvers offer computational efficiency, theoretical convergence guarantees, complexity that scales gracefully with the number of views, and exceptional accuracy compared with state-of-the-art techniques.
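The alternating principle itself is easy to illustrate; the generic block-coordinate sketch below is not the paper's Wyner multi-view solver, just the basic template it builds on:

```python
# Generic alternating minimization: minimize
#   f(x, y) = ||A x - b||^2 + ||x - y||^2
# by exactly minimizing over one block while holding the other fixed.
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(30, 10))
b = rng.normal(size=30)
x, y = np.zeros(10), np.zeros(10)

for _ in range(50):
    # x-step: (A^T A + I) x = A^T b + y  (ridge-type closed form)
    x = np.linalg.solve(A.T @ A + np.eye(10), A.T @ b + y)
    # y-step: argmin_y ||x - y||^2 is simply y = x
    y = x.copy()

obj = np.linalg.norm(A @ x - b) ** 2 + np.linalg.norm(x - y) ** 2
print(f"objective after alternating updates: {obj:.4f}")
```

Each block update can only decrease the objective, which is the basic mechanism behind the convergence guarantees such solvers enjoy.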
arXiv Detail & Related papers (2023-03-28T10:17:51Z)
- Cooperative Distribution Alignment via JSD Upper Bound [7.071749623370137]
Unsupervised distribution alignment estimates a transformation that maps two or more source distributions to a shared aligned distribution.
This task has many applications including generative modeling, unsupervised domain adaptation, and socially aware learning.
We propose to unify and generalize previous flow-based approaches under a single non-adversarial framework.
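For reference, the Jensen-Shannon divergence (JSD) that the framework upper-bounds can be computed directly for discrete distributions (illustrative; not the paper's flow-based objective):

```python
# JSD(p, q) = 0.5*KL(p||m) + 0.5*KL(q||m) with m = (p + q) / 2.
import numpy as np
from scipy.stats import entropy  # entropy(p, q) gives KL(p||q)

def jsd(p: np.ndarray, q: np.ndarray) -> float:
    m = 0.5 * (p + q)
    return 0.5 * entropy(p, m) + 0.5 * entropy(q, m)

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(f"JSD(p, q) = {jsd(p, q):.4f}")  # zero iff p == q
```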
arXiv Detail & Related papers (2022-07-05T20:09:03Z)
- Compression and Data Similarity: Combination of Two Techniques for Communication-Efficient Solving of Distributed Variational Inequalities [137.6408511310322]
In this paper, we consider a combination of two popular approaches: compression and data similarity.
We show that this synergy can be more effective than either approach alone for solving distributed smooth strongly monotone variational inequalities.
arXiv Detail & Related papers (2022-06-19T16:38:56Z)
- Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees [115.08148491584997]
We present the first theoretically grounded distributed methods for solving variational inequalities and saddle point problems using compressed communication: MASHA1 and MASHA2.
The new algorithms support bidirectional compression and can also be modified for settings with batches and for federated learning with partial client participation.
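As background, communication compression in such methods is typically built from unbiased compression operators; the rand-k sketch below is a standard example, not the MASHA implementation:

```python
# Unbiased rand-k compressor: keep k random coordinates and rescale
# by d/k so that E[C(v)] = v. Sending only k nonzeros cuts traffic.
import numpy as np

def rand_k(v: np.ndarray, k: int, rng: np.random.Generator) -> np.ndarray:
    d = v.size
    out = np.zeros_like(v)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = v[idx] * (d / k)
    return out

rng = np.random.default_rng(3)
g = rng.normal(size=1000)
c = rand_k(g, k=100, rng=rng)  # 10x fewer nonzeros to transmit
print(f"nonzeros sent: {np.count_nonzero(c)} of {g.size}")
```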
arXiv Detail & Related papers (2021-10-07T10:04:32Z)
- Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretization approaches.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z)
- Learning to Match Distributions for Domain Adaptation [116.14838935146004]
This paper proposes Learning to Match (L2M), which automatically learns cross-domain distribution matching.
L2M reduces the inductive bias by using a meta-network to learn the distribution matching loss in a data-driven way.
Experiments on public datasets substantiate the superiority of L2M over SOTA methods.
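As a highly simplified, hypothetical sketch of the general idea (a small network scoring how well two feature batches match; not the actual L2M architecture):

```python
# Hypothetical sketch: a meta-network that outputs a non-negative,
# data-driven matching loss for source/target feature batches.
import torch
import torch.nn as nn

class MetaMatchLoss(nn.Module):
    def __init__(self, feat_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Softplus(),  # keep the loss >= 0
        )

    def forward(self, src_feats: torch.Tensor, tgt_feats: torch.Tensor):
        # Summarize each batch by its mean feature, then score the pair.
        pair = torch.cat([src_feats.mean(0), tgt_feats.mean(0)])
        return self.net(pair).squeeze()

loss_fn = MetaMatchLoss(feat_dim=32)
src, tgt = torch.randn(16, 32), torch.randn(16, 32)
print(float(loss_fn(src, tgt)))  # scalar matching loss
```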
arXiv Detail & Related papers (2020-07-17T03:26:13Z)
- MineReduce: an approach based on data mining for problem size reduction [58.720142291102135]
This paper presents an approach named MineReduce, which uses mined patterns to perform problem size reduction.
We present an application of MineReduce to improve a heuristic for the heterogeneous fleet vehicle routing problem.
arXiv Detail & Related papers (2020-05-15T08:49:50Z)