Improving Neural Optimal Transport via Displacement Interpolation
- URL: http://arxiv.org/abs/2410.03783v2
- Date: Mon, 21 Oct 2024 20:54:04 GMT
- Title: Improving Neural Optimal Transport via Displacement Interpolation
- Authors: Jaemoo Choi, Yongxin Chen, Jaewoong Choi
- Abstract summary: Optimal Transport (OT) theory investigates the cost-minimizing transport map that moves a source distribution to a target distribution.
We propose a novel method to improve stability and achieve a better approximation of the OT Map by exploiting displacement interpolation.
We demonstrate that DIOTM outperforms existing OT-based models on image-to-image translation tasks.
- Score: 16.474572112062535
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Optimal Transport (OT) theory investigates the cost-minimizing transport map that moves a source distribution to a target distribution. Recently, several approaches have emerged for learning the optimal transport map for a given cost function using neural networks. We refer to these approaches as the OT Map. OT Map provides a powerful tool for diverse machine learning tasks, such as generative modeling and unpaired image-to-image translation. However, existing methods that utilize max-min optimization often experience training instability and sensitivity to hyperparameters. In this paper, we propose a novel method to improve stability and achieve a better approximation of the OT Map by exploiting displacement interpolation, dubbed Displacement Interpolation Optimal Transport Model (DIOTM). We derive the dual formulation of displacement interpolation at specific time $t$ and prove how these dual problems are related across time. This result allows us to utilize the entire trajectory of displacement interpolation in learning the OT Map. Our method improves the training stability and achieves superior results in estimating optimal transport maps. We demonstrate that DIOTM outperforms existing OT-based models on image-to-image translation tasks.
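For readers unfamiliar with the central object: given the OT map $T$ from the source $\mu_0$ to the target $\mu_1$ under the quadratic cost, the displacement (McCann) interpolant at time $t$ is $\mu_t = ((1-t)\,\mathrm{id} + t\,T)_{\#}\mu_0$. The sketch below only illustrates this idea on two equal-size minibatches, where the optimal coupling reduces to an assignment problem; it is a conceptual illustration under those assumptions, not the authors' DIOTM training procedure, and the helper name `displacement_interpolation` is ours.

```python
# Minimal sketch of displacement (McCann) interpolation between two
# equal-size empirical minibatches under the squared-Euclidean cost.
# Illustrative approximation only; this is not the DIOTM training code.
import numpy as np
from scipy.optimize import linear_sum_assignment


def displacement_interpolation(x0, x1, t):
    """Samples from the displacement interpolant at time t in [0, 1].

    x0, x1 : (n, d) arrays of source / target samples (equal batch size).
    For uniform empirical measures and quadratic cost, the optimal coupling
    reduces to an assignment problem, which is solved exactly below.
    """
    # Pairwise squared-Euclidean costs C[i, j] = ||x0_i - x1_j||^2.
    cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)
    row_ind, col_ind = linear_sum_assignment(cost)  # optimal matching
    x1_matched = x1[col_ind]                        # row_ind is sorted, so col_ind[i] pairs with x0[i]
    # McCann interpolation: straight-line motion along matched pairs.
    return (1.0 - t) * x0 + t * x1_matched


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(loc=-2.0, size=(128, 2))       # source minibatch
    tgt = rng.normal(loc=+2.0, size=(128, 2))       # target minibatch
    mid = displacement_interpolation(src, tgt, t=0.5)
    print(mid.mean(axis=0))                         # roughly the midpoint, ~[0, 0]
```

DIOTM itself parametrizes the map with neural networks and exploits a time-indexed dual objective over the entire interpolation trajectory; the sketch above only shows what a single minibatch-level interpolant looks like.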
Related papers
- Unlearning as multi-task optimization: A normalized gradient difference approach with an adaptive learning rate [105.86576388991713]
We introduce a normalized gradient difference (NGDiff) algorithm, enabling us to have better control over the trade-off between the objectives.
We provide a theoretical analysis and empirically demonstrate the superior performance of NGDiff among state-of-the-art unlearning methods on the TOFU and MUSE datasets.
arXiv Detail & Related papers (2024-10-29T14:41:44Z) - Optimal Transport on the Lie Group of Roto-translations [2.03742455046876]
We develop a computational framework for optimal transportation over Lie groups, with a special focus on SE(2).
We make several theoretical contributions (generalizable to matrix Lie groups).
We develop a Sinkhorn-like algorithm that can be efficiently implemented using fast and accurate distance approximations of the Lie group and GPU-friendly group convolutions; a generic Euclidean Sinkhorn sketch is included after this list.
arXiv Detail & Related papers (2024-02-23T13:40:34Z) - Efficient Neural Network Approaches for Conditional Optimal Transport with Applications in Bayesian Inference [1.740133468405535]
We present two neural network approaches that approximate the solutions of static and conditional optimal transport (COT) problems.
We demonstrate both algorithms, comparing them with competing state-of-the-art approaches, using benchmark datasets and simulation-based inverse problems.
arXiv Detail & Related papers (2023-10-25T20:20:09Z) - Analyzing and Improving Optimal-Transport-based Adversarial Networks [9.980822222343921]
The Optimal Transport (OT) problem aims to find a transport plan that bridges two distributions while minimizing a given cost function.
OT theory has been widely utilized in generative modeling.
Our approach achieves an FID score of 2.51 on CIFAR-10 and 5.99 on CelebA-HQ-256, outperforming unified OT-based adversarial approaches.
arXiv Detail & Related papers (2023-10-04T06:52:03Z) - Generative Modeling through the Semi-dual Formulation of Unbalanced Optimal Transport [9.980822222343921]
We propose a novel generative model based on the semi-dual formulation of Unbalanced Optimal Transport (UOT).
Unlike OT, UOT relaxes the hard constraint on distribution matching. This approach provides better robustness against outliers, stability during training, and faster convergence.
Our model outperforms existing OT-based generative models, achieving FID scores of 2.97 on CIFAR-10 and 6.36 on CelebA-HQ-256.
arXiv Detail & Related papers (2023-05-24T06:31:05Z) - Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z) - Unpaired Image Super-Resolution with Optimal Transport Maps [128.1189695209663]
Real-world image super-resolution (SR) tasks often do not have paired datasets, limiting the application of supervised techniques.
We propose an algorithm for unpaired SR which learns an unbiased OT map for the perceptual transport cost.
Our algorithm provides nearly state-of-the-art performance on the large-scale unpaired AIM-19 dataset.
arXiv Detail & Related papers (2022-02-02T16:21:20Z) - Neural Optimal Transport [82.2689844201373]
We present a novel neural-networks-based algorithm to compute optimal transport maps and plans for strong and weak transport costs.
We prove that neural networks are universal approximators of transport plans between probability distributions.
arXiv Detail & Related papers (2022-01-28T16:24:13Z) - Generative Modeling with Optimal Transport Maps [83.59805931374197]
Optimal Transport (OT) has become a powerful tool for large-scale generative modeling tasks.
We show that the OT map itself can be used as a generative model, providing comparable performance.
arXiv Detail & Related papers (2021-10-06T18:17:02Z) - Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark [133.46066694893318]
We evaluate the performance of neural network-based solvers for optimal transport.
We find that existing solvers do not recover optimal transport maps even though they perform well in downstream tasks.
arXiv Detail & Related papers (2021-06-03T15:59:28Z)
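Several of the entries above (e.g. the Sinkhorn-like Lie-group solver) build on entropic optimal transport. As a reference point only, here is a minimal Euclidean Sinkhorn sketch for discrete measures; the cited papers use problem-specific costs (group distances, perceptual costs) and GPU-friendly implementations that are not reproduced here, and the function name `sinkhorn` is ours.

```python
# Minimal Euclidean Sinkhorn sketch for entropic OT between discrete measures.
# Illustrative only; the Lie-group paper above replaces the Euclidean cost
# with group distance approximations and GPU-friendly group convolutions.
import numpy as np


def sinkhorn(a, b, cost, eps=0.1, n_iters=500):
    """Entropic OT plan between histograms a (n,) and b (m,) with cost (n, m)."""
    K = np.exp(-cost / eps)             # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)               # scale columns to match marginal b
        u = a / (K @ v)                 # scale rows to match marginal a
    return u[:, None] * K * v[None, :]  # transport plan with (approx.) correct marginals


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.normal(size=(50, 2))                      # source support points
    y = rng.normal(loc=1.0, size=(60, 2))             # target support points
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    a = np.full(50, 1 / 50)
    b = np.full(60, 1 / 60)
    P = sinkhorn(a, b, C)
    print(P.sum())                                    # ~1.0: P is a valid coupling
```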
This list is automatically generated from the titles and abstracts of the papers on this site.