Computing high-dimensional optimal transport by flow neural networks
- URL: http://arxiv.org/abs/2305.11857v4
- Date: Sun, 4 Feb 2024 20:51:43 GMT
- Title: Computing high-dimensional optimal transport by flow neural networks
- Authors: Chen Xu, Xiuyuan Cheng, Yao Xie
- Abstract summary: This work develops a flow-based model that transports from $P$ to an arbitrary $Q$, where both distributions are only accessible via finite samples.
We propose to learn the dynamic optimal transport between $P$ and $Q$ by training a flow neural network.
- Abstract summary: The trained optimal transport flow subsequently allows for performing many downstream tasks, including infinitesimal density ratio estimation (DRE) and distribution interpolation in the latent space for generative models.
- Score: 22.320632565424745
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Flow-based models are widely used in generative tasks, including normalizing
flow, where a neural network transports from a data distribution $P$ to a
normal distribution. This work develops a flow-based model that transports from
$P$ to an arbitrary $Q$ where both distributions are only accessible via finite
samples. We propose to learn the dynamic optimal transport between $P$ and $Q$
by training a flow neural network. The model is trained to optimally find an
invertible transport map between $P$ and $Q$ by minimizing the transport cost.
The trained optimal transport flow subsequently allows for performing many
downstream tasks, including infinitesimal density ratio estimation (DRE) and
distribution interpolation in the latent space for generative models. The
effectiveness of the proposed model on high-dimensional data is demonstrated by
strong empirical performance on high-dimensional DRE, OT baselines, and
image-to-image translation.
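
The abstract's "dynamic optimal transport" and "transport cost" correspond, in their standard form, to the Benamou-Brenier formulation recalled below. This is background only; the paper's actual training objective may add further terms or use a different matching mechanism.

```latex
% Dynamic (Benamou--Brenier) form of the squared Wasserstein-2 distance between P and Q:
% the flow network parameterizes a velocity field v, and the transport cost is its kinetic energy.
W_2^2(P, Q) \;=\; \min_{\rho,\, v} \int_0^1 \!\! \int_{\mathbb{R}^d} \|v(x,t)\|^2 \, \rho(x,t)\, dx\, dt
\quad \text{s.t.} \quad \partial_t \rho + \nabla \cdot (\rho\, v) = 0, \qquad \rho(\cdot,0) = P, \quad \rho(\cdot,1) = Q.
```

Below is a minimal sketch of the training idea, not the authors' code: a time-dependent velocity network is integrated with a fixed-step Euler scheme, a kernel MMD term (an illustrative stand-in for whatever distribution-matching loss the paper actually uses) pushes the transported $P$-samples onto the $Q$-samples, and the discretized kinetic energy penalizes the transport cost. The architecture, discretization, bandwidth, and weights are all assumptions.

```python
# A minimal sketch (not the authors' implementation) of learning a transport flow
# between samples of P and Q. The MMD matching term and all hyperparameters are
# illustrative assumptions.
import torch
import torch.nn as nn

class Velocity(nn.Module):
    """Time-dependent velocity field v_theta(x, t)."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        # broadcast the scalar time to a per-sample column before concatenation
        t_col = torch.full((x.shape[0], 1), float(t), device=x.device)
        return self.net(torch.cat([x, t_col], dim=1))

def transport(v, x, n_steps=10):
    """Euler-discretized flow from t=0 to t=1; also returns the discrete kinetic energy."""
    dt = 1.0 / n_steps
    cost = 0.0
    for k in range(n_steps):
        vel = v(x, k * dt)
        cost = cost + (vel.pow(2).sum(dim=1) * dt).mean()  # discretized \int ||v||^2 dt
        x = x + dt * vel
    return x, cost

def rbf_mmd2(x, y, bandwidth=1.0):
    """Squared MMD with a Gaussian kernel -- an illustrative distribution-matching term."""
    def k(a, b):
        d2 = torch.cdist(a, b).pow(2)
        return torch.exp(-d2 / (2 * bandwidth ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

dim = 2
v = Velocity(dim)
opt = torch.optim.Adam(v.parameters(), lr=1e-3)
lam = 0.1  # weight on the transport-cost regularizer (assumed)

for step in range(2000):
    xp = torch.randn(256, dim)               # toy samples from P (standard normal)
    xq = torch.randn(256, dim) * 0.5 + 3.0   # toy samples from Q (shifted, scaled normal)
    pushed, kinetic = transport(v, xp)
    loss = rbf_mmd2(pushed, xq) + lam * kinetic
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice one would replace the toy Gaussian samplers with minibatches of real data and tune the cost weight `lam`, which trades off how closely $Q$ is matched against how short the transport paths are.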
Related papers
- Improving Neural Optimal Transport via Displacement Interpolation [16.474572112062535]
Optimal Transport (OT) theory investigates the cost-minimizing transport map that moves a source distribution to a target distribution.
We propose a novel method to improve stability and achieve a better approximation of the OT map by exploiting displacement interpolation.
We demonstrate that DIOTM outperforms existing OT-based models on image-to-image translation tasks.
arXiv Detail & Related papers (2024-10-03T16:42:23Z) - Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable density to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z) - Normalizing flows as approximations of optimal transport maps via linear-control neural ODEs [49.1574468325115]
"Normalizing Flows" is related to the task of constructing invertible transport maps between probability measures by means of deep neural networks.
We consider the problem of recovering the $Wamma$-optimal transport map $T$ between absolutely continuous measures $mu,nuinmathcalP(mathbbRn)$ as the flow of a linear-control neural ODE.
arXiv Detail & Related papers (2023-11-02T17:17:03Z) - Flow-based Distributionally Robust Optimization [23.232731771848883]
We present a framework, called $\texttt{FlowDRO}$, for solving flow-based distributionally robust optimization (DRO) problems with Wasserstein uncertainty sets.
We aim to find the continuous worst-case distribution (also called the least favorable distribution, LFD) and sample from it.
We demonstrate its usage in adversarial learning, distributionally robust hypothesis testing, and a new mechanism for data-driven distribution perturbation differential privacy.
arXiv Detail & Related papers (2023-10-30T03:53:31Z) - Arbitrary Distributions Mapping via SyMOT-Flow: A Flow-based Approach Integrating Maximum Mean Discrepancy and Optimal Transport [2.7309692684728617]
We introduce a novel model called SyMOT-Flow that trains an invertible transformation by minimizing the symmetric maximum mean discrepancy between samples from two unknown distributions.
The resulting transformation leads to more stable and accurate sample generation.
arXiv Detail & Related papers (2023-08-26T08:39:16Z) - Generative Modeling through the Semi-dual Formulation of Unbalanced
Optimal Transport [9.980822222343921]
We propose a novel generative model based on the semi-dual formulation of Unbalanced Optimal Transport (UOT).
Unlike OT, UOT relaxes the hard constraint on distribution matching. This approach provides better robustness against outliers, stability during training, and faster convergence.
Our model outperforms existing OT-based generative models, achieving FID scores of 2.97 on CIFAR-10 and 6.36 on CelebA-HQ-256.
arXiv Detail & Related papers (2023-05-24T06:31:05Z) - GeONet: a neural operator for learning the Wasserstein geodesic [13.468026138183623]
We present GeONet, a mesh-invariant deep neural operator network that learns the non-linear mapping from the input pair of initial and terminal distributions to the Wasserstein geodesic connecting the two endpoint distributions.
We demonstrate that GeONet achieves comparable testing accuracy to the standard OT solvers on simulation examples and the MNIST dataset with considerably reduced inference-stage computational cost by orders of magnitude.
arXiv Detail & Related papers (2022-09-28T21:55:40Z) - Learning Optimal Transport Between two Empirical Distributions with
Normalizing Flows [12.91637880428221]
We propose to leverage the flexibility of neural networks to learn an approximate optimal transport map.
We show that a particular instance of invertible neural networks, namely the normalizing flows, can be used to approximate the solution of this OT problem.
arXiv Detail & Related papers (2022-07-04T08:08:47Z) - Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z) - Truncated tensor Schatten p-norm based approach for spatiotemporal
traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers (ADMM).
arXiv Detail & Related papers (2022-05-19T08:37:56Z) - Neural Optimal Transport [82.2689844201373]
We present a novel neural-networks-based algorithm to compute optimal transport maps and plans for strong and weak transport costs.
We prove that neural networks are universal approximators of transport plans between probability distributions.
arXiv Detail & Related papers (2022-01-28T16:24:13Z) - Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2
Benchmark [133.46066694893318]
We evaluate the performance of neural network-based solvers for optimal transport.
We find that existing solvers do not recover optimal transport maps even though they perform well in downstream tasks.
arXiv Detail & Related papers (2021-06-03T15:59:28Z) - Communication-Efficient Distributed Stochastic AUC Maximization with
Deep Neural Networks [50.42141893913188]
We study distributed algorithms for large-scale AUC maximization with a deep neural network as the predictive model.
Our method requires a much smaller number of communication rounds than naive parallel training while retaining theoretical guarantees.
Experiments on several datasets demonstrate the effectiveness of our method and corroborate the theory.
arXiv Detail & Related papers (2020-05-05T18:08:23Z) - TrajectoryNet: A Dynamic Optimal Transport Network for Modeling Cellular
Dynamics [74.43710101147849]
We present TrajectoryNet, which controls the continuous paths taken between distributions to produce dynamic optimal transport.
We show how this is particularly applicable for studying cellular dynamics in data from single-cell RNA sequencing (scRNA-seq) technologies.
arXiv Detail & Related papers (2020-02-09T21:00:38Z) - Deep Residual Flow for Out of Distribution Detection [27.218308616245164]
We present a novel approach that improves upon the state-of-the-art by leveraging an expressive density model based on normalizing flows.
We demonstrate the effectiveness of our method in ResNet and DenseNet architectures trained on various image datasets.
arXiv Detail & Related papers (2020-01-15T16:38:47Z) - Discriminator optimal transport [6.624726878647543]
We show that the discriminator optimization process increases a lower bound of the dual cost function for the Wasserstein distance between the target distribution $p$ and the generator distribution $p_G$.
It implies that the trained discriminator can approximate optimal transport (OT) from $p_G$ to $p$.
We show that it improves the Inception Score and FID of unconditional GANs trained on CIFAR-10 and STL-10, as well as of a publicly available conditional GAN pre-trained on ImageNet.
arXiv Detail & Related papers (2019-10-15T14:47:37Z) - Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data.
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.