Feature Robust Optimal Transport for High-dimensional Data
- URL: http://arxiv.org/abs/2005.12123v4
- Date: Tue, 29 Sep 2020 05:38:13 GMT
- Title: Feature Robust Optimal Transport for High-dimensional Data
- Authors: Mathis Petrovich and Chao Liang and Ryoma Sato and Yanbin Liu and
Yao-Hung Hubert Tsai and Linchao Zhu and Yi Yang and Ruslan Salakhutdinov and
Makoto Yamada
- Abstract summary: We propose feature-robust optimal transport (FROT) for high-dimensional data, which solves high-dimensional OT problems using feature selection to avoid the curse of dimensionality.
We show that the FROT algorithm achieves state-of-the-art performance in real-world semantic correspondence datasets.
- Score: 125.04654605998618
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimal transport is a machine learning problem with applications including
distribution comparison, feature selection, and generative adversarial
networks. In this paper, we propose feature-robust optimal transport (FROT) for
high-dimensional data, which solves high-dimensional OT problems using feature
selection to avoid the curse of dimensionality. Specifically, we find a
transport plan with discriminative features. To this end, we formulate the FROT
problem as a min-max optimization problem. We then propose a convex
formulation of the FROT problem and solve it using a Frank-Wolfe-based
optimization algorithm, whereby the subproblem can be efficiently solved using
the Sinkhorn algorithm. Since FROT finds the transport plan from selected
features, it is robust to noise features. To show the effectiveness of FROT, we
propose using the FROT algorithm for the layer selection problem in deep neural
networks for semantic correspondence. By conducting synthetic and benchmark
experiments, we demonstrate that the proposed method can find a strong
correspondence by determining important layers. We show that the FROT algorithm
achieves state-of-the-art performance in real-world semantic correspondence
datasets.
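The pipeline the abstract describes, a Frank-Wolfe outer loop whose linearized subproblem is an OT problem solved with the Sinkhorn algorithm, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the softmax feature weighting, the step-size schedule, and the parameters `eta` and `eps` are assumptions standing in for the paper's exact convex formulation.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=200):
    """Entropic OT: transport plan between histograms a and b for cost C."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

def frot(a, b, costs, eta=1.0, n_iter=20):
    """Sketch of a FROT-style solver.

    costs: list of per-feature-group cost matrices C_g.
    Outer loop: Frank-Wolfe on the transport plan T. The adversarial
    max over feature weights is smoothed with a softmax, so groups
    with higher transport cost under the current plan get more weight;
    each linearized subproblem is a plain OT problem solved by Sinkhorn.
    """
    T = np.outer(a, b)  # feasible starting plan
    for k in range(n_iter):
        # smoothed worst-case feature weights (softmax over group costs)
        scores = np.array([np.sum(T * C) for C in costs])
        w = np.exp(eta * scores)
        w /= w.sum()
        C_mix = sum(wg * C for wg, C in zip(w, costs))
        S = sinkhorn(a, b, C_mix)      # Frank-Wolfe direction
        gamma = 2.0 / (k + 2.0)        # standard FW step size
        T = (1 - gamma) * T + gamma * S
    return T, w
```

Because each Sinkhorn solution (and the initial outer-product plan) respects the marginals, every Frank-Wolfe iterate remains a valid transport plan, which is why the convex combination step is safe.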
Related papers
- Discovering Preference Optimization Algorithms with and for Large Language Models [50.843710797024805]
Offline preference optimization is a key method for enhancing and controlling the quality of Large Language Model (LLM) outputs.
We perform objective discovery to automatically discover new state-of-the-art preference optimization algorithms without (expert) human intervention.
Experiments demonstrate the state-of-the-art performance of DiscoPOP, a novel algorithm that adaptively blends logistic and exponential losses.
arXiv Detail & Related papers (2024-06-12T16:58:41Z)
- DFWLayer: Differentiable Frank-Wolfe Optimization Layer [33.20550212188619]
Differentiable optimization has received a significant amount of attention due to its foundational role in the domain of machine learning based on neural networks.
This paper proposes a differentiable layer, named Differentiable Frank-Wolfe Layer (DFWLayer), by rolling out the Frank-Wolfe method.
Experimental results demonstrate that the DFWLayer not only attains competitive accuracy in solutions and gradients but also consistently adheres to constraints.
arXiv Detail & Related papers (2023-08-21T15:53:38Z)
- Entropic Neural Optimal Transport via Diffusion Processes [105.34822201378763]
We propose a novel neural algorithm for the fundamental problem of computing the entropic optimal transport (EOT) plan between continuous probability distributions.
Our algorithm is based on the saddle point reformulation of the dynamic version of EOT, which is known as the Schrödinger Bridge problem.
In contrast to the prior methods for large-scale EOT, our algorithm is end-to-end and consists of a single learning step.
arXiv Detail & Related papers (2022-11-02T14:35:13Z)
- Fast and computationally efficient generative adversarial network algorithm for unmanned aerial vehicle-based network coverage optimization [1.2853186701496802]
The challenge of dynamic traffic demand in mobile networks is tackled by moving cells based on unmanned aerial vehicles.
Considering the tremendous potential of unmanned aerial vehicles in the future, we propose a new algorithm for coverage optimization.
The proposed algorithm is implemented based on a conditional generative adversarial neural network, with a unique multilayer sum-pooling loss function.
arXiv Detail & Related papers (2022-03-25T12:13:21Z)
- Non-Gradient Manifold Neural Network [79.44066256794187]
Deep neural network (DNN) generally takes thousands of iterations to optimize via gradient descent.
We propose a novel manifold neural network based on non-gradient optimization.
arXiv Detail & Related papers (2021-06-15T06:39:13Z)
- Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z)
- Efficient Robust Optimal Transport with Application to Multi-Label Classification [12.521494095948068]
We model the feature-feature relationship via a symmetric positive semi-definite Mahalanobis metric in the OT cost function.
We view the resulting optimization problem as a non-linear OT problem, which we solve using the Frank-Wolfe algorithm.
Empirical results on the discriminative learning setting, such as tag prediction and multi-class classification, illustrate the good performance of our approach.
arXiv Detail & Related papers (2020-10-22T16:43:52Z)
- COT-GAN: Generating Sequential Data via Causal Optimal Transport [4.588028371034406]
We introduce COT-GAN, an adversarial algorithm to train implicit generative models for producing sequential data.
The success of the algorithm also relies on a new, improved version of the Sinkhorn divergence which demonstrates less bias in learning.
arXiv Detail & Related papers (2020-06-15T17:37:15Z)
- IVFS: Simple and Efficient Feature Selection for High Dimensional Topology Preservation [33.424663018395684]
We propose a simple and effective feature selection algorithm to enhance sample similarity preservation.
The proposed algorithm is able to well preserve the pairwise distances, as well as topological patterns, of the full data.
arXiv Detail & Related papers (2020-04-02T23:05:00Z)
- Differentiable Top-k Operator with Optimal Transport [135.36099648554054]
The SOFT top-k operator approximates the output of the top-k operation as the solution of an Entropic Optimal Transport (EOT) problem.
We apply the proposed operator to the k-nearest neighbors and beam search algorithms, and demonstrate improved performance.
arXiv Detail & Related papers (2020-02-16T04:57:52Z)
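The SOFT top-k idea summarized above can be sketched with plain Sinkhorn iterations: transport n unit masses onto two targets (0 for "not selected", 1 for "selected") and read the smooth top-k membership off the plan. This is an illustrative sketch of the construction, not the authors' code; the two-target design, the score normalization, and the values of `eps` and `n_iter` are assumptions.

```python
import numpy as np

def soft_top_k(scores, k, eps=0.1, n_iter=300):
    """Sketch of a SOFT-style differentiable top-k operator.

    Solves an entropic OT problem from the n score locations to the
    two targets {0, 1} with masses (n - k)/n and k/n; the plan's
    column toward target 1, rescaled by n, is a smooth indicator
    of membership in the top-k set.
    """
    n = len(scores)
    # normalize scores to [0, 1] so eps has a consistent scale
    s = (scores - scores.min()) / (scores.max() - scores.min() + 1e-12)
    C = (s[:, None] - np.array([0.0, 1.0])[None, :]) ** 2  # squared-distance cost
    a = np.full(n, 1.0 / n)
    b = np.array([(n - k) / n, k / n])
    K = np.exp(-C / eps)
    u = np.ones(n)
    for _ in range(n_iter):  # Sinkhorn iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]
    return n * P[:, 1]  # soft membership in the top-k set
```

For example, `soft_top_k(np.array([0.1, 3.0, 2.0, -1.0]), k=2)` assigns values near 1 to the two largest scores and near 0 to the rest, and the output always sums to approximately k; since every operation is smooth, the operator admits gradients, which is what makes it usable inside k-NN or beam-search pipelines.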
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.