Sparse Partial Optimal Transport via Quadratic Regularization
- URL: http://arxiv.org/abs/2508.08476v1
- Date: Mon, 11 Aug 2025 21:22:35 GMT
- Title: Sparse Partial Optimal Transport via Quadratic Regularization
- Authors: Khang Tran, Khoa Nguyen, Anh Nguyen, Thong Huynh, Son Pham, Sy-Hoang Nguyen-Dang, Manh Pham, Bang Vo, Mai Ngoc Tran, Dung Luong
- Abstract summary: Partial Optimal Transport (POT) has emerged as a central tool in various Machine Learning (ML) applications. We propose a novel formulation of POT with quadratic regularization, hence termed quadratic regularized POT (QPOT).
- Score: 3.244176245288102
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Partial Optimal Transport (POT) has recently emerged as a central tool in various Machine Learning (ML) applications. It lifts the stringent assumption of the conventional Optimal Transport (OT) that input measures are of equal masses, which is often not guaranteed in real-world datasets, and thus offers greater flexibility by permitting transport between unbalanced input measures. Nevertheless, existing major solvers for POT commonly rely on entropic regularization for acceleration and thus return dense transport plans, hindering the adoption of POT in various applications that favor sparsity. In this paper, as an alternative approach to the entropic POT formulation in the literature, we propose a novel formulation of POT with quadratic regularization, hence termed quadratic regularized POT (QPOT), which induces sparsity to the transport plan and consequently facilitates the adoption of POT in many applications with sparsity requirements. Extensive experiments on synthetic and CIFAR-10 datasets, as well as real-world applications such as color transfer and domain adaptations, consistently demonstrate the improved sparsity and favorable performance of our proposed QPOT formulation.
Related papers
- Hyperparameter Trajectory Inference with Conditional Lagrangian Optimal Transport [51.56484100374058]
Post-deployment, user preferences can evolve, making initial settings undesirable. We learn, from observed data, how an NN's conditional output distribution changes with its hyperparameters. We construct a surrogate model that approximates the NN at unobserved hyperparameters.
arXiv Detail & Related papers (2026-03-02T11:55:02Z) - Move What Matters: Parameter-Efficient Domain Adaptation via Optimal Transport Flow for Collaborative Perception [8.774658029766988]
FlowAdapt is a parameter-efficient framework grounded in optimal transport theory. We introduce a Wasserstein Greedy Sampling strategy to selectively filter redundant samples. A Progressive Knowledge Transfer module is designed to inject compressed early-stage representations into later stages.
arXiv Detail & Related papers (2026-02-12T04:36:50Z) - Variational Entropic Optimal Transport [67.76725267984578]
We propose Variational Entropic Optimal Transport (VarEOT) for domain translation problems. VarEOT is based on an exact variational reformulation of the log-partition $\log \mathbb{E}[\exp(\cdot)]$ as a tractable optimization over an auxiliary positive normalizer. Experiments on synthetic data and unpaired image-to-image translation demonstrate competitive or improved translation quality.
arXiv Detail & Related papers (2026-02-02T15:48:44Z) - Implicit Reward as the Bridge: A Unified View of SFT and DPO Connections [65.36449542323277]
We present a unified theoretical framework bridging Supervised Fine-Tuning (SFT) and preference learning in Large Language Model (LLM) post-training. We propose a simple yet effective learning rate reduction approach that yields significant performance improvements.
arXiv Detail & Related papers (2025-06-15T05:42:29Z) - Optimal Transport Adapter Tuning for Bridging Modality Gaps in Few-Shot Remote Sensing Scene Classification [80.83325513157637]
Few-Shot Remote Sensing Scene Classification (FS-RSSC) presents the challenge of classifying remote sensing images with limited labeled samples. We propose a novel Optimal Transport Adapter Tuning (OTAT) framework aimed at constructing an ideal Platonic representational space.
arXiv Detail & Related papers (2025-03-19T07:04:24Z) - Expected Sliced Transport Plans [9.33181953215826]
We propose a "lifting" operation to extend one-dimensional optimal transport plans back to the original space of the measures.
We prove that using the EST plan to weight the sum of the individual Euclidean costs for moving from one point to another results in a valid metric between the input discrete probability measures.
arXiv Detail & Related papers (2024-10-16T02:44:36Z) - Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z) - Optimal Transport with Adaptive Regularisation [14.919246099820548]
Regularising the primal formulation of optimal transport (OT) with a strictly convex term improves numerical tractability but leads to a denser transport plan.
We introduce OT with Adaptive RegularIsation (OTARI), a new formulation of OT that imposes constraints on the mass going into and/or out of each point.
arXiv Detail & Related papers (2023-10-04T16:05:36Z) - Low-rank Optimal Transport: Approximation, Statistics and Debiasing [51.50788603386766]
The low-rank optimal transport (LOT) approach is advocated in Scetbon et al. (2021).
LOT is seen as a legitimate contender to entropic regularization when compared on properties of interest.
We target each of these areas in this paper in order to cement the impact of low-rank approaches in computational OT.
arXiv Detail & Related papers (2022-05-24T20:51:37Z) - BoMb-OT: On Batch of Mini-batches Optimal Transport [23.602237930502948]
Mini-batch optimal transport (m-OT) has been successfully used in practical applications that involve probability measures with intractable density.
We propose a novel mini-batching scheme for optimal transport, named Batch of Mini-batches Optimal Transport (BoMb-OT).
We show that the new mini-batching scheme can estimate a better transportation plan between two original measures than m-OT.
arXiv Detail & Related papers (2021-02-11T09:56:25Z) - Comparing Probability Distributions with Conditional Transport [63.11403041984197]
We propose conditional transport (CT) as a new divergence and approximate it with the amortized CT (ACT) cost.
ACT amortizes the computation of its conditional transport plans and comes with unbiased sample gradients that are straightforward to compute.
On a wide variety of benchmark datasets for generative modeling, substituting the default statistical distance of an existing generative adversarial network with ACT consistently improves performance.
arXiv Detail & Related papers (2020-12-28T05:14:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.