A Particle-Based Algorithm for Distributional Optimization on
\textit{Constrained Domains} via Variational Transport and Mirror Descent
- URL: http://arxiv.org/abs/2208.00587v3
- Date: Wed, 3 Aug 2022 15:58:10 GMT
- Title: A Particle-Based Algorithm for Distributional Optimization on
\textit{Constrained Domains} via Variational Transport and Mirror Descent
- Authors: Dai Hai Nguyen, Tetsuya Sakurai
- Abstract summary: We consider the problem of minimizing an objective functional that admits a variational form and is defined over probability distributions on a constrained domain.
Inspired by the mirror descent algorithm for constrained optimization, we propose an iterative particle-based algorithm named Mirrored Variational Transport (mirrorVT).
We demonstrate the effectiveness of mirrorVT for minimizing functionals over probability distributions on simplex- and Euclidean-ball-constrained domains.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We consider the optimization problem of minimizing an objective functional
that admits a variational form and is defined over probability distributions on a
constrained domain, a setting that poses challenges to both theoretical analysis
and algorithmic design. Inspired by the mirror descent algorithm for
constrained optimization, we propose an iterative particle-based algorithm,
named Mirrored Variational Transport (mirrorVT), which extends the Variational
Transport framework [7] to constrained domains. In particular,
in each iteration, mirrorVT maps particles to an unconstrained dual domain
induced by a mirror map and then approximately performs Wasserstein gradient
descent on the manifold of distributions defined over the dual space by pushing
particles. At the end of each iteration, the particles are mapped back to the original
constrained domain. Through simulated experiments, we demonstrate the
effectiveness of mirrorVT for minimizing functionals over probability
distributions on simplex- and Euclidean-ball-constrained domains. We also
analyze its theoretical properties and characterize its convergence to the
global minimum of the objective functional.
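The iteration described in the abstract (map particles to the dual domain, take a gradient step there, map back) can be sketched for the special case of a potential-energy functional F(mu) = E_mu[V(x)] on the probability simplex with the entropic mirror map. This is an illustrative sketch under assumed choices (the functional F, the mirror map, and all function names are ours, not the authors' implementation):

```python
import numpy as np

def grad_mirror_map(x):
    # Entropic mirror map phi(x) = sum_i x_i log x_i, so nabla phi(x) = 1 + log x.
    return 1.0 + np.log(x)

def grad_conjugate(y):
    # nabla phi*(y) is the softmax map back onto the simplex
    # (constant offsets in y cancel under the normalization).
    z = np.exp(y - y.max(axis=1, keepdims=True))
    return z / z.sum(axis=1, keepdims=True)

def mirror_vt(particles, grad_V, n_iters=200, step=0.1):
    """Schematic mirrorVT iteration for F(mu) = E_mu[V(x)] on the simplex.

    For this F, the Wasserstein gradient at each particle is simply grad_V(x).
    """
    x = particles
    for _ in range(n_iters):
        y = grad_mirror_map(x)       # 1) map particles to the unconstrained dual domain
        y = y - step * grad_V(x)     # 2) approximate Wasserstein gradient step in dual space
        x = grad_conjugate(y)        # 3) map particles back to the constrained domain
    return x
```

With V(x) = ||x - p||^2 for an interior simplex point p, the particles should concentrate near p while remaining on the simplex throughout, since the mirror map enforces the constraint implicitly.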
Related papers
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Reflected Schr\"odinger Bridge for Constrained Generative Modeling [16.72888494254555]
Reflected diffusion models have become the go-to method for large-scale generative models in real-world applications.
We introduce the Reflected Schrödinger Bridge algorithm: an entropy-regularized optimal transport approach tailored to generating data within diverse bounded domains.
Our algorithm yields robust generative modeling in diverse domains, and its scalability is demonstrated in real-world constrained generative modeling through standard image benchmarks.
arXiv Detail & Related papers (2024-01-06T14:39:58Z) - Moreau-Yoshida Variational Transport: A General Framework For Solving Regularized Distributional Optimization Problems [3.038642416291856]
We consider a general optimization problem of minimizing a composite objective functional defined over a class of probability distributions.
We propose a novel method, dubbed Moreau-Yoshida Variational Transport (MYVT), for solving the regularized distributional optimization problem.
arXiv Detail & Related papers (2023-07-31T01:14:42Z) - Efficient displacement convex optimization with particle gradient
descent [57.88860627977882]
Particle gradient descent is widely used to optimize functions of probability measures.
This paper considers particle gradient descent with a finite number of particles and establishes theoretical guarantees for optimizing functions that are displacement convex in measures.
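As a toy illustration of the setting studied here (not the paper's analysis), consider the interaction energy F(mu) = (1/4) E_{x,y~mu} |x - y|^2, which is displacement convex; its Wasserstein gradient at each particle is x minus the empirical mean, so particle gradient descent collapses the particles onto their (preserved) common mean. All names below are illustrative:

```python
import numpy as np

def particle_gradient_descent(particles, wasserstein_grad, n_iters=100, step=0.1):
    """Plain particle gradient descent: each particle follows the estimated
    Wasserstein gradient of the functional at the current empirical measure."""
    x = particles.copy()
    for _ in range(n_iters):
        x = x - step * wasserstein_grad(x)
    return x

def interaction_grad(x):
    # Wasserstein gradient of the displacement-convex interaction energy
    # F(mu) = (1/4) E_{x,y ~ mu} |x - y|^2, evaluated at each particle: x - E[x].
    return x - x.mean(axis=0, keepdims=True)
```

Because the gradient field has zero mean across particles, the empirical mean is invariant, and displacement convexity guarantees convergence of the particle system to that minimizer.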
arXiv Detail & Related papers (2023-02-09T16:35:59Z) - Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling, called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
arXiv Detail & Related papers (2022-10-24T16:54:18Z) - Constrained Maximum Cross-Domain Likelihood for Domain Generalization [14.91361835243516]
Domain generalization aims to learn a generalizable model on multiple source domains, which is expected to perform well on unseen test domains.
In this paper, we propose a novel domain generalization method, which minimizes the KL-divergence between posterior distributions from different domains.
Experiments on four standard benchmark datasets, i.e., Digits-DG, PACS, Office-Home and miniDomainNet, highlight the superior performance of our method.
arXiv Detail & Related papers (2022-10-09T03:41:02Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable
Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretization approaches.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z) - Variational Transport: A Convergent Particle-BasedAlgorithm for Distributional Optimization [106.70006655990176]
A distributional optimization problem arises widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective functional satisfies functional versions of the Polyak-Lojasiewicz (PL) condition (Polyak, 1963) and a smoothness condition, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z) - Efficient constrained sampling via the mirror-Langevin algorithm [9.061408029414455]
We propose a new discretization of the mirror-Langevin diffusion and give a crisp proof of its convergence.
For the task of sampling from a log-concave distribution supported on a compact set, our theoretical results are significantly better than the existing guarantees.
arXiv Detail & Related papers (2020-10-30T11:54:24Z)
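The mirror-Langevin recipe summarized in the last entry (a Langevin step taken in the dual coordinates of a mirror map) can be sketched for sampling on the interval (0, 1) with the logistic mirror map and a simple quadratic potential. This is a generic sketch of the scheme under our own assumed choices, not the paper's exact discretization or guarantees:

```python
import numpy as np

def mirror_langevin_interval(grad_V, n_chains=1000, n_steps=500, h=0.01, seed=0):
    """Schematic mirror-Langevin sampler on (0, 1) with the logistic mirror map
    phi(x) = x log x + (1 - x) log(1 - x), so that nabla phi(x) = logit(x),
    nabla phi*(y) = sigmoid(y), and nabla^2 phi(x) = 1 / (x (1 - x))."""
    rng = np.random.default_rng(seed)
    x = np.full(n_chains, 0.5)                   # start all chains at the centre
    for _ in range(n_steps):
        y = np.log(x / (1.0 - x))                # map to the unconstrained dual space
        noise = np.sqrt(2.0 * h / (x * (1.0 - x)))  # noise scaled by (nabla^2 phi)^{1/2}
        y = y - h * grad_V(x) + noise * rng.standard_normal(n_chains)
        y = np.clip(y, -30.0, 30.0)              # numerical safeguard only
        x = 1.0 / (1.0 + np.exp(-y))             # sigmoid maps back, so x stays in (0, 1)
    return x
```

For a symmetric target such as density proportional to exp(-8 (x - 0.5)^2) on (0, 1), the chains remain inside the constraint set by construction and their empirical mean should settle near 0.5.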
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.