Optimization on manifolds: A symplectic approach
- URL: http://arxiv.org/abs/2107.11231v2
- Date: Tue, 4 Jul 2023 16:46:12 GMT
- Title: Optimization on manifolds: A symplectic approach
- Authors: Guilherme França, Alessandro Barp, Mark Girolami, Michael I. Jordan
- Abstract summary: We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms is not only simple and efficient but also applicable to a broad range of contexts.
- Score: 127.54402681305629
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optimization tasks are crucial in statistical machine learning. Recently,
there has been great interest in leveraging tools from dynamical systems to
derive accelerated and robust optimization methods via suitable discretizations
of continuous-time systems. However, these ideas have mostly been limited to
Euclidean spaces and unconstrained settings, or to Riemannian gradient flows.
In this work, we propose a dissipative extension of Dirac's theory of
constrained Hamiltonian systems as a general framework for solving optimization
problems over smooth manifolds, including problems with nonlinear constraints.
We develop geometric/symplectic numerical integrators on manifolds that are
"rate-matching," i.e., preserve the continuous-time rates of convergence. In
particular, we introduce a dissipative RATTLE integrator able to achieve the
optimal convergence rate locally. Our class of (accelerated) algorithms is not
only simple and efficient but also applicable to a broad range of contexts.
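As a rough, hypothetical illustration of the constrained dissipative dynamics the abstract describes (not the paper's exact scheme), the sketch below wraps momentum damping around a RATTLE-style constrained leapfrog step, minimizing the quadratic f(x) = 0.5 x^T A x over the unit sphere; the function name, step size, and damping value are illustrative assumptions.

```python
import numpy as np

def dissipative_rattle_sphere(A, x0, h=0.05, gamma=1.0, steps=4000):
    """Minimize f(x) = 0.5 x^T A x on the unit sphere: momentum damping
    (dissipation) split around a RATTLE-style leapfrog step that keeps
    ||x|| = 1 and the hidden velocity constraint x^T p = 0."""
    x = np.asarray(x0, dtype=float)
    x = x / np.linalg.norm(x)
    p = np.zeros_like(x)
    c = h * h / 2.0
    for _ in range(steps):
        p = np.exp(-gamma * h / 2) * p       # dissipation half-step
        u = x + h * p - c * (A @ x)          # unconstrained drift target
        # solve ||u - c*lam*x|| = 1 for the position-constraint multiplier
        b = u @ x
        disc = b * b - (u @ u - 1.0)
        lam = (b - np.sqrt(disc)) / c        # root closest to zero
        x_new = u - c * lam * x              # lands exactly on the sphere
        p_half = (x_new - x) / h
        # momentum half-kick; mu enforces the tangency condition x^T p = 0
        mu = (2.0 / h) * (x_new @ p_half) - x_new @ (A @ x_new)
        p = p_half - (h / 2) * (A @ x_new + mu * x_new)
        p = np.exp(-gamma * h / 2) * p       # dissipation half-step
        x = x_new
    return x
```

With A = diag(3, 2, 1) the iterates settle near the eigenvector of the smallest eigenvalue, where the constrained minimum value 0.5 is attained.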
Related papers
- A Novel Unified Parametric Assumption for Nonconvex Optimization [53.943470475510196]
Nonconvex optimization is central to machine learning, but general nonconvexity permits only weak convergence guarantees that are too pessimistic compared to what is observed in practice.
We introduce a novel unified parametric assumption for nonconvex optimization.
arXiv Detail & Related papers (2025-02-17T21:25:31Z) - Stochastic interior-point methods for smooth conic optimization with applications [3.294420397461204]
We introduce a stochastic interior-point method (SIPM) framework for general conic optimization, along with four novel SIPM variants.
Under mild assumptions, we establish the global convergence rates of our proposed SIPMs.
Experiments on robust linear regression, multi-task relationship learning, and clustering data streams demonstrate the effectiveness of our approach.
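For intuition only, here is a minimal deterministic log-barrier interior-point sketch on the simplest cone (the nonnegative orthant), not the stochastic conic method of the paper; the function name and parameter values are assumptions.

```python
import numpy as np

def barrier_method(b, t0=1.0, mu=10.0, outer=8, newton=50):
    """Minimize 0.5*||x - b||^2 over x >= 0 by following the log-barrier
    central path: at each outer stage, minimize t*f(x) - sum(log x) with
    damped Newton, then increase t by the factor mu."""
    x = np.ones_like(b, dtype=float)     # strictly feasible start
    t = t0
    for _ in range(outer):
        for _ in range(newton):
            grad = t * (x - b) - 1.0 / x
            hess = t + 1.0 / x**2        # diagonal Hessian of the barrier problem
            step = grad / hess
            alpha = 1.0                  # backtrack to stay inside the cone
            while np.any(x - alpha * step <= 0):
                alpha *= 0.5
            x = x - alpha * step
            if np.linalg.norm(grad) < 1e-8:
                break
        t *= mu
    return x
```

As t grows, the barrier's influence fades and the iterate approaches the true constrained solution max(b, 0) from the interior.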
arXiv Detail & Related papers (2024-12-17T15:06:44Z) - Go With the Flow: Fast Diffusion for Gaussian Mixture Models [13.03355083378673]
Schrödinger Bridges (SB) are diffusion processes that steer, in finite time, a given initial distribution to another final one while minimizing a suitable cost functional.
We propose a parameterization of a set of SB policies for steering a system from one distribution to another.
We showcase the potential of this approach on low-dimensional problems such as image-to-image translation in the latent space of an autoencoder.
arXiv Detail & Related papers (2024-12-12T08:40:22Z) - Structured Regularization for Constrained Optimization on the SPD Manifold [1.1126342180866644]
We introduce a class of structured regularizers, based on symmetric gauge functions, which allow constrained optimization on the SPD manifold to be solved with faster unconstrained methods.
We show that our structured regularizers can be chosen to preserve or induce desirable structure, in particular convexity and "difference of convex" structure.
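A minimal sketch of a spectral regularizer built from a symmetric gauge function (here the l1 norm applied to the log-spectrum), in the spirit of, but not identical to, the paper's construction; the function name and default gauge are illustrative assumptions.

```python
import numpy as np

def gauge_regularizer(X, phi=lambda s: np.sum(np.abs(s))):
    """Evaluate a spectral regularizer phi(log-eigenvalues) of an SPD
    matrix X, where phi is a symmetric gauge function (a permutation- and
    sign-invariant norm).  With the default l1 gauge this penalizes
    deviation of X from the identity matrix."""
    lam = np.linalg.eigvalsh(X)          # real eigenvalues, positive for SPD X
    if np.any(lam <= 0):
        raise ValueError("X must be symmetric positive definite")
    return phi(np.log(lam))
```

Because the regularizer depends only on the spectrum, convexity properties of phi transfer directly to the matrix function, which is the kind of structure preservation the summary refers to.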
arXiv Detail & Related papers (2024-10-12T22:11:22Z) - Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit connections between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z) - Breaking the Convergence Barrier: Optimization via Fixed-Time Convergent
Flows [4.817429789586127]
We introduce an optimization framework for achieving acceleration, based on the notion of fixed-time stability of dynamical systems.
We validate the accelerated convergence properties of the proposed schemes on a range of numerical examples against the state-of-the-art optimization algorithms.
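One well-known family of fixed-time-stable dynamics is the rescaled gradient flow with two homogeneity exponents; the forward-Euler sketch below is an illustration of that idea under assumed parameter values, not the paper's scheme.

```python
import numpy as np

def fixed_time_flow(grad, x0, h=1e-3, steps=20000, a=0.5, b=1.5, tol=1e-6):
    """Forward-Euler discretization of the fixed-time-stable flow
    x' = -g/||g||^(1-a) - g/||g||^(1-b) with a in (0,1) and b > 1.
    The b-term dominates far from the minimizer, the a-term near it, so
    the continuous-time settling time is bounded uniformly in x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        ng = np.linalg.norm(g)
        if ng < tol:                     # gradient small enough: stop
            break
        x = x - h * (g / ng**(1 - a) + g / ng**(1 - b))
    return x
```

On a simple quadratic the same step budget suffices whether the start is far away or nearby, which is the practical signature of fixed-time convergence.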
arXiv Detail & Related papers (2021-12-02T16:04:40Z) - Faster Algorithm and Sharper Analysis for Constrained Markov Decision
Process [56.55075925645864]
The problem of the constrained Markov decision process (CMDP) is investigated, where an agent aims to maximize the expected accumulated discounted reward subject to multiple constraints.
A new primal-dual approach is proposed, with a novel integration of three ingredients: a regularized policy, a dual regularizer, and Nesterov's accelerated gradient descent on the dual.
This is the first demonstration that nonconcave CMDP problems can attain the complexity lower bound of $\mathcal{O}(1/\epsilon)$ for optimization subject to convex constraints.
arXiv Detail & Related papers (2021-10-20T02:57:21Z) - On Constraints in First-Order Optimization: A View from Non-Smooth
Dynamical Systems [99.59934203759754]
We introduce a class of first-order methods for smooth constrained optimization.
A distinctive feature of our approach is that projections or optimizations over the entire feasible set are avoided.
The resulting algorithmic procedure is simple to implement even when constraints are nonlinear.
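A minimal one-halfspace sketch in this spirit: the constraint is imposed on the velocity rather than the position, so each iteration needs only a closed-form projection of the velocity, never a projection onto the feasible set itself; the function name and the parameter alpha are assumptions.

```python
import numpy as np

def velocity_constrained_descent(gradf, a, b, x0, alpha=2.0, h=0.05, steps=2000):
    """First-order method where the constraint a^T x >= b is imposed on
    the VELOCITY: each step picks the velocity closest to -grad f(x)
    subject to a^T v >= -alpha * (a^T x - b), so trajectories may start
    infeasible but cannot leave the feasible set once inside."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        v = -gradf(x)
        slack = a @ x - b                # constraint value g(x); >= 0 when feasible
        lb = -alpha * slack              # velocity lower bound: a^T v >= lb
        if a @ v < lb:                   # constraint active: project the velocity
            v = v + (lb - a @ v) / (a @ a) * a
        x = x + h * v
    return x
```

For a strongly convex objective and one linear constraint this converges to the constrained minimizer while only ever solving a scalar projection per step.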
arXiv Detail & Related papers (2021-07-17T11:45:13Z) - On dissipative symplectic integration with applications to
gradient-based optimization [77.34726150561087]
We propose a geometric framework in which discretizations can be realized systematically.
We show that a generalization of symplectic integrators to nonconservative and, in particular, dissipative Hamiltonian systems is able to preserve rates of convergence up to a controlled error.
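A standard example of such a dissipative ("conformal") symplectic scheme, offered as a hedged illustration rather than the paper's exact integrator: split off the exact momentum-decay flow and compose it with a symplectic-Euler step for the conservative part.

```python
import numpy as np

def conformal_symplectic_euler(gradf, q0, p0, gamma=1.0, h=0.01, steps=5000):
    """Integrate q' = p, p' = -grad f(q) - gamma*p by composing the exact
    dissipation flow p -> exp(-gamma*h)*p with a symplectic-Euler step.
    The map contracts the symplectic form by a constant factor per step,
    which is what lets it preserve continuous-time convergence rates up
    to a controlled error."""
    q = np.asarray(q0, dtype=float)
    p = np.asarray(p0, dtype=float)
    d = np.exp(-gamma * h)
    for _ in range(steps):
        p = d * p                        # exact dissipation flow
        p = p - h * gradf(q)             # symplectic Euler: kick ...
        q = q + h * p                    # ... then drift
    return q, p
```

On a quadratic objective this reproduces the exponential decay of the underlying damped oscillator instead of the slower rate a naive discretization can exhibit.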
arXiv Detail & Related papers (2020-04-15T00:36:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.