Revisiting the Role of Euler Numerical Integration on Acceleration and
Stability in Convex Optimization
- URL: http://arxiv.org/abs/2102.11537v1
- Date: Tue, 23 Feb 2021 08:05:05 GMT
- Title: Revisiting the Role of Euler Numerical Integration on Acceleration and
Stability in Convex Optimization
- Authors: Peiyuan Zhang, Antonio Orvieto, Hadi Daneshmand, Thomas Hofmann, Roy
Smith
- Abstract summary: We propose a novel ordinary differential equation that questions the connection between acceleration and the quality of the integrator.
Although semi-implicit methods are well-known in numerical analysis to enjoy many desirable features for the integration of physical systems, our findings show that these properties do not necessarily relate to acceleration.
- Score: 25.171367662119113
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Viewing optimization methods as numerical integrators for ordinary
differential equations (ODEs) provides a thought-provoking modern framework for
studying accelerated first-order optimizers. In this literature, acceleration
is often supposed to be linked to the quality of the integrator (accuracy,
energy preservation, symplecticity). In this work, we propose a novel ordinary
differential equation that questions this connection: both the explicit and the
semi-implicit (a.k.a. symplectic) Euler discretizations of this ODE lead to an
accelerated algorithm for convex programming. Although semi-implicit methods
are well-known in numerical analysis to enjoy many desirable features for the
integration of physical systems, our findings show that these properties do not
necessarily relate to acceleration.
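The distinction between the two discretizations can be sketched on a generic momentum ODE, x' = v, v' = -gamma*v - grad f(x) (an illustrative system, not the paper's novel ODE): explicit Euler advances both variables from the old state, while semi-implicit Euler reuses the freshly updated velocity in the position step.

```python
import numpy as np

# Illustrative convex quadratic f(x) = 0.5 * x^T A x (not the paper's novel ODE)
A = np.diag([1.0, 10.0])

def grad_f(x):
    return A @ x

h, gamma = 0.05, 3.0  # step size and friction, chosen here for illustration

def explicit_euler(x, v, steps):
    # Both variables advance from the *old* state.
    for _ in range(steps):
        x, v = x + h * v, v - h * (gamma * v + grad_f(x))
    return x

def semi_implicit_euler(x, v, steps):
    # Velocity updates first; position then uses the *new* velocity.
    for _ in range(steps):
        v = v - h * (gamma * v + grad_f(x))
        x = x + h * v
    return x

x0, v0 = np.array([5.0, 5.0]), np.zeros(2)
x_exp = explicit_euler(x0.copy(), v0.copy(), 500)
x_sym = semi_implicit_euler(x0.copy(), v0.copy(), 500)
```

With these step sizes both schemes drive the iterate toward the minimizer, which is the paper's point: on a suitable ODE, the cheaper explicit scheme accelerates just as the symplectic one does.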
Related papers
- PINNIES: An Efficient Physics-Informed Neural Network Framework to Integral Operator Problems [0.0]
This paper introduces an efficient tensor-vector product technique for the approximation of integral operators within physics-informed deep learning frameworks.
We demonstrate the applicability of this method to both Fredholm and Volterra integral operators.
We also propose a fast matrix-vector product algorithm for efficiently computing the fractional Caputo derivative.
arXiv Detail & Related papers (2024-09-03T13:43:58Z)
- ODE-based Learning to Optimize [28.380622776436905]
We present a comprehensive framework integrating inertial systems with Hessian-driven damping (ISHD).
We formulate a novel learning to optimize (L2O) problem aimed at minimizing the stopping time subject to the convergence and stability condition.
Empirical validation of our framework is conducted through extensive numerical experiments across a diverse set of optimization problems.
arXiv Detail & Related papers (2024-06-04T06:39:45Z)
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
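The fixed-point formulation at the heart of Deep Equilibrium models can be illustrated with a minimal solver: naive Picard iteration on a hypothetical contractive map (practical DEQ implementations use accelerated root-finders such as Anderson acceleration or Broyden's method).

```python
import numpy as np

def fixed_point_solve(f, z0, tol=1e-8, max_iter=1000):
    # Naive Picard iteration z_{k+1} = f(z_k); stops when successive
    # iterates agree to within tol.
    z = z0
    for _ in range(max_iter):
        z_next = f(z)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

# Hypothetical contractive layer z -> tanh(W z) + b, scaled so ||W||_2 = 0.4 < 1,
# which guarantees a unique fixed point by the Banach fixed-point theorem.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3))
W *= 0.4 / np.linalg.norm(W, 2)
b = rng.standard_normal(3)

z_star = fixed_point_solve(lambda z: np.tanh(W @ z) + b, np.zeros(3))
```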
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
- On the connections between optimization algorithms, Lyapunov functions, and differential equations: theory and insights [0.0]
We revisit the framework introduced by Fazlyab et al. to construct Lyapunov functions for optimization algorithms in discrete and continuous time.
For smooth, strongly convex objective functions, we relax the requirements necessary for such a construction.
We prove convergence rates that improve on those available in the literature.
arXiv Detail & Related papers (2023-05-15T14:03:16Z)
- Optimization on manifolds: A symplectic approach [127.54402681305629]
We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms are not only simple and efficient but also applicable to a broad range of contexts.
arXiv Detail & Related papers (2021-07-23T13:43:34Z)
- A Discrete Variational Derivation of Accelerated Methods in Optimization [68.8204255655161]
We introduce variational integrators which allow us to derive different methods for optimization.
We derive two families of optimization methods in one-to-one correspondence.
The preservation of symplecticity of autonomous systems occurs here solely on the fibers.
arXiv Detail & Related papers (2021-06-04T20:21:53Z)
- A Contraction Theory Approach to Optimization Algorithms from Acceleration Flows [1.90365714903665]
We use contraction theory to provide a principled methodology to design and discretize appropriate ODEs.
We propose a novel system of ODEs, namely the Accelerated-Contracting-Nesterov flow.
Remarkably, a simple explicit Euler discretization of this flow corresponds to the Nesterov acceleration method.
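For context, the discrete method such a flow discretizes to is Nesterov's accelerated gradient method. A minimal sketch of its standard form (illustrative only; not the paper's exact Accelerated-Contracting-Nesterov construction):

```python
import numpy as np

def nesterov(grad, x0, lr, steps):
    # Standard Nesterov accelerated gradient for convex f.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        x_next = y - lr * grad(y)  # gradient step at the lookahead point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Ill-conditioned quadratic f(x) = 0.5 * x^T A x with Lipschitz constant L = 100
A = np.diag([1.0, 100.0])
x_min = nesterov(lambda x: A @ x, np.array([1.0, 1.0]), lr=1.0 / 100.0, steps=300)
```

With step size 1/L, this method attains the accelerated O(1/k^2) rate on convex objectives.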
arXiv Detail & Related papers (2021-05-18T21:11:37Z)
- Acceleration Methods [57.202881673406324]
We first use quadratic optimization problems to introduce two key families of acceleration methods.
We discuss momentum methods in detail, starting with the seminal work of Nesterov.
We conclude by discussing restart schemes, a set of simple techniques for reaching nearly optimal convergence rates.
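One of the simplest such schemes is gradient-based adaptive restart, in the spirit of O'Donoghue and Candes: drop the accumulated momentum whenever the momentum term points against the current gradient. A hedged sketch:

```python
import numpy as np

def nesterov_restart(grad, x0, lr, steps):
    # Nesterov's method with gradient-based adaptive restart: reset the
    # momentum coefficient whenever the step opposes the gradient direction.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        g = grad(y)
        x_next = y - lr * g
        if g @ (x_next - x) > 0:  # step points uphill: restart momentum
            t = 1.0
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Strongly convex quadratic where plain momentum tends to oscillate
A = np.diag([1.0, 100.0])
x_min = nesterov_restart(lambda x: A @ x, np.array([1.0, 1.0]), lr=0.01, steps=300)
```

On strongly convex problems such restarts typically recover near-linear convergence without knowing the strong convexity parameter.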
arXiv Detail & Related papers (2021-01-23T17:58:25Z)
- DiffPD: Differentiable Projective Dynamics with Contact [65.88720481593118]
We present DiffPD, an efficient differentiable soft-body simulator with implicit time integration.
We evaluate the performance of DiffPD and observe a speedup of 4-19 times compared to the standard Newton's method in various applications.
arXiv Detail & Related papers (2021-01-15T00:13:33Z)
- Fast and differentiable simulation of driven quantum systems [58.720142291102135]
We introduce a semi-analytic method based on the Dyson expansion that allows us to time-evolve driven quantum systems much faster than standard numerical methods.
We show results of the optimization of a two-qubit gate using transmon qubits in the circuit QED architecture.
arXiv Detail & Related papers (2020-12-16T21:43:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.