Horoballs and the subgradient method
- URL: http://arxiv.org/abs/2403.15749v2
- Date: Tue, 2 Apr 2024 18:11:09 GMT
- Title: Horoballs and the subgradient method
- Authors: Adrian S. Lewis, Genaro Lopez-Acedo, Adriana Nicolae
- Abstract summary: We consider convex optimization on Hadamard spaces via an iteration in the style of a subgradient algorithm.
Our iteration applies in a general Hadamard space, is framed in the underlying space itself, and relies on horospherical convexity of the objective level sets.
We illustrate our subgradient algorithm on the minimal enclosing ball problem in Hadamard spaces.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To explore convex optimization on Hadamard spaces, we consider an iteration in the style of a subgradient algorithm. Traditionally, such methods assume that the underlying spaces are manifolds and that the objectives are geodesically convex: the methods are described using tangent spaces and exponential maps. By contrast, our iteration applies in a general Hadamard space, is framed in the underlying space itself, and relies instead on horospherical convexity of the objective level sets. For this restricted class of objectives, we prove a complexity result of the usual form. Notably, the complexity does not depend on a lower bound on the space curvature. We illustrate our subgradient algorithm on the minimal enclosing ball problem in Hadamard spaces.
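The minimal enclosing ball problem mentioned in the abstract can be illustrated in the simplest Hadamard space, flat Euclidean space, where the objective $f(x) = \max_i \|x - a_i\|$ is convex and a subgradient at $x$ is the unit vector pointing from the farthest point toward $x$. The following sketch is an assumption-laden illustration of a plain Euclidean subgradient method with diminishing steps, not the paper's general Hadamard-space iteration; the function name and step-size choice are illustrative.

```python
import numpy as np

def min_enclosing_ball_subgradient(points, iters=5000):
    """Subgradient method for f(x) = max_i ||x - a_i||,
    the minimal enclosing ball objective, in Euclidean space
    (the curvature-zero special case of a Hadamard space).
    Illustrative sketch only, not the paper's algorithm."""
    x = points.mean(axis=0)  # start at the centroid
    best_x = x.copy()
    best_f = np.max(np.linalg.norm(points - x, axis=1))
    for k in range(1, iters + 1):
        dists = np.linalg.norm(points - x, axis=1)
        j = int(np.argmax(dists))            # index of the farthest point
        f = dists[j]
        if f < best_f:                       # track the best iterate,
            best_x, best_f = x.copy(), f     # standard for subgradient methods
        g = (x - points[j]) / max(f, 1e-12)  # a unit subgradient of f at x
        x = x - (1.0 / np.sqrt(k)) * g       # diminishing step size 1/sqrt(k)
    return best_x, best_f

# Right triangle with vertices (0,0), (2,0), (0,2): the minimal enclosing
# ball is the circumscribed circle with center (1,1) and radius sqrt(2).
pts = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
center, radius = min_enclosing_ball_subgradient(pts)
```

Since the objective is nonsmooth, the iterates oscillate near the optimum, so the method returns the best point seen rather than the last one; the $1/\sqrt{k}$ step size matches the usual $O(1/\sqrt{k})$ complexity result of the kind the paper proves for horospherically convex objectives.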
Related papers
- Topological Trajectory Classification and Landmark Inference on Simplicial Complexes [5.03315505352304]
We consider the problem of classifying trajectories on a discretised 2-dimensional manifold modelled by a simplicial complex.
We present an algorithm that aims to learn "optimal holes" to distinguish a set of given trajectory classes.
arXiv Detail & Related papers (2024-12-04T09:11:33Z) - Krylov Cubic Regularized Newton: A Subspace Second-Order Method with Dimension-Free Convergence Rate [83.3933097134767]
We introduce a novel subspace cubic regularized Newton method that achieves a dimension-independent global convergence rate of $O\left(\frac{1}{mk}+\frac{1}{k^2}\right)$.
Our method converges faster than existing random subspace methods, especially for high-dimensional problems.
arXiv Detail & Related papers (2024-01-05T20:24:18Z) - Riemannian stochastic optimization methods avoid strict saddle points [68.80251170757647]
We show that the policies under study avoid strict saddle points / submanifolds with probability 1.
This result provides an important sanity check as it shows that, almost always, the limit state of an algorithm can only be a local minimizer.
arXiv Detail & Related papers (2023-11-04T11:12:24Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines the diffusion sampling and Krylov subspace methods.
Specifically, we prove that if the tangent space at a sample denoised by Tweedie's formula forms a Krylov subspace, then CG initialized with the denoised data ensures that the data-consistency update remains in the tangent space.
Our proposed method achieves more than 80 times faster inference time than the previous state-of-the-art method.
arXiv Detail & Related papers (2023-03-10T07:42:49Z) - Hessian Based Smoothing Splines for Manifold Learning [0.228438857884398]
We propose a multidimensional smoothing spline algorithm in the context of manifold learning.
We generalize the bending energy penalty of thin-plate splines to a quadratic form on the Sobolev space of a flat manifold.
The existence and uniqueness of the solution are shown by applying the theory of reproducing kernel Hilbert spaces.
arXiv Detail & Related papers (2023-02-10T02:49:05Z) - Sobolev Spaces, Kernels and Discrepancies over Hyperspheres [4.521119623956821]
This work provides theoretical foundations for kernel methods in the hyperspherical context.
We characterise the native spaces (reproducing kernel Hilbert spaces) and the Sobolev spaces associated with kernels defined over hyperspheres.
Our results have direct consequences for kernel cubature, determining the rate of convergence of the worst case error, and expanding the applicability of cubature algorithms.
arXiv Detail & Related papers (2022-11-16T20:31:38Z) - Continuous percolation in a Hilbert space for a large system of qubits [58.720142291102135]
The percolation transition is defined through the appearance of the infinite cluster.
We show that the exponentially increasing dimensionality of the Hilbert space makes its covering by finite-size hyperspheres inefficient.
Our approach to the percolation transition in compact metric spaces may prove useful for its rigorous treatment in other contexts.
arXiv Detail & Related papers (2022-10-15T13:53:21Z) - Compressed Empirical Measures (in finite dimensions) [4.73194777046253]
We study approaches for compressing the empirical measure in the context of finite dimensional reproducing kernel Hilbert spaces (RKHSs).
A key quantity that controls how large such a coreset has to be is the size of the largest ball around the empirical measure that is contained within the empirical convex set.
arXiv Detail & Related papers (2022-04-19T12:25:41Z) - Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method for a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.