Linear Convergence of the Subspace Constrained Mean Shift Algorithm:
From Euclidean to Directional Data
- URL: http://arxiv.org/abs/2104.14977v1
- Date: Thu, 29 Apr 2021 01:46:35 GMT
- Authors: Yikun Zhang and Yen-Chi Chen
- Abstract summary: We argue that the SCMS algorithm is a special variant of a subspace constrained gradient ascent algorithm with an adaptive step size.
We prove the linear convergence of our proposed directional SCMS algorithm.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper studies linear convergence of the subspace constrained mean shift
(SCMS) algorithm, a well-known algorithm for identifying a density ridge
defined by a kernel density estimator. By arguing that the SCMS algorithm is a
special variant of a subspace constrained gradient ascent (SCGA) algorithm with
an adaptive step size, we derive the linear convergence of such an SCGA algorithm.
While the existing research focuses mainly on density ridges in the Euclidean
space, we generalize density ridges and the SCMS algorithm to directional data.
In particular, we establish the stability theorem of density ridges with
directional data and prove the linear convergence of our proposed directional
SCMS algorithm.
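The abstract's key observation is that an SCMS step is a mean shift move (gradient ascent with an adaptive step size) projected onto the eigenspace of the smallest Hessian eigenvalues of the kernel density estimate. A minimal sketch of one such step with a Gaussian kernel follows; this is an illustration, not the authors' implementation, and the function name, bandwidth, and ridge dimension are assumptions chosen for the example.

```python
import numpy as np

def scms_step(x, data, h=0.5, ridge_dim=1):
    """One subspace constrained mean shift step with a Gaussian kernel.

    The mean shift vector (proportional to the KDE gradient over the
    density) is projected onto the span of the Hessian eigenvectors
    with the smallest eigenvalues, so iterates approach a density ridge.
    """
    diff = data - x                                     # (n, d)
    w = np.exp(-np.sum(diff**2, axis=1) / (2 * h**2))   # Gaussian weights
    m = (w @ data) / w.sum() - x                        # mean shift vector

    # Hessian of the Gaussian KDE at x (up to a positive constant)
    H = (np.einsum('i,ij,ik->jk', w, diff, diff) / h**2
         - w.sum() * np.eye(len(x))) / (len(data) * h**2)

    # Keep the eigenvectors of the (d - ridge_dim) smallest eigenvalues
    _, eigvec = np.linalg.eigh(H)                       # ascending order
    V = eigvec[:, :len(x) - ridge_dim]
    return x + V @ (V.T @ m)
```

Iterating this map from a grid of starting points traces out the estimated ridge; the paper's linear-convergence result concerns exactly this fixed-point iteration near the ridge.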
Related papers
- Robust Kernel Sparse Subspace Clustering [0.0]
We propose, for the first time, a robust kernel sparse subspace clustering (RKSSC) algorithm for data with gross sparse corruption.
In principle, the concept can be applied to other subspace clustering algorithms to make them robust to this type of corruption.
arXiv Detail & Related papers (2024-01-30T14:12:39Z)
- Stability and Generalization of the Decentralized Stochastic Gradient Descent Ascent Algorithm [80.94861441583275]
We investigate the generalization bound of the decentralized stochastic gradient descent ascent (D-SGDA) algorithm.
Our results analyze the impact of different topologies on the generalization of D-SGDA.
We also balance the optimization error with the generalization gap to obtain the optimal population risk in the convex-concave setting.
arXiv Detail & Related papers (2023-10-31T11:27:01Z)
- Distributed Linear Regression with Compositional Covariates [5.085889377571319]
We focus on the distributed sparse penalized linear log-contrast model in massive compositional data.
Two distributed optimization techniques are proposed for solving the two different constrained convex optimization problems.
In the decentralized topology, we introduce a distributed coordinate-wise descent algorithm for obtaining a communication-efficient regularized estimation.
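Coordinate-wise descent updates one coefficient at a time while holding the rest fixed. As a minimal illustration, here is cyclic coordinate descent for a plain lasso objective; this is an assumption-laden stand-in, since the paper's constrained log-contrast model additionally enforces a zero-sum constraint on the coefficients and runs in a distributed setting.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * |.| (the lasso shrinkage rule)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coordinate_descent_lasso(A, b, lam=0.1, n_iter=100):
    """Cyclic coordinate-wise descent for (1/2)||b - Ax||^2 + lam*||x||_1."""
    n, d = A.shape
    x = np.zeros(d)
    col_sq = np.sum(A * A, axis=0)      # squared column norms
    r = b - A @ x                        # running residual
    for _ in range(n_iter):
        for j in range(d):
            r += A[:, j] * x[j]          # remove coordinate j's contribution
            x[j] = soft_threshold(A[:, j] @ r, lam) / col_sq[j]
            r -= A[:, j] * x[j]          # add the updated contribution back
    return x
```

Maintaining the residual `r` incrementally is what makes each coordinate update cheap, which is also why coordinate-wise schemes are attractive for communication-efficient distributed estimation.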
arXiv Detail & Related papers (2023-10-21T11:09:37Z)
- Linearized Wasserstein dimensionality reduction with approximation guarantees [65.16758672591365]
LOT Wassmap is a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space.
We show that LOT Wassmap attains correct embeddings and that the quality improves with increased sample size.
We also show how LOT Wassmap significantly reduces the computational cost when compared to algorithms that depend on pairwise distance computations.
arXiv Detail & Related papers (2023-02-14T22:12:16Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
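The contrast between the naive ball-indicator density and a smooth kernel density can be sketched as follows. Note the hedge: the paper's kernel diffusion density is more sophisticated than the plain Gaussian kernel shown here, which serves only to illustrate the smoothness gap; both function names are illustrative.

```python
import numpy as np

def ball_density(x, data, r=1.0):
    """Naive density: fraction of points inside the radius-r Euclidean ball.

    This is the indicator-function density commonly used in
    density-based clustering (e.g. neighbor counting).
    """
    return np.sum(np.linalg.norm(data - x, axis=1) <= r) / len(data)

def gaussian_density(x, data, h=1.0):
    """A smooth alternative: (unnormalized) Gaussian kernel density."""
    d2 = np.sum((data - x) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2 * h * h)))
```

The ball density jumps discontinuously as points cross the radius, while the smooth kernel varies gradually; an adaptive kernel additionally lets the effective bandwidth change with local sample density.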
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- Algorithms for ridge estimation with convergence guarantees [1.90365714903665]
We consider the extraction of filamentary structure from a point cloud.
The filaments are modeled as ridge lines or higher dimensional ridges of an underlying density.
We propose two novel algorithms and provide theoretical guarantees for their convergence.
arXiv Detail & Related papers (2021-04-26T01:57:04Z)
- The EM Perspective of Directional Mean Shift Algorithm [3.60425753550939]
The directional mean shift (DMS) algorithm is a nonparametric method for pursuing local modes of densities defined by kernel density estimators on the unit hypersphere.
We show that any DMS iteration can be viewed as a generalized Expectation-Maximization (EM) algorithm.
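A directional mean shift step can be sketched as a weighted average of the data followed by renormalization back onto the unit hypersphere. This is a minimal illustration using a von Mises-Fisher kernel with an assumed concentration parameter `kappa`, not the paper's exact formulation.

```python
import numpy as np

def dms_step(x, data, kappa=10.0):
    """One directional mean shift step with a von Mises-Fisher kernel.

    Both x and the rows of data are unit vectors. The weighted mean
    leaves the sphere, so it is projected back by normalization; this
    fixed-point form is what admits the generalized EM interpretation.
    """
    w = np.exp(kappa * (data @ x))    # vMF kernel weights
    y = w @ data                      # weighted mean (off the sphere)
    return y / np.linalg.norm(y)
```

Iterating `dms_step` from a starting direction converges to a local mode of the directional kernel density estimate.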
arXiv Detail & Related papers (2021-01-25T13:17:12Z)
- Linear Convergent Decentralized Optimization with Compression [50.44269451541387]
Existing decentralized algorithms with compression mainly focus on compressing DGD-type algorithms.
Motivated by primal-dual algorithms, this paper proposes LEAD, the first LinEAr convergent Decentralized algorithm with compression.
arXiv Detail & Related papers (2020-07-01T04:35:00Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
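The sketch-and-solve idea behind such methods can be illustrated with the simpler of the two embeddings: a Gaussian embedding compresses the rows of the design matrix, and the ridge problem is solved on the small sketched system. This is a hedged sketch of the generic technique, not the paper's specific algorithm; the function name, default sketch size, and seed are assumptions.

```python
import numpy as np

def sketched_ridge(A, b, lam=1.0, m=None, seed=0):
    """Sketch-and-solve for the L2-regularized least-squares problem
    min_x ||Ax - b||^2 + lam * ||x||^2, using a Gaussian embedding.
    """
    n, d = A.shape
    m = m or 4 * d                            # sketch size (illustrative)
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian embedding
    SA, Sb = S @ A, S @ b                     # compressed system (m rows)
    # Solve the sketched normal equations, which cost O(m d^2) not O(n d^2)
    return np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)
```

Choosing the sketch size proportional to the problem's effective dimension, rather than to d, is the refinement the paper pursues.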
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
- Optimal Randomized First-Order Methods for Least-Squares Problems [56.05635751529922]
This class of algorithms encompasses several randomized methods among the fastest solvers for least-squares problems.
We focus on two classical embeddings, namely, Gaussian projections and subsampled Hadamard transforms.
Our resulting algorithm yields the best complexity known for solving least-squares problems with no condition number dependence.
arXiv Detail & Related papers (2020-02-21T17:45:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.