Horospherical Decision Boundaries for Large Margin Classification in
Hyperbolic Space
- URL: http://arxiv.org/abs/2302.06807v3
- Date: Thu, 28 Sep 2023 17:12:16 GMT
- Title: Horospherical Decision Boundaries for Large Margin Classification in
Hyperbolic Space
- Authors: Xiran Fan, Chun-Hao Yang, Baba C. Vemuri
- Abstract summary: We propose a novel large margin classifier based on horospherical decision boundaries, leading to a geodesically convex optimization problem.
We present several experiments depicting the competitive performance of our algorithms in comparison to SOTA.
- Score: 8.901073744693317
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hyperbolic spaces have been quite popular in the recent past for representing
hierarchically organized data. Further, several classification algorithms for
data in these spaces have been proposed in the literature. These algorithms
mainly use either hyperplanes or geodesics as decision boundaries in a large
margin classifier setting, leading to a non-convex optimization problem. In
this paper, we propose a novel large margin classifier based on horospherical
decision boundaries that leads to a geodesically convex optimization problem
that can be optimized using any Riemannian gradient descent technique
guaranteeing a globally optimal solution. We present several experiments
depicting the competitive performance of our classifier in comparison to SOTA.
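As an illustrative sketch only (not the authors' released code), a horospherical decision boundary in the Poincaré ball can be expressed through the Busemann function of an ideal boundary point, whose level sets are exactly horospheres; the ideal point `omega` and offset `b` below are hypothetical parameters that a large margin objective would learn:

```python
import numpy as np

def busemann_poincare(x, omega):
    """Busemann function in the Poincare ball for an ideal point omega
    (||omega|| = 1). Its level sets are horospheres centered at omega."""
    return np.log(np.sum((omega - x) ** 2) / (1.0 - np.sum(x ** 2)))

def horospherical_classify(x, omega, b):
    """Classify by which side of the horosphere {B_omega(x) = b} a point lies on.
    Points nearer the ideal point omega have smaller Busemann values."""
    return 1 if busemann_poincare(x, omega) < b else -1
```

Busemann functions are geodesically convex on hyperbolic space, which is the kind of property that makes a margin objective built on them amenable to Riemannian gradient descent with global optimality guarantees.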
Related papers
- Boosting K-means for Big Data by Fusing Data Streaming with Global Optimization [0.3069335774032178]
K-means clustering is a cornerstone of data mining, but its efficiency deteriorates when confronted with massive datasets.
We propose a novel algorithm that leverages the Variable Neighborhood Search (VNS) metaheuristic to optimize K-means clustering for big data.
arXiv Detail & Related papers (2024-10-18T15:43:34Z)
- Convex Relaxation for Solving Large-Margin Classifiers in Hyperbolic Space [29.564717407207528]
Hyperbolic spaces have been recognized for their outstanding performance in handling hierarchical data.
Previous attempts to solve this problem using a gradient descent approach have failed.
In this paper we show how to effectively approximate the optima using sparse moment relaxations.
arXiv Detail & Related papers (2024-05-27T14:19:53Z) - Differentially Private Optimization with Sparse Gradients [60.853074897282625]
We study differentially private (DP) optimization problems under sparsity of individual gradients.
Building on this, we obtain pure- and approximate-DP algorithms with almost optimal rates for convex optimization with sparse gradients.
arXiv Detail & Related papers (2024-04-16T20:01:10Z) - Semi-Supervised Laplace Learning on Stiefel Manifolds [48.3427853588646]
We develop a Sequential Subspace framework for graph-based semi-supervised learning at low label rates.
Our methods perform well at both extremely low and high label rates.
arXiv Detail & Related papers (2023-07-31T20:19:36Z) - Research on Efficient Fuzzy Clustering Method Based on Local Fuzzy
Granular balls [67.33923111887933]
In this paper, the data is fuzzified iteratively using granular-balls, and the membership degree of each datum considers only the two granular-balls in which it is located.
The formed fuzzy granular-balls set can use more processing methods in the face of different data scenarios.
arXiv Detail & Related papers (2023-03-07T01:52:55Z) - Fuzzy Clustering by Hyperbolic Smoothing [0.0]
We propose a novel method for building fuzzy clusters of large data sets, using a smoothing numerical approach.
The smoothing converts a strongly non-differentiable problem into a sequence of low-dimensional, unconstrained, differentiable subproblems.
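As a minimal sketch of the general technique (not this paper's exact formulation), hyperbolic smoothing replaces a non-differentiable max with a smooth surrogate controlled by a parameter `tau`:

```python
import numpy as np

def smooth_max0(y, tau):
    """Hyperbolic smoothing of max(y, 0):
    phi(y, tau) = (y + sqrt(y^2 + tau^2)) / 2.
    Infinitely differentiable for tau > 0, with phi -> max(y, 0) as tau -> 0;
    the approximation error is at most tau / 2."""
    return 0.5 * (y + np.sqrt(y * y + tau * tau))
```

Driving `tau` toward zero yields a sequence of differentiable subproblems whose solutions approach the optimum of the original non-smooth problem.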
arXiv Detail & Related papers (2022-07-09T12:40:46Z)
- First-Order Algorithms for Min-Max Optimization in Geodesic Metric Spaces [93.35384756718868]
Min-max algorithms have mainly been analyzed in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
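For intuition only, here is the plain Euclidean extragradient step on the toy bilinear saddle problem min_x max_y xy; the Riemannian corrected variant in the paper adapts this look-ahead scheme to curved spaces, and the step size and iteration count below are arbitrary choices:

```python
def extragradient_bilinear(x0, y0, eta=0.3, steps=500):
    """Extragradient on min_x max_y f(x, y) = x * y, whose saddle point is (0, 0).
    The look-ahead half-step gives last-iterate convergence, whereas plain
    simultaneous gradient descent-ascent spirals away from the saddle."""
    x, y = x0, y0
    for _ in range(steps):
        xh = x - eta * y   # half-step: descend in x using grad_x f = y
        yh = y + eta * x   # half-step: ascend in y using grad_y f = x
        x = x - eta * yh   # full step with gradients at the look-ahead point
        y = y + eta * xh
    return x, y
```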
arXiv Detail & Related papers (2022-06-04T18:53:44Z)
- CobBO: Coordinate Backoff Bayesian Optimization [45.53400129323848]
We introduce Coordinate Backoff Bayesian Optimization (CobBO) to capture a smooth approximation of the global landscape.
CobBO finds solutions comparable to or better than other state-of-the-art methods for dimensions ranging from tens to hundreds.
arXiv Detail & Related papers (2021-01-13T15:39:32Z)
- Efficient Methods for Structured Nonconvex-Nonconcave Min-Max Optimization [98.0595480384208]
We propose a generalization of the extragradient algorithm which provably converges to a stationary point.
The algorithm applies not only to Euclidean spaces but also to general $\ell_p$-normed vector spaces.
arXiv Detail & Related papers (2020-10-31T21:35:42Z)
- Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces [63.22864716473051]
We propose a novel BO algorithm which expands (and shifts) the search space over iterations.
We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates.
arXiv Detail & Related papers (2020-09-05T14:24:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.