Nested subspace learning with flags
- URL: http://arxiv.org/abs/2502.06022v1
- Date: Sun, 09 Feb 2025 20:29:56 GMT
- Title: Nested subspace learning with flags
- Authors: Tom Szwagier, Xavier Pennec
- Abstract summary: We propose a simple trick to enforce nestedness in subspace learning methods.
We apply the flag trick to several classical machine learning methods and show that it successfully addresses the nestedness issue.
- Abstract: Many machine learning methods look for low-dimensional representations of the data. The underlying subspace can be estimated by first choosing a dimension $q$ and then optimizing a certain objective function over the space of $q$-dimensional subspaces (the Grassmannian). Trying different $q$ yields in general non-nested subspaces, which raises an important issue of consistency between the data representations. In this paper, we propose a simple trick to enforce nestedness in subspace learning methods. It consists in lifting Grassmannian optimization problems to flag manifolds (the space of nested subspaces of increasing dimension) via nested projectors. We apply the flag trick to several classical machine learning methods and show that it successfully addresses the nestedness issue.
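The abstract's core idea can be illustrated with a minimal sketch. The code below is an illustration, not the paper's implementation: it assumes a least-squares (PCA-style) subspace objective, represents a flag by a single shared orthonormal basis `U`, and averages the Grassmannian objective over the nested subspaces spanned by the first $q$ columns of `U`. The names `grassmann_objective` and `flag_objective` are hypothetical; for a general objective one would optimize over the flag manifold (e.g. with Riemannian descent) rather than read the solution off an SVD as done here.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))  # n samples in d = 5 dimensions
X -= X.mean(axis=0)                # center the data

def grassmann_objective(U_q, X):
    # Reconstruction error of X for the subspace spanned by U_q's columns
    # (a standard objective over the Grassmannian Gr(q, d)).
    P = U_q @ U_q.T
    return np.sum((X - X @ P) ** 2)

def flag_objective(U, X, dims=(1, 2, 3)):
    # "Flag trick" sketch: a single orthonormal basis U defines the nested
    # subspaces span(U[:, :q]) for q in dims; average the objective over them.
    return np.mean([grassmann_objective(U[:, :q], X) for q in dims])

# For this least-squares objective the optimal flag is given by the PCA
# basis, since the leading principal subspaces are nested by construction.
U_pca = np.linalg.svd(X, full_matrices=False)[2].T  # columns = principal axes

S1, S2 = U_pca[:, :1], U_pca[:, :2]
# span(S1) is contained in span(S2): S1 is the first column of S2, so
# trying different dimensions yields consistent, nested representations.
```

Note the contrast with the problem the paper describes: optimizing `grassmann_objective` separately for each $q$ would in general return unrelated subspaces, whereas the shared basis in `flag_objective` enforces nestedness by construction.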
Related papers
- Krylov Cubic Regularized Newton: A Subspace Second-Order Method with
Dimension-Free Convergence Rate [83.3933097134767]
We introduce a novel subspace cubic regularized Newton method that achieves a dimension-independent global convergence rate of $O\left(\frac{1}{mk}+\frac{1}{k^2}\right)$.
Our method converges faster than existing random subspace methods, especially for high-dimensional problems.
arXiv Detail & Related papers (2024-01-05T20:24:18Z) - Chordal Averaging on Flag Manifolds and Its Applications [22.357999963733302]
This paper presents a new, provably-convergent algorithm for computing the flag-mean and flag-median of a set of points on a flag manifold under the chordal metric.
arXiv Detail & Related papers (2023-03-23T17:57:28Z) - Subspace Learning Machine (SLM): Methodology and Performance [28.98486923400986]
Subspace learning machine (SLM) is a new classification model inspired by feedforward multilayer perceptron (FF-MLP), decision tree (DT) and extreme learning machine (ELM).
SLM first identifies a discriminant subspace, $S_0$, by examining the discriminant power of each input feature.
It uses probabilistic projections of features in $S_0$ to yield 1D subspaces and finds the optimal partition for each of them.
arXiv Detail & Related papers (2022-05-11T06:44:51Z) - Switch Spaces: Learning Product Spaces with Sparse Gating [48.591045282317424]
We propose Switch Spaces, a data-driven approach for learning representations in product space.
We introduce sparse gating mechanisms that learn to choose, combine and switch spaces.
Experiments on knowledge graph completion and item recommendation show that the proposed switch space achieves new state-of-the-art performance.
arXiv Detail & Related papers (2021-02-17T11:06:59Z) - Stochastic Linear Bandits with Protected Subspace [51.43660657268171]
We study a variant of the linear bandit problem wherein we optimize a linear objective function but rewards are accrued only to an unknown subspace.
In particular, at each round, the learner must choose whether to query the objective or the protected subspace alongside choosing an action.
Our algorithm, derived from the OFUL principle, uses some of the queries to get an estimate of the protected space.
arXiv Detail & Related papers (2020-11-02T14:59:39Z) - The flag manifold as a tool for analyzing and comparing data sets [4.864283334686195]
The shape and orientation of data clouds reflect variability in observations that can confound pattern recognition systems.
We show how nested subspace methods, utilizing the flag manifold, can help to deal with such additional confounding factors.
arXiv Detail & Related papers (2020-06-24T22:29:02Z) - Robust Large-Margin Learning in Hyperbolic Space [64.42251583239347]
We present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space.
We provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples.
We prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees.
arXiv Detail & Related papers (2020-04-11T19:11:30Z) - Learnable Subspace Clustering [76.2352740039615]
We develop a learnable subspace clustering paradigm to efficiently solve the large-scale subspace clustering problem.
The key idea is to learn a parametric function to partition the high-dimensional subspaces into their underlying low-dimensional subspaces.
To the best of our knowledge, this paper is the first work to efficiently cluster millions of data points among the subspace clustering methods.
arXiv Detail & Related papers (2020-04-09T12:53:28Z) - On a minimum enclosing ball of a collection of linear subspaces [16.466372560527827]
This paper concerns the minimax center of a collection of linear subspaces.
We propose and solve an optimization problem parametrized by the rank of the minimax center.
By scaling the objective and penalizing the information lost by the rank-$k$ minimax center, we jointly recover an optimal dimension, $k^*$, and a central subspace, $U^* \in \mathrm{Gr}(k^*, n)$, at the center of the minimum enclosing ball, that best represents the data.
arXiv Detail & Related papers (2020-03-27T15:00:49Z) - Ellipsoidal Subspace Support Vector Data Description [98.67884574313292]
We propose a novel method for transforming data into a low-dimensional space optimized for one-class classification.
We provide both linear and non-linear formulations for the proposed method.
The proposed method is observed to converge much faster than the recently proposed Subspace Support Vector Data Description.
arXiv Detail & Related papers (2020-03-20T21:31:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.