A Granular Grassmannian Clustering Framework via the Schubert Variety of Best Fit
- URL: http://arxiv.org/abs/2512.23766v1
- Date: Mon, 29 Dec 2025 02:28:45 GMT
- Title: A Granular Grassmannian Clustering Framework via the Schubert Variety of Best Fit
- Authors: Karim Salta, Michael Kirby, Chris Peterson
- Abstract summary: In many classification and clustering tasks, it is useful to compute a geometric representative for a dataset or a cluster, such as a mean or median. We introduce a subspace clustering algorithm that replaces subspace means with a trainable prototype defined as a Schubert Variety of Best Fit (SVBF). Integrated into the Linde-Buzo-Gray (LBG) pipeline, this SVBF-LBG scheme yields improved cluster purity on synthetic, image, spectral, and video action data.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In many classification and clustering tasks, it is useful to compute a geometric representative for a dataset or a cluster, such as a mean or median. When datasets are represented by subspaces, these representatives become points on the Grassmann or flag manifold, with distances induced by their geometry, often via principal angles. We introduce a subspace clustering algorithm that replaces subspace means with a trainable prototype defined as a Schubert Variety of Best Fit (SVBF) - a subspace that comes as close as possible to intersecting each cluster member in at least one fixed direction. Integrated into the Linde-Buzo-Gray (LBG) pipeline, this SVBF-LBG scheme yields improved cluster purity on synthetic, image, spectral, and video action data, while retaining the mathematical structure required for downstream analysis.
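The principal-angle distances mentioned in the abstract can be sketched concretely. The snippet below is a generic Grassmannian computation, not the authors' SVBF code; the function names are illustrative. The cosines of the principal angles between two subspaces are the singular values of the product of their orthonormal bases, and the geodesic Grassmann distance is the 2-norm of the angle vector.

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians, ascending) between the column spans of A and B.

    A, B: (n, k) and (n, m) matrices whose columns span the subspaces.
    """
    # Orthonormalize each spanning set.
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

def grassmann_distance(A, B):
    """Geodesic distance on the Grassmannian: l2 norm of the principal angles."""
    return np.linalg.norm(principal_angles(A, B))

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))
# Distance is invariant to the choice of basis: any basis of the same span
# gives (numerically) zero distance to the original.
print(grassmann_distance(X, X @ rng.standard_normal((3, 3))))
```

Distances of this kind are what the LBG-style loop would minimize when assigning subspace-valued data points to prototypes.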
Related papers
- Wasserstein-Aligned Hyperbolic Multi-View Clustering [58.29261653100388]
This paper proposes a novel Wasserstein-Aligned Hyperbolic (WAH) framework for multi-view clustering. Our method exploits a view-specific hyperbolic encoder for each view to embed features into the Lorentz manifold for hierarchical semantic modeling.
arXiv Detail & Related papers (2025-12-10T07:56:19Z)
- Multiscale Grassmann Manifolds for Single-Cell Data Analysis [3.073258665974412]
We propose a multiscale framework that integrates machine learning with subspace geometry for single-cell data analysis. A power-based scale sampling function is introduced to control the selection of scales and balance information across resolutions. Experiments on nine benchmark single-cell RNA-seq datasets demonstrate that the proposed approach effectively preserves meaningful structures.
arXiv Detail & Related papers (2025-11-12T19:47:10Z) - Scalable Context-Preserving Model-Aware Deep Clustering for Hyperspectral Images [51.95768218975529]
Subspace clustering has become widely adopted for the unsupervised analysis of hyperspectral images (HSIs). Recent model-aware deep subspace clustering methods often use a two-stage framework, involving the calculation of a self-representation matrix with complexity of O(n²), followed by spectral clustering. We propose a scalable, context-preserving deep clustering method based on basis representation, which jointly captures local and non-local structures for efficient HSI clustering.
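The O(n²) self-representation step in the two-stage framework above can be illustrated with a minimal sketch, assuming a ridge-regularized self-expressive model; this is a generic textbook construction, not the cited paper's method, and the zero-diagonal step is an after-the-fact approximation rather than a hard constraint.

```python
import numpy as np

def self_expressive_affinity(X, lam=0.1):
    """Ridge-regularized self-representation of the columns of X.

    Solves (X^T X + lam I) C = X^T X, so each sample is reconstructed from
    the others; C is (n, n) -- the O(n^2) cost noted above. The result is
    symmetrized into an affinity matrix suitable for spectral clustering.
    """
    n = X.shape[1]
    G = X.T @ X
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)  # suppress trivial self-reconstruction (approximate)
    return np.abs(C) + np.abs(C).T
```

Spectral clustering on this affinity then recovers the subspace labels; scalable methods replace the full n x n system with a small basis or anchor set.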
arXiv Detail & Related papers (2025-06-12T16:43:09Z) - Structural-Spectral Graph Convolution with Evidential Edge Learning for Hyperspectral Image Clustering [73.01141916544103]
Hyperspectral image (HSI) clustering groups pixels into clusters without labeled data. Most methods rely on superpixel segmentation and perform superpixel-level clustering based on graph neural networks (GNNs). We propose a structural-spectral graph convolutional operator (SSGCO) tailored for graph-structured HSI superpixels to improve representation quality.
arXiv Detail & Related papers (2025-06-11T16:41:34Z) - Distributional Reduction: Unifying Dimensionality Reduction and Clustering with Gromov-Wasserstein [56.62376364594194]
Unsupervised learning aims to capture the underlying structure of potentially large and high-dimensional datasets. In this work, we revisit these approaches under the lens of optimal transport and exhibit relationships with the Gromov-Wasserstein problem. This unveils a new general framework, called distributional reduction, that recovers DR and clustering as special cases and allows addressing them jointly within a single optimization problem.
arXiv Detail & Related papers (2024-02-03T19:00:19Z) - Multilayer Graph Approach to Deep Subspace Clustering [0.0]
Deep subspace clustering (DSC) networks based on the self-expressive model learn a representation matrix, often implemented as a fully connected network.
Here, we apply a selected linear subspace clustering algorithm to learn representations from the representations learned by all layers of the encoder network, including the input data.
We validate the proposed approach on four well-known datasets with two DSC networks as baseline models.
arXiv Detail & Related papers (2024-01-30T14:09:41Z) - Adaptive Graph Convolutional Subspace Clustering [10.766537212211217]
Spectral-type subspace clustering algorithms have shown excellent performance in many subspace clustering applications.
In this paper, inspired by graph convolutional networks, we use the graph convolution technique to develop a feature extraction method and a coefficient matrix constraint simultaneously.
We claim that by using AGCSC, the aggregated feature representation of original data samples is suitable for subspace clustering.
arXiv Detail & Related papers (2023-05-05T10:27:23Z) - Adaptively-weighted Integral Space for Fast Multiview Clustering [54.177846260063966]
We propose an Adaptively-weighted Integral Space for Fast Multiview Clustering (AIMC) with nearly linear complexity.
Specifically, view generation models are designed to reconstruct the view observations from the latent integral space.
Experiments conducted on several real-world datasets confirm the superiority of the proposed AIMC method.
arXiv Detail & Related papers (2022-08-25T05:47:39Z) - flow-based clustering and spectral clustering: a comparison [0.688204255655161]
We study a novel graph clustering method for data with an intrinsic network structure.
We exploit an intrinsic network structure of data to construct Euclidean feature vectors.
Our results indicate that our clustering methods can cope with certain graph structures.
arXiv Detail & Related papers (2022-06-20T21:49:52Z) - Subspace clustering in high-dimensions: Phase transitions \&
Statistical-to-Computational gap [24.073221004661427]
A simple model to study subspace clustering is the high-dimensional $k$-Gaussian mixture model.
We provide an exact characterization of the statistically optimal reconstruction error in this model in the high-dimensional regime with extensive sparsity.
arXiv Detail & Related papers (2022-05-26T17:47:35Z) - Cluster Representatives Selection in Non-Metric Spaces for Nearest
Prototype Classification [4.176752121302988]
In this paper, we present CRS, a novel method for selecting a small yet representative subset of objects as a cluster prototype.
Memory and computationally efficient selection of representatives is enabled by leveraging the similarity graph representation of each cluster created by the NN-Descent algorithm.
CRS can be used in an arbitrary metric or non-metric space because of the graph-based approach, which requires only a pairwise similarity measure.
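Selection from a pairwise similarity matrix alone can be sketched with a greedy coverage rule. This is an illustrative stand-in for the idea, not the published CRS algorithm (which builds on NN-Descent graphs); the function name and the greedy criterion are assumptions.

```python
import numpy as np

def select_representatives(S, m):
    """Greedily pick m representatives from a pairwise similarity matrix S (n x n).

    Each step adds the object that most increases the total
    best-similarity coverage of the whole cluster -- only pairwise
    similarities are needed, so no metric structure is assumed.
    """
    n = S.shape[0]
    chosen = []
    covered = np.zeros(n)  # best similarity of each object to any chosen rep
    for _ in range(m):
        # Marginal gain of adding candidate i: improvement in coverage.
        gains = np.maximum(S, covered).sum(axis=1) - covered.sum()
        gains[chosen] = -np.inf  # never re-pick a representative
        best = int(np.argmax(gains))
        chosen.append(best)
        covered = np.maximum(covered, S[best])
    return chosen
```

With a block-structured similarity matrix (two tight groups, no cross-similarity), two picks land one representative in each group.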
arXiv Detail & Related papers (2021-07-03T04:51:07Z)
- Spatial-Spectral Clustering with Anchor Graph for Hyperspectral Image [88.60285937702304]
This paper proposes a novel unsupervised approach called spatial-spectral clustering with anchor graph (SSCAG) for HSI data clustering.
The proposed SSCAG is competitive against the state-of-the-art approaches.
arXiv Detail & Related papers (2021-04-24T08:09:27Z)