An Improved and Generalised Analysis for Spectral Clustering
- URL: http://arxiv.org/abs/2511.23261v1
- Date: Fri, 28 Nov 2025 15:14:27 GMT
- Title: An Improved and Generalised Analysis for Spectral Clustering
- Authors: George Tyler, Luca Zanetti
- Abstract summary: We revisit the theoretical performance of Spectral Clustering, a classical algorithm for graph partitioning. We show that Spectral Clustering works well as long as the smallest eigenvalues appear in groups well separated from the rest of the matrix representation's spectrum. We demonstrate that our results accurately predict the performance of Spectral Clustering on synthetic and real-world data sets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We revisit the theoretical performance of Spectral Clustering, a classical algorithm for graph partitioning that relies on the eigenvectors of a matrix representation of the graph. Informally, we show that Spectral Clustering works well as long as the smallest eigenvalues appear in groups well separated from the rest of the matrix representation's spectrum. This arises, for example, whenever there exists a hierarchy of clusters at different scales, a regime not captured by previous analyses. Our results are very general and can be applied beyond the traditional graph Laplacian. In particular, we study Hermitian representations of digraphs and show Spectral Clustering can recover partitions where edges between clusters are oriented mostly in the same direction. This has applications in, for example, the analysis of trophic levels in ecological networks. We demonstrate that our results accurately predict the performance of Spectral Clustering on synthetic and real-world data sets.
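As a rough illustration of the mechanism the abstract describes, the sketch below embeds vertices using the smallest eigenvectors of the normalised Laplacian and clusters the embedding with k-means, and separately builds one common Hermitian representation of a digraph. The function names, the toy graph, and the specific choices (normalised Laplacian, Ng-Jordan-Weiss row normalisation, farthest-point k-means initialisation, the particular Hermitian encoding) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of Spectral Clustering on an undirected graph (assumed setup,
# not the paper's exact algorithm): embed with the k smallest eigenvectors of
# the normalised Laplacian, then run a plain k-means on the embedding.
import numpy as np

def spectral_embedding(A, k):
    """Embed vertices into R^k via the k smallest eigenvectors of
    L = I - D^{-1/2} A D^{-1/2} (normalised Laplacian)."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(L)           # eigenvalues in ascending order
    X = eigvecs[:, :k]
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # row-normalise (Ng-Jordan-Weiss style)
    return X, eigvals

def kmeans(X, k, n_iter=50):
    """Lloyd iterations with deterministic farthest-point initialisation."""
    centers = [X[0]]
    for _ in range(k - 1):
        dists = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(dists)])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

def hermitian_adjacency(A):
    """One common Hermitian digraph representation: H[u,v] = i if only u->v,
    H[u,v] = -i if only v->u, and 1 if both directions are present."""
    both = (A > 0) & (A.T > 0)
    fwd = (A > 0) & ~both
    return both.astype(complex) + 1j * fwd - 1j * fwd.T

# Toy graph: two triangles joined by a single bridge edge, so two clear
# clusters and a group of two small eigenvalues separated from the rest.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1.0
X, eigvals = spectral_embedding(A, k=2)
labels = kmeans(X, k=2)
```

In this toy example the first two eigenvalues form a group well separated from the third, the regime under which the abstract says Spectral Clustering is guaranteed to work; the Hermitian construction makes a real eigendecomposition-friendly matrix out of a directed graph, so the same embed-then-cluster recipe applies.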
Related papers
- Datacube segmentation via Deep Spectral Clustering [76.48544221010424]
Extended Vision techniques often pose a challenge in their interpretation.
The huge dimensionality of data cube spectra poses a complex task in its statistical interpretation.
In this paper, we explore the possibility of applying unsupervised clustering methods in encoded space.
A statistical dimensional reduction is performed by an ad hoc trained (Variational) AutoEncoder, while the clustering process is performed by a (learnable) iterative K-Means clustering algorithm.
arXiv Detail & Related papers (2024-01-31T09:31:28Z)
- Spectral Clustering of Attributed Multi-relational Graphs [11.486261673963392]
Graph clustering aims at discovering a natural grouping of the nodes such that similar nodes are assigned to a common cluster.
We propose SpectralMix, a joint dimensionality reduction technique for multi-relational graphs with categorical node attributes.
arXiv Detail & Related papers (2023-11-03T11:05:29Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Hodge-Aware Contrastive Learning [101.56637264703058]
Simplicial complexes prove effective in modeling data with multiway dependencies.
We develop a contrastive self-supervised learning approach for processing simplicial data.
arXiv Detail & Related papers (2023-09-14T00:40:07Z)
- A Tighter Analysis of Spectral Clustering, and Beyond [9.759415650727892]
This work studies the classical spectral clustering algorithm which embeds the vertices of some graph $G=(V_G, E_G)$ into $\mathbb{R}^k$ using $k$ eigenvectors of some matrix of $G$.
Our first result is a tighter analysis of the performance of spectral clustering, and explains why it works under a much weaker condition than the ones studied in the literature.
For the second result, we show that, by applying fewer than $k$ eigenvectors to construct the embedding, spectral clustering is able to produce better output for many practical instances.
arXiv Detail & Related papers (2022-08-02T20:18:07Z)
- Perfect Spectral Clustering with Discrete Covariates [68.8204255655161]
We propose a spectral algorithm that achieves perfect clustering with high probability on a class of large, sparse networks.
Our method is the first to offer a guarantee of consistent latent structure recovery using spectral clustering.
arXiv Detail & Related papers (2022-05-17T01:41:06Z)
- On consistency of constrained spectral clustering under representation-aware stochastic block model [20.6072287343024]
We study constrained spectral clustering with the aim of finding balanced clusters in a given similarity graph $\mathcal{G}$.
We develop unnormalized and normalized variants of spectral clustering in this setting.
These algorithms use $\mathcal{R}$ to find clusters in $\mathcal{G}$ that approximately satisfy the proposed constraint.
arXiv Detail & Related papers (2022-03-03T20:41:14Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Strong Consistency, Graph Laplacians, and the Stochastic Block Model [1.2891210250935143]
We study the performance of classical two-step spectral clustering via the graph Laplacian to learn the block model.
We prove that spectral clustering is able to achieve exact recovery of the planted community structure under conditions that match the information-theoretic limits.
arXiv Detail & Related papers (2020-04-21T07:16:46Z)
- Robust spectral clustering using LASSO regularization [0.0]
This paper presents a variant of spectral clustering, called $\ell_1$-spectral clustering, performed on a new random model closely related to the stochastic block model.
Its goal is to promote a sparse eigenbasis solution of an $\ell_1$ minimization problem revealing the natural structure of the graph.
arXiv Detail & Related papers (2020-04-08T07:12:56Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely the Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.