Disentangling the Spectral Properties of the Hodge Laplacian: Not All Small Eigenvalues Are Equal
- URL: http://arxiv.org/abs/2311.14427v2
- Date: Tue, 26 Mar 2024 16:15:40 GMT
- Title: Disentangling the Spectral Properties of the Hodge Laplacian: Not All Small Eigenvalues Are Equal
- Authors: Vincent P. Grande, Michael T. Schaub
- Abstract summary: The Hodge Laplacian has come into focus as a generalisation of the ordinary Laplacian for higher-order graph models such as simplicial and cellular complexes.
We introduce the notion of persistent eigenvector similarity and provide a method to track individual harmonic, curl, and gradient eigenvectors/-values.
We also use our insights to introduce a novel form of Hodge spectral clustering and to classify edges and higher-order simplices.
- Score: 5.079602839359521
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The rich spectral information of the graph Laplacian has been instrumental in graph theory, machine learning, and graph signal processing for applications such as graph classification, clustering, or eigenmode analysis. Recently, the Hodge Laplacian has come into focus as a generalisation of the ordinary Laplacian for higher-order graph models such as simplicial and cellular complexes. Akin to the traditional analysis of graph Laplacians, many authors analyse the smallest eigenvalues of the Hodge Laplacian, which are connected to important topological properties such as homology. However, small eigenvalues of the Hodge Laplacian can carry different information depending on whether they are related to curl or gradient eigenmodes, and thus may not be comparable. We therefore introduce the notion of persistent eigenvector similarity and provide a method to track individual harmonic, curl, and gradient eigenvectors/-values through the so-called persistence filtration, leveraging the full information contained in the Hodge-Laplacian spectrum across all possible scales of a point cloud. Finally, we use our insights (a) to introduce a novel form of Hodge spectral clustering and (b) to classify edges and higher-order simplices based on their relationship to the smallest harmonic, curl, and gradient eigenvectors.
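The abstract above rests on the Hodge decomposition of edge signals into gradient, curl, and harmonic components. As a rough orientation (not the authors' persistence-tracking method), the hedged sketch below assembles the Hodge 1-Laplacian L1 = B1^T B1 + B2 B2^T of a small hand-built simplicial complex from its incidence matrices and labels each eigenvector by type; the toy complex, variable names, and numerical tolerance are illustrative assumptions.

```python
import numpy as np

# Toy simplicial complex: 4 nodes, 5 oriented edges, 1 filled triangle.
# The triangle (0, 1, 3) is filled, so the cycle 0-1-2 remains an unfilled
# hole and contributes one harmonic eigenvector.
edges = [(0, 1), (1, 2), (0, 2), (0, 3), (1, 3)]
triangles = [(0, 1, 3)]
edge_index = {e: j for j, e in enumerate(edges)}

# Node-to-edge incidence B1 (rows: nodes, columns: oriented edges).
B1 = np.zeros((4, len(edges)))
for j, (u, v) in enumerate(edges):
    B1[u, j], B1[v, j] = -1.0, 1.0

# Edge-to-triangle incidence B2: boundary of [a, b, c] is [b,c] - [a,c] + [a,b].
B2 = np.zeros((len(edges), len(triangles)))
for k, (a, b, c) in enumerate(triangles):
    B2[edge_index[(b, c)], k] = 1.0
    B2[edge_index[(a, c)], k] = -1.0
    B2[edge_index[(a, b)], k] = 1.0

# Hodge 1-Laplacian acting on edge signals.
L1 = B1.T @ B1 + B2 @ B2.T
eigvals, eigvecs = np.linalg.eigh(L1)

# Classify each eigenvector: gradient vectors have nonzero divergence (B1 v),
# curl vectors have nonzero curl (B2^T v), harmonic vectors have neither.
# (Assumes gradient and curl eigenvalues do not coincide, as in this example.)
tol = 1e-6
for lam, v in zip(eigvals, eigvecs.T):
    div_norm = np.linalg.norm(B1 @ v)
    curl_norm = np.linalg.norm(B2.T @ v)
    kind = ("harmonic" if max(div_norm, curl_norm) < tol
            else "gradient" if curl_norm < tol
            else "curl")
    print(f"lambda = {lam:5.2f}  ->  {kind}")
```

On this toy complex the spectrum splits into one harmonic eigenvalue (0), one curl eigenvalue, and three gradient eigenvalues; the paper's point is that small eigenvalues of curl and gradient type carry different information and should not be treated interchangeably.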
Related papers
- Towards Stable, Globally Expressive Graph Representations with Laplacian Eigenvectors [29.055130767451036]
We propose a novel method exploiting Laplacian eigenvectors to generate stable and globally expressive graph representations.
Our method deals with numerically close eigenvalues in a smooth fashion, ensuring better robustness against perturbations.
arXiv Detail & Related papers (2024-10-13T06:02:25Z)
- Generation is better than Modification: Combating High Class Homophily Variance in Graph Anomaly Detection [51.11833609431406]
In graph anomaly detection, the homophily distribution differences between classes are significantly greater than those in homophilic and heterophilic graphs.
We introduce a new metric called Class Homophily Variance, which quantitatively describes this phenomenon.
To mitigate its impact, we propose a novel GNN model named Homophily Edge Generation Graph Neural Network (HedGe).
arXiv Detail & Related papers (2024-03-15T14:26:53Z)
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation-invariant model can also handle node features by concatenating them to the eigenvectors of each node. A brief reconstruction sketch follows this entry.
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
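The reconstruction step mentioned in the GRASP entry above (rebuilding the Laplacian and adjacency matrix from sampled eigenvalues and eigenvectors) can be illustrated with a minimal sketch. It assumes the combinatorial Laplacian convention L = D - A and a simple threshold, not the paper's denoising diffusion model; the function name and threshold are hypothetical.

```python
import numpy as np

def graph_from_spectrum(eigvals, eigvecs, threshold=0.5):
    """Rebuild a binary adjacency matrix from (possibly noisy) Laplacian spectra."""
    L = eigvecs @ np.diag(eigvals) @ eigvecs.T   # L = V diag(lambda) V^T
    A = -L
    np.fill_diagonal(A, 0.0)                     # discard the degree diagonal
    A = 0.5 * (A + A.T)                          # symmetrise numerical noise
    return (A > threshold).astype(int)

# Round trip on a small path graph 0-1-2.
A_true = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
L_true = np.diag(A_true.sum(axis=1)) - A_true
lam, V = np.linalg.eigh(L_true)
assert (graph_from_spectrum(lam, V) == A_true).all()
```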
- A kernel-based analysis of Laplacian Eigenmaps [1.5229257192293204]
We study the spectral properties of the empirical graph Laplacian built from a Gaussian kernel on points sampled from a manifold.
Our main results are non-asymptotic error bounds, showing that the eigenvalues and eigenspaces of the empirical graph Laplacian are close to the eigenvalues and eigenspaces of the Laplace-Beltrami operator of $\mathcal{M}$. A small construction sketch follows this entry.
arXiv Detail & Related papers (2024-02-26T11:00:09Z)
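As a companion to the kernel-based entry above, here is a hedged sketch of the object it analyses: an empirical graph Laplacian built from a Gaussian kernel on points sampled near a manifold (a noisy circle below). The bandwidth, sample size, and unnormalised Laplacian convention are assumptions for illustration, not the paper's exact setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample points near a circle, standing in for the manifold M.
theta = rng.uniform(0.0, 2.0 * np.pi, size=300)
X = np.column_stack([np.cos(theta), np.sin(theta)])
X += 0.01 * rng.standard_normal(X.shape)

# Gaussian kernel weights W_ij = exp(-||x_i - x_j||^2 / (2 h^2)).
h = 0.2
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
W = np.exp(-sq_dists / (2.0 * h ** 2))
np.fill_diagonal(W, 0.0)

# Unnormalised empirical graph Laplacian L = D - W; its small eigenvalues and
# eigenvectors approximate (up to scaling) Laplace-Beltrami eigenpairs of M.
L = np.diag(W.sum(axis=1)) - W
eigvals, eigvecs = np.linalg.eigh(L)
print(np.round(eigvals[:5], 3))   # smallest eigenvalues; the first is ~0
```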
- Improving Expressive Power of Spectral Graph Neural Networks with Eigenvalue Correction [55.57072563835959]
Spectral graph neural networks are characterized by their filters, which operate on the eigenvalues of the graph Laplacian.
We propose an eigenvalue correction strategy that can free filters from the constraints of repeated eigenvalue inputs. A small numerical illustration follows this entry.
arXiv Detail & Related papers (2024-01-28T08:12:00Z)
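The constraint addressed by the eigenvalue-correction entry above can be seen in a small, hedged numerical example: a polynomial spectral filter g(lambda) = sum_k theta_k lambda^k necessarily assigns the same response to repeated eigenvalues, whatever its coefficients. The 4-cycle graph and the coefficients below are arbitrary illustrations.

```python
import numpy as np

# Normalised Laplacian of the 4-cycle; its spectrum {0, 1, 1, 2} contains a
# repeated eigenvalue.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt
eigvals = np.linalg.eigvalsh(L)

# A polynomial filter g(lambda) = sum_k theta_k * lambda^k maps equal
# eigenvalues to equal responses, no matter how theta is trained.
theta = np.array([0.3, -1.2, 0.7])                    # arbitrary coefficients
responses = sum(t * eigvals ** k for k, t in enumerate(theta))
print(np.round(eigvals, 3))      # approximately [0, 1, 1, 2]
print(np.round(responses, 3))    # the two eigenvalues equal to 1 share one response
```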
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Stochastic Parallelizable Eigengap Dilation for Large Graph Clustering [12.544602297450533]
We aim to identify clusters of nodes where most edges fall within clusters and only a few edges fall between clusters.
A core step of spectral clustering is performing an eigendecomposition of the corresponding graph Laplacian matrix.
This paper introduces a parallelizable approach to dilating the spectrum in order to accelerate SVD solvers and, in turn, spectral clustering. A baseline spectral-clustering sketch follows this entry.
arXiv Detail & Related papers (2022-07-29T10:13:07Z)
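For context on the entry above, the eigendecomposition it accelerates sits at the core of plain spectral clustering, sketched below without any dilation or parallelisation. The use of scipy's kmeans2, the toy two-clique graph, and the unnormalised Laplacian are assumptions for illustration only.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_clustering(A, k, seed=0):
    """Baseline spectral clustering: k smallest Laplacian eigenvectors + k-means."""
    L = np.diag(A.sum(axis=1)) - A          # unnormalised graph Laplacian
    _, eigvecs = np.linalg.eigh(L)          # the expensive eigendecomposition step
    U = eigvecs[:, :k]                      # k smallest eigenvectors as node features
    _, labels = kmeans2(U, k, minit="++", seed=seed)
    return labels

# Two 4-node cliques joined by a single bridge edge.
A = np.zeros((8, 8))
A[:4, :4] = 1.0
A[4:, 4:] = 1.0
np.fill_diagonal(A, 0.0)
A[3, 4] = A[4, 3] = 1.0
print(spectral_clustering(A, 2))   # nodes 0-3 and 4-7 land in different clusters
```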
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural networks (HGNNs) are a popular technique for the modeling and analysis of heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, HGCL, which introduces two views guided by node attributes and graph topologies, respectively.
In this approach, we adopt distinct, well-suited attribute and topology fusion mechanisms in the two views, which help mine relevant information from attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Sign and Basis Invariant Networks for Spectral Graph Representation Learning [75.18802152811539]
We introduce SignNet and BasisNet -- new neural architectures that are invariant to all requisite symmetries and hence process collections of eigenspaces in a principled manner.
Our networks are theoretically strong for graph representation learning -- they can approximate any spectral graph convolution.
Experiments show the strength of our networks for learning spectral graph filters and learning graph positional encodings. A toy sign-invariance sketch follows this entry.
arXiv Detail & Related papers (2022-02-25T23:11:59Z)
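Sign invariance, the symmetry named in the SignNet/BasisNet entry above, can be demonstrated with a toy numpy version of a sign-invariant map of the form f(v) = rho(phi(v) + phi(-v)); negating an eigenvector leaves the output unchanged. The tiny random single-layer networks below are placeholders, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
n, hidden, out = 8, 16, 4

# Placeholder single-layer networks standing in for phi and rho.
W_phi = rng.standard_normal((n, hidden))
W_rho = rng.standard_normal((hidden, out))
phi = lambda v: np.tanh(v @ W_phi)
rho = lambda z: np.tanh(z @ W_rho)

def sign_invariant(v):
    # f(v) = rho(phi(v) + phi(-v)): replacing v by -v only swaps the two
    # summands, so the output is identical for either sign choice.
    return rho(phi(v) + phi(-v))

v = rng.standard_normal(n)   # stand-in for a Laplacian eigenvector
assert np.allclose(sign_invariant(v), sign_invariant(-v))
```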
- Outlier Detection for Trajectories via Flow-embeddings [2.66418345185993]
We propose a method to detect outliers in empirically observed trajectories on a discretized manifold modeled by a simplicial complex.
Our approach is similar to spectral embeddings such as diffusion maps and Laplacian eigenmaps, which construct embeddings from the eigenvectors of the graph Laplacian associated with low eigenvalues.
We show how this technique can single out trajectories that behave (topologically) differently from typical trajectories, and illustrate the performance of our approach on both synthetic and empirical data. A minimal flow-embedding sketch follows this entry.
arXiv Detail & Related papers (2021-11-25T19:58:48Z)
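The flow-embedding entry above connects back to the Hodge 1-Laplacian from the main abstract: trajectories are written as edge flows and embedded via eigenvectors with low eigenvalues. The self-contained sketch below does this for a hypothetical 4-cycle complex with a chord and no filled triangles; the complex, the example trajectories, and the choice of two eigenvectors are assumptions.

```python
import numpy as np

# Oriented edges of a small complex (a 4-cycle with one chord, no triangles).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
edge_index = {e: j for j, e in enumerate(edges)}

# With no filled triangles the Hodge 1-Laplacian reduces to L1 = B1^T B1.
B1 = np.zeros((4, len(edges)))
for j, (u, v) in enumerate(edges):
    B1[u, j], B1[v, j] = -1.0, 1.0
L1 = B1.T @ B1

def trajectory_to_flow(path):
    """Encode a node path as an orientation-aware edge-flow vector."""
    f = np.zeros(len(edges))
    for u, v in zip(path, path[1:]):
        if (u, v) in edge_index:
            f[edge_index[(u, v)]] += 1.0
        else:
            f[edge_index[(v, u)]] -= 1.0   # traversed against the edge orientation
    return f

# Embed each trajectory with the eigenvectors of the two smallest eigenvalues,
# so trajectories are compared by their (topological) cycle content.
eigvals, eigvecs = np.linalg.eigh(L1)
H = eigvecs[:, :2]
for path in [(0, 1, 2, 3), (0, 2, 3), (0, 2, 1)]:   # hypothetical trajectories
    print(path, np.round(H.T @ trajectory_to_flow(path), 3))
```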