Laplacian-Based Dimensionality Reduction Including Spectral Clustering,
Laplacian Eigenmap, Locality Preserving Projection, Graph Embedding, and
Diffusion Map: Tutorial and Survey
- URL: http://arxiv.org/abs/2106.02154v1
- Date: Thu, 3 Jun 2021 22:10:40 GMT
- Title: Laplacian-Based Dimensionality Reduction Including Spectral Clustering,
Laplacian Eigenmap, Locality Preserving Projection, Graph Embedding, and
Diffusion Map: Tutorial and Survey
- Authors: Benyamin Ghojogh, Ali Ghodsi, Fakhri Karray, Mark Crowley
- Abstract summary: We first introduce the adjacency matrix, the definition of the Laplacian matrix, and the interpretation of the Laplacian.
Different optimization variants of the Laplacian eigenmap and its out-of-sample extension are explained.
Versions of graph embedding, which are generalized versions of the Laplacian eigenmap, are then explained.
Finally, the diffusion map is introduced, a method based on the Laplacian of the data and random walks on the data graph.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This is a tutorial and survey paper on nonlinear dimensionality
reduction and feature extraction methods that are based on the Laplacian of the
graph of data. We first introduce the adjacency matrix, the definition of the
Laplacian matrix, and the interpretation of the Laplacian. Then, we cover graph
cuts and spectral clustering, which applies clustering in a subspace of the
data. Different optimization variants of the Laplacian eigenmap and its
out-of-sample extension are explained. Thereafter, we introduce the locality
preserving projection and its kernel variant as linear special cases of the
Laplacian eigenmap. Versions of graph embedding, which are generalized versions
of the Laplacian eigenmap and locality preserving projection, are then
explained. Finally, the diffusion map is introduced, which is a method based on
the Laplacian of the data and random walks on the data graph.
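As an illustrative sketch of the pipeline the abstract describes (adjacency matrix, graph Laplacian, and the Laplacian eigenmap), the following numpy/scipy snippet builds a Gaussian-kernel adjacency matrix, forms the unnormalized Laplacian L = D - W, and embeds the data via the generalized eigenproblem Lv = λDv. The dense Gaussian kernel and the bandwidth sigma are assumptions chosen for illustration, not details taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))          # 20 sample points in 5 dimensions

# Adjacency (affinity) matrix from a Gaussian (heat) kernel; sigma is an
# illustrative bandwidth, and the fully dense kernel is a simplification.
sigma = 1.0
sq = np.sum(X**2, axis=1)
d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T   # pairwise squared distances
W = np.exp(-d2 / (2 * sigma**2))
np.fill_diagonal(W, 0.0)                        # no self-loops

# Degree matrix and unnormalized Laplacian L = D - W.
D = np.diag(W.sum(axis=1))
L = D - W

# Laplacian eigenmap: solve L v = lam D v and keep the eigenvectors of the
# smallest nonzero eigenvalues (the eigenvalue-0 eigenvector is constant).
vals, vecs = eigh(L, D)
Y = vecs[:, 1:3]                                # 2-dimensional embedding
# Y has shape (20, 2)
```

In practice a sparse k-nearest-neighbor affinity is used instead of the dense kernel so that the eigenproblem scales to large datasets.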
Related papers
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
arXiv Detail & Related papers (2024-02-29T09:26:46Z) - A kernel-based analysis of Laplacian Eigenmaps [1.5229257192293204]
We study the spectral properties of the associated empirical graph Laplacian based on a Gaussian kernel.
Our main results are non-asymptotic error bounds, showing that the eigenvalues and eigenspaces of the empirical graph Laplacian are close to the eigenvalues and eigenspaces of the Laplace-Beltrami operator of $\mathcal{M}$.
arXiv Detail & Related papers (2024-02-26T11:00:09Z) - Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifold, named soft manifold, that can solve this situation.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - Random Projections of Sparse Adjacency Matrices [0.0]
We show that our random projections retain the functionality of their underlying adjacency matrices while having extra properties that make them attractive as dynamic graph representations.
We conclude by characterizing our random projection as a distance-preserving map of adjacency matrices analogous to the usual Johnson-Lindenstrauss map.
arXiv Detail & Related papers (2023-09-04T04:53:52Z) - G-invariant diffusion maps [11.852406625172216]
We derive diffusion maps that intrinsically account for the group action on the data.
In particular, we construct both equivariant and invariant embeddings, which can be used to cluster and align the data points.
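For context, a plain (non-equivariant) diffusion map, which the G-invariant construction above generalizes, can be sketched as follows; the Gaussian affinity and the parameters are illustrative assumptions, not details from the paper.

```python
import numpy as np

def diffusion_map(X, n_components=2, sigma=1.0, t=1):
    """Sketch of a standard diffusion map via the random-walk Laplacian."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    K = np.exp(-d2 / (2 * sigma**2))           # Gaussian affinity kernel
    d = K.sum(axis=1)                          # degrees
    # Symmetrized kernel A = D^{-1/2} K D^{-1/2} has the same eigenvalues as
    # the random-walk transition matrix P = D^{-1} K.
    A = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(A)
    vals, vecs = vals[::-1], vecs[:, ::-1]     # sort eigenvalues descending
    phi = vecs / np.sqrt(d)[:, None]           # right eigenvectors of P
    # Skip the trivial top eigenvector; weight by eigenvalues^t (diffusion time).
    return (vals[1:n_components + 1] ** t) * phi[:, 1:n_components + 1]

X = np.random.default_rng(1).normal(size=(30, 4))
Y = diffusion_map(X)
# Y has shape (30, 2)
```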
arXiv Detail & Related papers (2023-06-12T18:16:33Z) - DiGress: Discrete Denoising diffusion for graph generation [79.13904438217592]
DiGress is a discrete denoising diffusion model for generating graphs with categorical node and edge attributes.
It achieves state-of-the-art performance on molecular and non-molecular datasets, with up to 3x validity improvement.
It is also the first model to scale to the large GuacaMol dataset containing 1.3M drug-like molecules.
arXiv Detail & Related papers (2022-09-29T12:55:03Z) - The Manifold Hypothesis for Gradient-Based Explanations [55.01671263121624]
Gradient-based explanation algorithms provide perceptually-aligned explanations.
We show that the more a feature attribution is aligned with the tangent space of the data, the more perceptually-aligned it tends to be.
We suggest that explanation algorithms should actively strive to align their explanations with the data manifold.
arXiv Detail & Related papers (2022-06-15T08:49:24Z) - Learning Sparse Graph with Minimax Concave Penalty under Gaussian Markov
Random Fields [51.07460861448716]
This paper presents a convex-analytic framework to learn sparse graphs from data.
We show that a triangular convexity decomposition is guaranteed by a transform of the matrix corresponding to its upper part.
arXiv Detail & Related papers (2021-09-17T17:46:12Z) - Grassmannian diffusion maps based dimension reduction and classification
for high-dimensional data [0.0]
A novel nonlinear dimensionality reduction technique is presented that defines the affinity between points through their representation as low-dimensional subspaces corresponding to points on the Grassmann manifold.
The method is designed for applications, such as image recognition and data-based classification, involving high-dimensional data that can be compactly represented in a lower-dimensional subspace.
arXiv Detail & Related papers (2020-09-16T08:32:02Z) - Spatial Pyramid Based Graph Reasoning for Semantic Segmentation [67.47159595239798]
We apply graph convolution into the semantic segmentation task and propose an improved Laplacian.
The graph reasoning is directly performed in the original feature space organized as a spatial pyramid.
We achieve comparable performance with advantages in computational and memory overhead.
arXiv Detail & Related papers (2020-03-23T12:28:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.