G-invariant diffusion maps
- URL: http://arxiv.org/abs/2306.07350v2
- Date: Tue, 25 Jul 2023 13:44:43 GMT
- Title: G-invariant diffusion maps
- Authors: Eitan Rosen and Xiuyuan Cheng and Yoel Shkolnisky
- Abstract summary: We derive diffusion maps that intrinsically account for the group action on the data.
In particular, we construct both equivariant and invariant embeddings which can be used naturally to cluster and align the data points.
- Score: 8.271859911016719
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The diffusion maps embedding of data lying on a manifold has shown success
in tasks ranging from dimensionality reduction and clustering to data
visualization. In this work, we consider embedding data sets sampled
from a manifold that is closed under the action of a continuous matrix group.
An example of such a data set is images whose planar rotations are arbitrary.
The G-invariant graph Laplacian, introduced in a previous work of the authors,
admits eigenfunctions in the form of tensor products between the elements of
the irreducible unitary representations of the group and eigenvectors of
certain matrices. We employ these eigenfunctions to derive diffusion maps that
intrinsically account for the group action on the data. In particular, we
construct both equivariant and invariant embeddings which can be used naturally
to cluster and align the data points. We demonstrate the effectiveness of our
construction with simulated data.
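For context, the classical (non-invariant) diffusion maps construction that this work extends can be sketched in a few lines; the function name and parameters below are illustrative, not taken from the paper.

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, dim=2, t=1):
    """Classical diffusion maps embedding (Coifman-Lafon style sketch).

    X: (n, d) array of data points; epsilon: Gaussian kernel bandwidth;
    dim: embedding dimension; t: diffusion time.
    """
    # Pairwise squared distances and Gaussian affinity kernel
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq / epsilon)
    # Degrees of the affinity graph
    d = W.sum(axis=1)
    # Eigendecompose the symmetric conjugate S = D^{-1/2} W D^{-1/2}
    # of the Markov matrix P = D^{-1} W, for numerical stability
    S = W / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    # Recover right eigenvectors of P; drop the trivial constant one
    psi = vecs / np.sqrt(d)[:, None]
    return (vals[1:dim + 1] ** t) * psi[:, 1:dim + 1]
```

Embedding a point cloud sampled from a circle, for example, recovers its one-dimensional circular structure in the leading coordinates.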
Related papers
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
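As a minimal illustration of the spectral identity such models rely on (this is not GRASP's sampling pipeline): for the unnormalized Laplacian L = D - A, the adjacency matrix is the negated off-diagonal part of L, so a graph can be rebuilt exactly from sampled eigenvectors and eigenvalues. Names below are illustrative.

```python
import numpy as np

def adjacency_from_spectrum(eigvals, eigvecs):
    """Rebuild a graph from the spectrum of its unnormalized Laplacian.

    Given L = U diag(lam) U^T with L = D - A, the off-diagonal part of
    -L is the adjacency matrix (the diagonal of L holds the degrees).
    """
    L = eigvecs @ np.diag(eigvals) @ eigvecs.T
    A = -L.copy()
    np.fill_diagonal(A, 0.0)           # discard the degree diagonal
    return np.round(A).clip(min=0.0)   # snap noisy values back to {0, 1}
```

In a generative setting the eigenpairs would come from a denoising model rather than an exact decomposition, which is why the rounding step is included.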
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
- Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how much the geometry of the continuous space matches the graph structure.
We introduce a new class of manifold, named the soft manifold, that resolves this mismatch.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z)
- The G-invariant graph Laplacian [12.676094208872842]
We consider data sets whose data points lie on a manifold that is closed under the action of a known unitary Lie group G.
We construct the graph Laplacian by incorporating the distances between all the pairs of points generated by the action of G on the data set.
We show that the G-invariant graph Laplacian (G-GL) converges to the Laplace-Beltrami operator on the data manifold, while enjoying a significantly improved convergence rate.
arXiv Detail & Related papers (2023-03-29T20:07:07Z)
- Diffusion Maps for Group-Invariant Manifolds [1.90365714903665]
We consider the manifold learning problem when the data set is invariant under the action of a compact Lie group $K$.
Our approach consists in augmenting the data-induced graph Laplacian by integrating over the $K$-orbits of the existing data points.
We show that the normalized Laplacian operator $L_N$ converges to the Laplace-Beltrami operator of the data manifold with an improved convergence rate.
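The orbit-integral augmentation can be approximated numerically by averaging a Gaussian affinity kernel over a discretization of the group. The sketch below does this for planar rotations acting on 2-D points; the function names and the uniform quadrature are assumptions for illustration, not the paper's construction.

```python
import numpy as np

def rotate2d(x, theta):
    """Illustrative group action: rotate a 2-D point by angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([c * x[0] - s * x[1], s * x[0] + c * x[1]])

def orbit_averaged_kernel(X, action=rotate2d, n_angles=16, epsilon=1.0):
    """Approximate the K-orbit integral of a Gaussian kernel by
    uniform quadrature over a discretized rotation group."""
    n = len(X)
    W = np.zeros((n, n))
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        Xg = np.array([action(x, theta) for x in X])  # act on every point
        sq = ((X[:, None, :] - Xg[None, :, :]) ** 2).sum(axis=-1)
        W += np.exp(-sq / epsilon)
    return W / n_angles  # averaged affinity, invariant to grid rotations
```

The resulting affinity matrix is symmetric and unchanged when any data point is replaced by a rotated copy (up to the quadrature grid), which is the property the orbit integral is meant to provide.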
arXiv Detail & Related papers (2023-03-28T17:30:35Z)
- Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z)
- The Manifold Scattering Transform for High-Dimensional Point Cloud Data [16.500568323161563]
We present practical schemes for implementing the manifold scattering transform to datasets arising in naturalistic systems.
We show that our methods are effective for signal classification and manifold classification tasks.
arXiv Detail & Related papers (2022-06-21T02:15:00Z)
- The Manifold Hypothesis for Gradient-Based Explanations [55.01671263121624]
Gradient-based explanation algorithms provide perceptually-aligned explanations.
We show that the more a feature attribution is aligned with the tangent space of the data, the more perceptually-aligned it tends to be.
We suggest that explanation algorithms should actively strive to align their explanations with the data manifold.
arXiv Detail & Related papers (2022-06-15T08:49:24Z)
- Laplacian-Based Dimensionality Reduction Including Spectral Clustering, Laplacian Eigenmap, Locality Preserving Projection, Graph Embedding, and Diffusion Map: Tutorial and Survey [5.967999555890417]
We first introduce the adjacency matrix, the definition of the Laplacian matrix, and its interpretation.
Different optimization variants of Laplacian eigenmap and its out-of-sample extension are explained.
Versions of graph embedding, which generalize the Laplacian eigenmap, are then explained.
Finally, diffusion map is introduced which is a method based on Laplacian of data and random walks on the data graph.
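The Laplacian constructions this tutorial starts from can be written down directly from an adjacency matrix; a minimal sketch (the function name is mine, not the survey's):

```python
import numpy as np

def graph_laplacians(A):
    """Unnormalized and symmetric-normalized graph Laplacians of an
    adjacency matrix A, the basic objects surveyed in the tutorial."""
    d = A.sum(axis=1)
    L = np.diag(d) - A                         # L = D - A
    with np.errstate(divide="ignore"):
        dinv = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    L_sym = np.diag(dinv) @ L @ np.diag(dinv)  # L_sym = D^{-1/2} L D^{-1/2}
    return L, L_sym
```

The unnormalized Laplacian has zero row sums (the constant vector is in its kernel), and the eigenvalues of the symmetric-normalized Laplacian lie in [0, 2]; these are the spectral facts the eigenmap and diffusion map methods build on.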
arXiv Detail & Related papers (2021-06-03T22:10:40Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.