G-invariant diffusion maps
- URL: http://arxiv.org/abs/2306.07350v3
- Date: Wed, 7 Aug 2024 15:36:15 GMT
- Title: G-invariant diffusion maps
- Authors: Eitan Rosen, Xiuyuan Cheng, Yoel Shkolnisky
- Abstract summary: We derive diffusion maps that intrinsically account for the group action on the data.
In particular, we construct both equivariant and invariant embeddings, which can be used to cluster and align the data points.
- Score: 11.852406625172216
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The diffusion maps embedding of data lying on a manifold has shown success in tasks such as dimensionality reduction, clustering, and data visualization. In this work, we consider embedding data sets that were sampled from a manifold which is closed under the action of a continuous matrix group. An example of such a data set is images whose planar rotations are arbitrary. The G-invariant graph Laplacian, introduced in Part I of this work, admits eigenfunctions in the form of tensor products between the elements of the irreducible unitary representations of the group and eigenvectors of certain matrices. We employ these eigenfunctions to derive diffusion maps that intrinsically account for the group action on the data. In particular, we construct both equivariant and invariant embeddings, which can be used to cluster and align the data points. We demonstrate the utility of our construction in the problem of random computerized tomography.
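As a rough illustration of this pipeline (not the authors' representation-theoretic construction), the sketch below approximates a rotation-invariant kernel by brute-force discretization of the SO(2) orbit of each image and then builds a standard diffusion map from it; all function names and parameters are illustrative.

```python
# Minimal sketch: rotation-invariant diffusion map via orbit averaging.
# This replaces the paper's irreducible-representation machinery with a
# brute-force discretization of the rotation group; illustrative only.
import numpy as np
from scipy.ndimage import rotate

def g_invariant_kernel(images, n_angles=36, eps=1.0):
    """Affinity W[i, j] averaged over planar rotations of image j."""
    n = len(images)
    angles = np.linspace(0, 360, n_angles, endpoint=False)
    W = np.zeros((n, n))
    for j in range(n):
        # Orbit of image j under the discretized rotation group.
        orbit = np.stack([rotate(images[j], a, reshape=False) for a in angles])
        for i in range(n):
            d2 = ((images[i][None] - orbit) ** 2).sum(axis=(1, 2))
            W[i, j] = np.exp(-d2 / eps).mean()   # integrate over the orbit
    return 0.5 * (W + W.T)                       # symmetrize

def diffusion_map(W, dim=2, t=1):
    """Standard diffusion-map embedding from a (here G-invariant) kernel."""
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]  # symmetrized P = D^-1 W
    evals, evecs = np.linalg.eigh(S)
    evals, evecs = evals[::-1], evecs[:, ::-1]         # descending order
    psi = d_inv_sqrt[:, None] * evecs                  # right eigenvectors of P
    return psi[:, 1:dim + 1] * evals[1:dim + 1] ** t   # drop the trivial mode
```

Because the kernel averages over each orbit, the resulting embedding is unchanged when any input image is replaced by a rotated copy, which is the invariance property the paper obtains analytically through the irreducible unitary representations.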
Related papers
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
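A minimal sketch of the reconstruction step this summary describes, assuming the eigenpairs are already in hand (in GRASP they are sampled by a denoising model; here we use exact ones for a round trip); the helper names are illustrative.

```python
# Sketch: rebuild the graph Laplacian from eigenpairs, then read off
# the adjacency matrix. Illustrative only; not GRASP's implementation.
import numpy as np

def laplacian(A):
    return np.diag(A.sum(axis=1)) - A

def adjacency_from_spectrum(evals, evecs, threshold=0.5):
    """Reconstruct A from (possibly approximate) eigenpairs of L = D - A."""
    L_rec = evecs @ np.diag(evals) @ evecs.T     # L = U diag(lambda) U^T
    A_rec = np.diag(np.diag(L_rec)) - L_rec      # A = D - L off the diagonal
    np.fill_diagonal(A_rec, 0.0)
    return (A_rec > threshold).astype(float)     # binarize edges

# Round trip on a small random graph.
rng = np.random.default_rng(0)
A = (rng.random((8, 8)) < 0.3).astype(float)
A = np.triu(A, 1); A = A + A.T                  # symmetric, no self-loops
evals, evecs = np.linalg.eigh(laplacian(A))
assert np.array_equal(adjacency_from_spectrum(evals, evecs), A)
```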
arXiv Detail & Related papers (2024-02-29T09:26:46Z) - Improving embedding of graphs with missing data by soft manifolds [51.425411400683565]
The reliability of graph embeddings depends on how well the geometry of the continuous space matches the graph structure.
We introduce a new class of manifolds, named soft manifolds, that can resolve this mismatch.
Using soft manifold for graph embedding, we can provide continuous spaces to pursue any task in data analysis over complex datasets.
arXiv Detail & Related papers (2023-11-29T12:48:33Z) - A Geometric Insight into Equivariant Message Passing Neural Networks on
Riemannian Manifolds [1.0878040851638]
We argue that the metric attached to a coordinate-independent feature field should optimally preserve the principal bundle's original metric.
We obtain a message passing scheme on the manifold by discretizing the diffusion equation flow for a fixed time step.
The discretization of the higher-order diffusion process on a graph yields a new general class of equivariant GNN.
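A minimal sketch of that discretization, in the simplest setting of scalar node features on a plain graph (the paper's coordinate-independent feature fields and bundle metric are omitted): one explicit-Euler step of the graph heat equation dX/dt = -LX with a fixed time step tau.

```python
# Sketch: message passing as one explicit-Euler step of graph heat
# diffusion, X <- X - tau * L X with L = D - A. Illustrative only.
import numpy as np

def diffusion_step(X, A, tau=0.1):
    """One message-passing layer: each node mixes with its neighbors."""
    L = np.diag(A.sum(axis=1)) - A
    return X - tau * (L @ X)

# Features on a 4-node path graph diffuse toward the mean.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.array([[1.0], [0.0], [0.0], [0.0]])
for _ in range(50):
    X = diffusion_step(X, A)
print(X.ravel())   # approaches the constant vector (mean of initial X)
```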
arXiv Detail & Related papers (2023-10-16T14:31:13Z) - The G-invariant graph Laplacian [12.676094208872842]
We consider data sets whose data points lie on a manifold that is closed under the action of a known unitary Lie group G.
We construct the graph Laplacian by incorporating the distances between all the pairs of points generated by the action of G on the data set.
We show that the G-GL converges to the Laplace-Beltrami operator on the data manifold, while enjoying a significantly improved convergence rate.
arXiv Detail & Related papers (2023-03-29T20:07:07Z) - Diffusion Maps for Group-Invariant Manifolds [1.90365714903665]
We consider the manifold learning problem when the data set is invariant under the action of a compact Lie group $K$.
Our approach consists in augmenting the data-induced graph Laplacian by integrating over the $K$-orbits of the existing data points.
We show that the normalized Laplacian operator $L_N$ converges to the Laplace-Beltrami operator of the data manifold with an improved convergence rate.
arXiv Detail & Related papers (2023-03-28T17:30:35Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - The Manifold Scattering Transform for High-Dimensional Point Cloud Data [16.500568323161563]
We present practical schemes for applying the manifold scattering transform to datasets arising in naturalistic systems.
We show that our methods are effective for signal classification and manifold classification tasks.
arXiv Detail & Related papers (2022-06-21T02:15:00Z) - Laplacian-Based Dimensionality Reduction Including Spectral Clustering,
Laplacian Eigenmap, Locality Preserving Projection, Graph Embedding, and
Diffusion Map: Tutorial and Survey [5.967999555890417]
We first introduce the adjacency matrix, the definition of the Laplacian matrix, and the interpretation of the Laplacian.
Different optimization variants of Laplacian eigenmap and its out-of-sample extension are explained.
Versions of graph embedding, which generalize the Laplacian eigenmap, are then explained.
Finally, the diffusion map is introduced, a method based on the Laplacian of the data and random walks on the data graph.
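As a concrete instance of the Laplacian-based methods the tutorial surveys, here is a minimal Laplacian-eigenmap sketch: build a Gaussian-affinity graph on the data and embed with the smallest nontrivial solutions of the generalized eigenproblem L y = lambda D y; the kernel width and embedding dimension are illustrative.

```python
# Sketch: Laplacian eigenmap via the generalized eigenproblem L y = l D y.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import pdist, squareform

def laplacian_eigenmap(X, dim=2, eps=1.0):
    W = np.exp(-squareform(pdist(X)) ** 2 / eps)   # Gaussian affinities
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W
    # Eigenvalue 0 with the constant eigenvector comes first; skip it.
    evals, evecs = eigh(L, D)
    return evecs[:, 1:dim + 1]

# A noisy circle embeds to (roughly) a circle in 2-D.
rng = np.random.default_rng(1)
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(200, 2))
Y = laplacian_eigenmap(X, dim=2, eps=0.1)
```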
arXiv Detail & Related papers (2021-06-03T22:10:40Z) - LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z) - Generalizing Convolutional Neural Networks for Equivariance to Lie
Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z) - Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers the group action is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)