Diffusion Maps for Group-Invariant Manifolds
- URL: http://arxiv.org/abs/2303.16169v2
- Date: Mon, 3 Apr 2023 16:55:54 GMT
- Title: Diffusion Maps for Group-Invariant Manifolds
- Authors: Paulina Hoyos and Joe Kileel
- Abstract summary: We consider the manifold learning problem when the data set is invariant under the action of a compact Lie group $K$.
Our approach consists in augmenting the data-induced graph Laplacian by integrating over the $K$-orbits of the existing data points.
We show that the normalized Laplacian operator $L_N$ converges to the Laplace-Beltrami operator of the data manifold with an improved convergence rate.
- Score: 1.90365714903665
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this article, we consider the manifold learning problem when the data set
is invariant under the action of a compact Lie group $K$. Our approach consists
in augmenting the data-induced graph Laplacian by integrating over the
$K$-orbits of the existing data points, which yields a $K$-invariant graph
Laplacian $L$. We prove that $L$ can be diagonalized by using the unitary
irreducible representation matrices of $K$, and we provide an explicit formula
for computing its eigenvalues and eigenfunctions. In addition, we show that the
normalized Laplacian operator $L_N$ converges to the Laplace-Beltrami operator
of the data manifold with an improved convergence rate, where the improvement
grows with the dimension of the symmetry group $K$. This work extends the
steerable graph Laplacian framework of Landa and Shkolnisky from the case of
$\operatorname{SO}(2)$ to arbitrary compact Lie groups.
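To make the construction concrete, below is a minimal numerical sketch of an orbit-averaged graph Laplacian for the special case $K = \operatorname{SO}(2)$ acting on $\mathbb{R}^2$ by rotation. The orbit integral over $K$ is approximated by a finite quadrature over equispaced angles, and the normalization shown is one common choice of normalized Laplacian; all function names are illustrative and not taken from the paper.
```python
# Minimal sketch of an orbit-averaged (K-invariant) graph Laplacian for K = SO(2)
# acting on R^2 by rotation. The orbit integral is approximated by a finite
# quadrature over m equispaced angles; all names here are illustrative.
import numpy as np

def so2_orbit(x, m=32):
    """Return m samples of the SO(2)-orbit of a point x in R^2."""
    thetas = np.linspace(0.0, 2.0 * np.pi, m, endpoint=False)
    c, s = np.cos(thetas), np.sin(thetas)
    # Rotations of x at each sampled angle: shape (m, 2)
    return np.stack([c * x[0] - s * x[1], s * x[0] + c * x[1]], axis=1)

def invariant_kernel(X, eps=0.5, m=32):
    """Gaussian kernel averaged over the SO(2)-orbit of the second argument."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for j in range(n):
        orbit_j = so2_orbit(X[j], m)                       # (m, 2)
        # ||x_i - g.x_j||^2 for every data point i and orbit sample g.x_j
        d2 = ((X[:, None, :] - orbit_j[None, :, :]) ** 2).sum(-1)
        W[:, j] = np.exp(-d2 / eps).mean(axis=1)           # quadrature over the orbit
    return W

def normalized_laplacian(W):
    """Random-walk normalized graph Laplacian L_N = I - D^{-1} W."""
    d = W.sum(axis=1)
    return np.eye(W.shape[0]) - W / d[:, None]

# Usage: rotation-invariant data set (noisy circle) and its invariant Laplacian.
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, 200)
X = np.stack([np.cos(angles), np.sin(angles)], axis=1) + 0.05 * rng.normal(size=(200, 2))
L_N = normalized_laplacian(invariant_kernel(X))
evals = np.linalg.eigvals(L_N)
```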
Related papers
- Equivariant Manifold Neural ODEs and Differential Invariants [1.6073704837297416]
We develop a manifestly geometric framework for equivariant manifold neural ordinary differential equations (NODEs).
We use it to analyse their modelling capabilities for symmetric data.
arXiv Detail & Related papers (2024-01-25T12:23:22Z) - Lie Group Decompositions for Equivariant Neural Networks [12.139222986297261]
We show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations.
We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task.
arXiv Detail & Related papers (2023-10-17T16:04:33Z) - G-invariant diffusion maps [11.852406625172216]
We derive diffusion maps that intrinsically account for the group action on the data.
In particular, we construct both equivariant and invariant embeddings, which can be used to cluster and align the data points.
arXiv Detail & Related papers (2023-06-12T18:16:33Z) - The G-invariant graph Laplacian [12.676094208872842]
We consider data sets whose data points lie on a manifold that is closed under the action of a known unitary Lie group G.
We construct the graph Laplacian by incorporating the distances between all the pairs of points generated by the action of G on the data set.
We show that the resulting G-invariant graph Laplacian (G-GL) converges to the Laplace-Beltrami operator on the data manifold, while enjoying a significantly improved convergence rate.
arXiv Detail & Related papers (2023-03-29T20:07:07Z) - Matrix logistic map: fractal spectral distributions and transfer of
chaos [0.0]
We show that for an arbitrary initial ensemble of hermitian random matrices with a continuous level density supported on the interval $[0,1]$, the level density converges to the invariant measure of the logistic map.
This approach generalizes the known model of coupled logistic maps, and allows us to study the transition to chaos in complex networks and multidimensional systems.
arXiv Detail & Related papers (2023-03-10T19:19:56Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras
from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Algebraic Aspects of Boundaries in the Kitaev Quantum Double Model [77.34726150561087]
We provide a systematic treatment of boundaries based on subgroups $K\subseteq G$ with the Kitaev quantum double $D(G)$ model in the bulk.
The boundary sites are representations of a $*$-subalgebra $\Xi\subseteq D(G)$ and we explicate its structure as a strong $*$-quasi-Hopf algebra.
As an application of our treatment, we study patches with boundaries based on $K=G$ horizontally and $K=e$ vertically and show how these could be used in a quantum computer.
arXiv Detail & Related papers (2022-08-12T15:05:07Z) - Hybrid Model-based / Data-driven Graph Transform for Image Coding [54.31406300524195]
We present a hybrid model-based / data-driven approach to encode an intra-prediction residual block.
The first $K$ eigenvectors of a transform matrix are derived from a statistical model, e.g., the asymmetric discrete sine transform (ADST) for stability.
Using WebP as a baseline image codec, experimental results show that our hybrid graph transform achieves better energy compaction than the default discrete cosine transform (DCT) and better stability than the KLT.
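As a rough illustration of the hybrid idea (not the paper's exact construction), the sketch below fixes the first $K$ basis vectors to a DST-VII ("ADST"-style) model and learns the remaining vectors KLT-style from sample statistics restricted to the orthogonal complement; the DST-VII choice and all function names are assumptions.
```python
# Hedged sketch of a hybrid model-based / data-driven block transform:
# the first K basis vectors come from a fixed DST-VII ("ADST"-style) model and
# the remaining N-K are learned, KLT-style, from residual statistics restricted
# to the orthogonal complement of the model part.
import numpy as np

def dst7_basis(N):
    """Orthonormal DST-VII matrix; rows are basis vectors."""
    n = np.arange(N)
    k = np.arange(N)[:, None]
    return (2.0 / np.sqrt(2 * N + 1)) * np.sin(np.pi * (2 * n + 1) * (k + 1) / (2 * N + 1))

def hybrid_transform(residual_blocks, K):
    """First K vectors: DST-VII model; remaining: eigenvectors of the sample
    covariance projected onto the orthogonal complement of the model part."""
    N = residual_blocks.shape[1]
    B_model = dst7_basis(N)[:K]                         # (K, N), fixed/stable part
    C = np.cov(residual_blocks, rowvar=False)           # (N, N) sample covariance
    P = np.eye(N) - B_model.T @ B_model                 # projector onto complement
    evals, evecs = np.linalg.eigh(P @ C @ P)
    B_data = evecs[:, np.argsort(evals)[::-1][: N - K]].T   # (N-K, N), data-driven part
    return np.vstack([B_model, B_data])                 # (N, N) hybrid transform

# Usage with synthetic 1-D residual blocks of length 8.
rng = np.random.default_rng(0)
blocks = rng.normal(size=(1000, 8)).cumsum(axis=1)      # crude smooth-ish residuals
T = hybrid_transform(blocks, K=3)
coeffs = blocks @ T.T                                   # transform coefficients
```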
arXiv Detail & Related papers (2022-03-02T15:36:44Z) - A Practical Method for Constructing Equivariant Multilayer Perceptrons
for Arbitrary Matrix Groups [115.58550697886987]
We provide a completely general algorithm for solving for the equivariant layers of matrix groups.
In addition to recovering solutions from other works as special cases, we construct multilayer perceptrons equivariant to multiple groups that have never been tackled before.
Our approach outperforms non-equivariant baselines, with applications to particle physics and dynamical systems.
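A hedged sketch of the core linear-algebra step, restricted to finite groups: the equivariant weight matrices are exactly the nullspace of the constraints $\rho_{\mathrm{out}}(g)W = W\rho_{\mathrm{in}}(g)$ over a set of group generators. The paper's general algorithm also handles Lie-algebra (infinitesimal) constraints; the names below are illustrative.
```python
# Hedged sketch of solving for equivariant linear layers: compute the nullspace of
# the equivariance constraints rho_out(g) W = W rho_in(g) over a set of generators
# (finite-group case only; names are illustrative).
import numpy as np

def equivariant_basis(gens_in, gens_out, tol=1e-8):
    """Orthonormal basis of weight matrices W with rho_out(g) W = W rho_in(g)."""
    d_in, d_out = gens_in[0].shape[0], gens_out[0].shape[0]
    rows = []
    for Gi, Go in zip(gens_in, gens_out):
        # vec(Go W) - vec(W Gi) = (I (x) Go - Gi^T (x) I) vec(W), column-major vec
        rows.append(np.kron(np.eye(d_in), Go) - np.kron(Gi.T, np.eye(d_out)))
    A = np.vstack(rows)
    _, s, Vt = np.linalg.svd(A)
    null = Vt[np.concatenate([s, np.zeros(Vt.shape[0] - len(s))]) < tol]
    # Reshape each nullspace vector back into a (d_out, d_in) weight matrix.
    return [v.reshape(d_out, d_in, order="F") for v in null]

# Usage: the permutation group S_2 acting on R^2 in both input and output.
# The solution space is 2-dimensional, spanned by the identity and the swap matrix.
swap = np.array([[0.0, 1.0], [1.0, 0.0]])
basis = equivariant_basis([swap], [swap])
```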
arXiv Detail & Related papers (2021-04-19T17:21:54Z) - Linear Time Sinkhorn Divergences using Positive Features [51.50788603386766]
Solving optimal transport with an entropic regularization requires computing an $n\times n$ kernel matrix that is repeatedly applied to a vector.
We propose to use instead ground costs of the form $c(x,y)=-\log\langle\varphi(x),\varphi(y)\rangle$ where $\varphi$ is a map from the ground space onto the positive orthant $\mathbb{R}^r_+$, with $r\ll n$.
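The computational point is that with a factored kernel $K = \Phi_x\Phi_y^\top$ built from nonnegative features, every Sinkhorn matrix-vector product costs $O(nr)$ rather than $O(n^2)$. The sketch below uses Performer-style positive random features as a stand-in for the paper's feature maps; all names and the feature choice are assumptions.
```python
# Hedged sketch of the low-rank trick behind linear-time Sinkhorn: with a kernel
# K[i, j] = <phi(x_i), phi(y_j)> for nonnegative features phi in R^r_+, each
# Sinkhorn matrix-vector product costs O(n r) instead of O(n^2).
import numpy as np

def positive_features(X, r=64, rng=None):
    """Nonnegative random features whose inner products approximate exp(<x, y>)
    (Performer-style positive features; illustrative, not the paper's choice)."""
    rng = np.random.default_rng(rng)
    W = rng.normal(size=(X.shape[1], r))
    return np.exp(X @ W - 0.5 * (X ** 2).sum(1, keepdims=True)) / np.sqrt(r)

def sinkhorn_lowrank(Phi_x, Phi_y, a, b, n_iter=200):
    """Sinkhorn scalings (u, v) for the factored kernel K = Phi_x @ Phi_y.T."""
    u = np.ones(len(a))
    v = np.ones(len(b))
    for _ in range(n_iter):
        # K @ v and K.T @ u in O(n r) via the factorization, never forming K.
        u = a / (Phi_x @ (Phi_y.T @ v))
        v = b / (Phi_y @ (Phi_x.T @ u))
    return u, v

# Usage on two small point clouds with uniform marginals.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(500, 2)), rng.normal(size=(500, 2)) + 1.0
Phi_x, Phi_y = positive_features(X, rng=1), positive_features(Y, rng=1)
a = np.full(len(X), 1.0 / len(X))
b = np.full(len(Y), 1.0 / len(Y))
u, v = sinkhorn_lowrank(Phi_x, Phi_y, a, b)
```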
arXiv Detail & Related papers (2020-06-12T10:21:40Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
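For context, here is a generic Riemannian gradient step on $O(d)$ with a QR retraction. This is a standard baseline rather than the stochastic-flow algorithms proposed in the paper, and the Procrustes-style objective is only an illustrative example.
```python
# Hedged sketch of a generic Riemannian gradient step on the orthogonal group O(d):
# project the Euclidean gradient onto the tangent space at X, then retract back
# onto the manifold with a QR decomposition (not the paper's algorithms).
import numpy as np

def riemannian_step(X, euclid_grad, lr=0.1):
    """One projected-gradient step on O(d) with QR retraction."""
    # Tangent space at X: {X A : A skew-symmetric}; project the gradient there.
    A = X.T @ euclid_grad
    riem_grad = X @ (A - A.T) / 2.0
    Q, R = np.linalg.qr(X - lr * riem_grad)
    # Fix signs so the retraction is well defined (diagonal of R made positive).
    return Q * np.sign(np.diag(R))

# Usage: find the orthogonal matrix closest to a given matrix M (Procrustes-style).
rng = np.random.default_rng(0)
M = rng.normal(size=(5, 5))
X = np.eye(5)
for _ in range(500):
    X = riemannian_step(X, 2.0 * (X - M), lr=0.05)   # gradient of ||X - M||_F^2
```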
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.