Nested Grassmannians for Dimensionality Reduction with Applications
- URL: http://arxiv.org/abs/2010.14589v3
- Date: Tue, 1 Mar 2022 10:33:55 GMT
- Title: Nested Grassmannians for Dimensionality Reduction with Applications
- Authors: Chun-Hao Yang, Baba C. Vemuri
- Abstract summary: We propose a novel framework for constructing a nested sequence of homogeneous Riemannian manifolds.
We focus on applying the proposed framework to the Grassmann manifold, giving rise to the nested Grassmannians (NG).
Specifically, each planar (2D) shape can be represented as a point in the complex projective space, which is a complex Grassmann manifold.
With the proposed NG structure, we develop algorithms for both the supervised and unsupervised dimensionality reduction problems.
- Score: 7.106986689736826
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the recent past, nested structures in Riemannian manifolds have
been studied in the context of dimensionality reduction as an alternative to the
popular principal geodesic analysis (PGA) technique, for example, the principal
nested spheres. In this paper, we propose a novel framework for constructing a
nested sequence of homogeneous Riemannian manifolds. Common examples of
homogeneous Riemannian manifolds include the $n$-sphere, the Stiefel manifold,
the Grassmann manifold and many others. In particular, we focus on applying the
proposed framework to the Grassmann manifold, giving rise to the nested
Grassmannians (NG). An important application in which Grassmann manifolds are
encountered is planar shape analysis. Specifically, each planar (2D) shape can
be represented as a point in the complex projective space, which is a complex
Grassmann manifold. Some salient features of our framework are: (i) it
explicitly exploits the geometry of the homogeneous Riemannian manifolds and
(ii) the nested lower-dimensional submanifolds need not be geodesic. With the
proposed NG structure, we develop algorithms for the supervised and
unsupervised dimensionality reduction problems, respectively. The proposed
algorithms are compared with PGA via simulation studies and real data
experiments and are shown to achieve a higher ratio of expressed variance
compared to PGA.
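As a concrete illustration of the shape representation mentioned in the abstract, the sketch below maps a planar landmark configuration to a representative of a point in complex projective space (Kendall's planar shape space) and computes the Fubini-Study distance between two shapes. This is a minimal NumPy sketch of the standard construction, not the authors' code; the helper names and toy data are ours.

```python
import numpy as np

def shape_to_cp_point(landmarks):
    """Map k planar landmarks (k x 2 array) to a representative of a point
    in complex projective space CP^{k-2} (Kendall's planar shape space)."""
    z = landmarks[:, 0] + 1j * landmarks[:, 1]  # encode (x, y) as x + iy
    z = z - z.mean()                            # remove translation
    return z / np.linalg.norm(z)                # remove scale; rotation is the residual ambiguity

def fubini_study_distance(z1, z2):
    """Geodesic (Fubini-Study) distance between the shapes [z1], [z2];
    the modulus of the Hermitian inner product quotients out rotation."""
    return np.arccos(np.clip(np.abs(np.vdot(z1, z2)), 0.0, 1.0))

# Example: a square vs. a slightly perturbed square
rng = np.random.default_rng(0)
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
noisy = square + 0.05 * rng.standard_normal(square.shape)
d = fubini_study_distance(shape_to_cp_point(square), shape_to_cp_point(noisy))
print(f"shape distance: {d:.4f}")
```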
Related papers
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices, we propose Lie MLR based on the popular bi-invariant metric.
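To make the idea of an MLR on a matrix manifold concrete, here is a hedged NumPy/SciPy sketch of a multinomial logistic regression on SPD matrices under the standard Log-Euclidean metric. The metric choice and all names are our illustrative assumptions; the paper's five families use its own power-deformed metrics.

```python
import numpy as np
from scipy.linalg import logm

def spd_mlr_logits(X_spd, W, b):
    """MLR on SPD matrices via the Log-Euclidean metric: map each SPD
    matrix to its matrix logarithm (a flat space), then apply an
    ordinary linear classifier.
    X_spd: (N, d, d) SPD matrices; W: (C, d*d); b: (C,)."""
    feats = np.stack([logm(X).real.ravel() for X in X_spd])  # (N, d*d)
    return feats @ W.T + b

def softmax(logits):
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Tiny usage example with random SPD matrices and 2 classes
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3, 3))
X = A @ np.transpose(A, (0, 2, 1)) + 3 * np.eye(3)  # make SPD
W, b = rng.standard_normal((2, 9)), np.zeros(2)
print(softmax(spd_mlr_logits(X, W, b)))
```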
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
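As generic intuition for simulating a diffusion on a curved space (not the paper's method), the following sketches a geodesic random walk approximating Brownian motion on a unit hypersphere, using the sphere's exponential map; the step size and dimension are arbitrary choices of ours.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic
    from x in tangent direction v for unit time."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def random_walk_step(x, step, rng):
    """One geodesic random-walk step: sample Gaussian noise, project
    it to the tangent space at x, then map back via the exp map."""
    xi = rng.standard_normal(x.shape)
    v = xi - (xi @ x) * x            # tangent-space projection
    return sphere_exp(x, np.sqrt(step) * v)

rng = np.random.default_rng(0)
x = np.zeros(10); x[0] = 1.0         # a point on S^9
for _ in range(100):
    x = random_walk_step(x, 0.01, rng)
print(np.linalg.norm(x))             # stays ~1.0, i.e. on the sphere
```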
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- A Geometric Insight into Equivariant Message Passing Neural Networks on Riemannian Manifolds [1.0878040851638]
We argue that the metric attached to a coordinate-independent feature field should optimally preserve the principal bundle's original metric.
We obtain a message passing scheme on the manifold by discretizing the diffusion equation flow for a fixed time step.
The discretization of the higher-order diffusion process on a graph yields a new general class of equivariant GNNs.
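The "discretizing the diffusion equation" step can be illustrated, independently of the paper's coordinate-independent feature fields, by an explicit Euler discretization of graph heat diffusion dX/dt = -LX with the combinatorial Laplacian L; one Euler step is a simple non-learned message-passing layer. A minimal sketch under these assumptions:

```python
import numpy as np

def graph_laplacian(adj):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    return np.diag(adj.sum(axis=1)) - adj

def diffusion_message_passing(X, adj, tau=0.1, steps=10):
    """Explicit Euler discretization of dX/dt = -L X. Each step mixes
    every node's features with its neighbors' features."""
    L = graph_laplacian(adj)
    for _ in range(steps):
        X = X - tau * (L @ X)
    return X

# 4-node path graph with 2-d node features
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = np.eye(4)[:, :2]
print(diffusion_message_passing(X, adj))
```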
arXiv Detail & Related papers (2023-10-16T14:31:13Z)
- Decentralized Riemannian Conjugate Gradient Method on the Stiefel Manifold [59.73080197971106]
This paper presents a first-order conjugate optimization method that converges faster than the steepest descent method.
It aims to achieve global convergence over the Stiefel manifold.
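For intuition about first-order optimization on the Stiefel manifold St(n, p) = {X : X^T X = I}, here is a sketch of a single Riemannian gradient-descent step with tangent-space projection and a QR retraction. The paper's decentralized conjugate-gradient scheme is more elaborate; this is only the textbook building block, with an eigenvector-style objective chosen by us.

```python
import numpy as np

def stiefel_tangent_projection(X, G):
    """Project a Euclidean gradient G onto the tangent space at X."""
    sym = (X.T @ G + G.T @ X) / 2
    return G - X @ sym

def qr_retraction(X, V):
    """Retract X + V back onto the Stiefel manifold via thin QR,
    fixing column signs so the retraction is continuous."""
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

# One descent step for f(X) = -trace(X^T A X) (leading eigenvectors)
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)); A = (A + A.T) / 2
X, _ = np.linalg.qr(rng.standard_normal((6, 2)))   # start on St(6, 2)
G = -2 * A @ X                                     # Euclidean gradient
X = qr_retraction(X, -0.1 * stiefel_tangent_projection(X, G))
print(np.allclose(X.T @ X, np.eye(2)))             # still on the manifold
```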
arXiv Detail & Related papers (2023-08-21T08:02:16Z)
- Building Neural Networks on Matrix Manifolds: A Gyrovector Space Approach [8.003578990152945]
We propose new models and layers for building neural networks on SPD and Grassmann manifolds.
We show the effectiveness of our approach in two applications, i.e., human action recognition and knowledge graph completion.
arXiv Detail & Related papers (2023-05-08T09:10:11Z)
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on the case of maps implementing neural networks of practical interest.
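The sequence-of-maps setup can be made concrete with a pullback metric: if a map f carries a metric G on its codomain, the induced metric at x is J(x)^T G J(x), where J is the Jacobian of f; directions that f collapses become degenerate, which is the "singular" geometry in the title. A small numerical sketch of this computation (our illustration, not the paper's construction):

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f: R^n -> R^m at x."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x); dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

def pullback_metric(f, x, G=None):
    """Pull the ambient metric G (Euclidean if None) back through f:
    g(x) = J(x)^T G J(x). Directions collapsed by f become degenerate."""
    J = numerical_jacobian(f, x)
    if G is None:
        G = np.eye(J.shape[0])
    return J.T @ G @ J

# A toy 'layer': a map from R^2 into R^3
f = lambda x: np.array([np.tanh(x[0]), np.tanh(x[0] + x[1]), x[1] ** 2])
print(pullback_metric(f, np.array([0.3, -0.2])))
```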
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
- On Geometric Connections of Embedded and Quotient Geometries in Riemannian Fixed-rank Matrix Optimization [5.876141028192136]
This paper proposes a general procedure for establishing the geometric landscape connections of a Riemannian optimization problem under the embedded and quotient geometries.
We observe an algorithmic connection between two geometries with some specific Riemannian metrics in fixed-rank matrix optimization.
Results provide a few new theoretical insights into unanswered questions in the literature.
arXiv Detail & Related papers (2021-10-23T03:13:56Z)
- Semi-Riemannian Graph Convolutional Networks [36.09315878397234]
We develop a principled Semi-Riemannian GCN that first models data in semi-Riemannian manifolds of constant nonzero curvature.
Our method provides a geometric inductive bias that is sufficiently flexible to model mixed heterogeneous topologies like hierarchical graphs with cycles.
arXiv Detail & Related papers (2021-06-06T14:23:34Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
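The "complete recipe" referenced here parameterizes measure-preserving diffusions by a positive semi-definite matrix D and a skew-symmetric matrix Q; taking D = I and Q = 0 recovers overdamped Langevin dynamics, sketched below as a standard Euclidean special case (not the paper's manifold generalization).

```python
import numpy as np

def langevin_step(z, grad_log_p, eps, rng):
    """Euler-Maruyama step of dz = grad log p(z) dt + sqrt(2) dW, the
    D = I, Q = 0 case of the complete recipe; its stationary law is p."""
    noise = rng.standard_normal(z.shape)
    return z + eps * grad_log_p(z) + np.sqrt(2 * eps) * noise

# Sample from a standard Gaussian target: grad log p(z) = -z
rng = np.random.default_rng(0)
grad_log_p = lambda z: -z
z = np.zeros(2)
samples = []
for _ in range(5000):
    z = langevin_step(z, grad_log_p, 1e-2, rng)
    samples.append(z.copy())
print(np.cov(np.array(samples).T))   # approaches the identity matrix
```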
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised disentangling is possible to that of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z)