Building Neural Networks on Matrix Manifolds: A Gyrovector Space Approach
- URL: http://arxiv.org/abs/2305.04560v3
- Date: Mon, 5 Jun 2023 08:14:56 GMT
- Title: Building Neural Networks on Matrix Manifolds: A Gyrovector Space Approach
- Authors: Xuan Son Nguyen, Shuo Yang
- Abstract summary: We propose new models and layers for building neural networks on SPD and Grassmann manifolds.
We show the effectiveness of our approach in two applications, i.e., human action recognition and knowledge graph completion.
- Score: 8.003578990152945
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Matrix manifolds, such as manifolds of Symmetric Positive Definite (SPD)
matrices and Grassmann manifolds, appear in many applications. Recently, by
applying the theory of gyrogroups and gyrovector spaces, a powerful
framework for studying hyperbolic geometry, some works have attempted to
build principled generalizations of Euclidean neural networks on matrix
manifolds. However, because many gyrovector-space concepts, e.g., the inner
product and gyroangles, are still missing for the considered manifolds, the
techniques and mathematical tools provided by these works remain limited
compared to those
developed for studying hyperbolic geometry. In this paper, we generalize some
notions in gyrovector spaces for SPD and Grassmann manifolds, and propose new
models and layers for building neural networks on these manifolds. We show the
effectiveness of our approach in two applications, i.e., human action
recognition and knowledge graph completion.
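For readers new to gyrovector spaces, the core operations the paper generalizes have simple closed forms in the best-known SPD case. The sketch below (plain NumPy/SciPy, an editorial illustration rather than the authors' implementation; the function names are ours) shows the gyroaddition and gyro scalar multiplication commonly associated with the affine-invariant metric on SPD matrices.

```python
import numpy as np
from scipy.linalg import sqrtm, logm, expm

def spd_gyroadd(P, Q):
    """Gyroaddition P (+) Q on the SPD manifold under the
    affine-invariant metric: P^(1/2) Q P^(1/2)."""
    S = sqrtm(P).real  # principal square root of an SPD matrix is SPD
    return S @ Q @ S

def spd_scalar_mul(t, P):
    """Gyro scalar multiplication t (x) P = P^t = expm(t * logm(P))."""
    return expm(t * logm(P)).real

# Tiny check: the identity matrix acts as the gyro zero element.
P = np.array([[2.0, 0.5], [0.5, 1.0]])
assert np.allclose(spd_gyroadd(np.eye(2), P), P)
```

Richer notions such as inner products and gyroangles are exactly what the paper adds on top of primitives like these.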
Related papers
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices we propose Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
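To make "SPD MLR" concrete, here is a hedged toy sketch of the simplest flat case: under the Log-Euclidean metric the matrix logarithm flattens the manifold, so a standard multinomial logistic regression can operate on vectorized logm features. This is our illustration of one special case, not the paper's RMLR framework (which spans five families of power-deformed metrics), and the helper names are hypothetical.

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_features(spd_mats):
    """Map SPD matrices to a flat tangent space via logm and vectorize.
    Under the Log-Euclidean metric this flattening is an isometry,
    so Euclidean MLR on these features is one (simple) SPD MLR."""
    return np.stack([logm(M).real.ravel() for M in spd_mats])

def mlr_logits(features, W, b):
    """Standard multinomial-logistic scores on the flattened features."""
    return features @ W + b

# Usage sketch: 4 random SPD matrices, 3 classes (weights untrained).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5, 5))
spd = A @ A.transpose(0, 2, 1) + 5 * np.eye(5)[None]
X = log_euclidean_features(spd)
W = rng.standard_normal((X.shape[1], 3)); b = np.zeros(3)
print(mlr_logits(X, W, b).shape)  # (4, 3)
```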
- Matrix Manifold Neural Networks++ [18.385670036798707]
We design fully-connected layers for SPD neural networks.
We propose a method for performing backpropagation with the Grassmann logarithmic map in the projector perspective.
arXiv Detail & Related papers (2024-05-29T15:47:35Z)
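The entry above differentiates through the Grassmann logarithmic map in the projector perspective; as a simpler, classical reference point, the sketch below implements the well-known log map in the ONB (orthonormal-basis) perspective. It is an editorial illustration, not the paper's projector-perspective method.

```python
import numpy as np

def grassmann_log(X, Y):
    """Classical Grassmann log map in the ONB perspective:
    X, Y are n x p matrices with orthonormal columns.
    Returns a horizontal tangent vector Delta with exp_X(Delta) = span(Y)."""
    n, p = X.shape
    M = (np.eye(n) - X @ X.T) @ Y @ np.linalg.inv(X.T @ Y)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.arctan(s)) @ Vt

# Usage sketch on Gr(2, 5): the log of a subspace w.r.t. itself is zero.
rng = np.random.default_rng(1)
X, _ = np.linalg.qr(rng.standard_normal((5, 2)))
assert np.allclose(grassmann_log(X, X), 0.0, atol=1e-10)
```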
- A Lie Group Approach to Riemannian Batch Normalization [59.48083303101632]
This paper establishes a unified framework for normalization techniques on Lie groups.
We focus on Symmetric Positive Definite (SPD) matrices, which possess three distinct types of Lie group structures.
Specific normalization layers induced by these Lie groups are then proposed for SPD neural networks (a toy Log-Euclidean variant is sketched below).
arXiv Detail & Related papers (2024-03-17T16:24:07Z)
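As promised above, a toy relative of SPD batch normalization: under the (flat) Log-Euclidean metric one can whiten matrix logarithms like ordinary features and map back with the matrix exponential. This hedged sketch only illustrates the idea of normalizing on the manifold; the paper's construction is derived from Lie group structures and is strictly more general.

```python
import numpy as np
from scipy.linalg import logm, expm

def spd_batchnorm_le(batch, eps=1e-6):
    """Illustrative SPD batch normalization under the Log-Euclidean
    metric: take matrix logs, center and rescale them like ordinary
    features, and map back with expm. (Only the simplest flat-metric
    case; not the paper's Lie-group construction.)"""
    logs = np.stack([logm(M).real for M in batch])
    centered = logs - logs.mean(axis=0)
    var = (centered ** 2).mean()          # scalar variance, for simplicity
    normalized = centered / np.sqrt(var + eps)
    return np.stack([expm(L) for L in normalized])

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 3, 3))
batch = A @ A.transpose(0, 2, 1) + 3 * np.eye(3)[None]
out = spd_batchnorm_le(batch)
print(out.shape)  # (8, 3, 3) -- outputs stay SPD (expm of symmetric)
```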
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
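For a flavor of what a geometric GNN layer computes, the sketch below implements one E(3)-invariant message-passing step in which messages depend on interatomic distances, so the output is unchanged by rotations and translations of the coordinates. It is a generic, untrained toy layer (all names and shapes are ours), not a specific architecture from the survey.

```python
import numpy as np

def invariant_message_passing(h, pos, edges, W_msg):
    """One E(3)-invariant message-passing step on a geometric graph.
    h: (n, d) node features; pos: (n, 3) coordinates;
    edges: list of (i, j) pairs; W_msg: (d + 1, d) weights (untrained).
    Messages use only distances, which are rotation/translation invariant."""
    out = h.copy()
    for i, j in edges:
        dist = np.linalg.norm(pos[i] - pos[j])    # invariant scalar
        msg_in = np.concatenate([h[j], [dist]])
        out[i] += np.tanh(msg_in @ W_msg)         # aggregate messages
    return out

rng = np.random.default_rng(3)
h, pos = rng.standard_normal((4, 8)), rng.standard_normal((4, 3))
edges = [(0, 1), (1, 0), (1, 2), (2, 3)]
out = invariant_message_passing(h, pos, edges, rng.standard_normal((9, 8)))
```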
- Intrinsic Gaussian Vector Fields on Manifolds [40.20536208199638]
We provide primitives needed to deploy the resulting Hodge-Matérn Gaussian vector fields on the two-dimensional sphere and the hypertori.
We show that our Gaussian vector fields constitute considerably more refined inductive biases than the extrinsic fields proposed before (the extrinsic projected construction is sketched below for contrast).
arXiv Detail & Related papers (2023-10-28T21:17:36Z)
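The extrinsic baseline referenced above can be sketched in a few lines: sample an ambient R^3 Gaussian vector at each point of the sphere and project it onto the tangent plane. This toy shows only the simpler alternative the paper compares against, not the intrinsic Hodge-Matérn construction itself.

```python
import numpy as np

def tangent_project(x, v):
    """Project an ambient R^3 vector v onto the tangent plane of the
    unit sphere at x (with ||x|| = 1): v - (v . x) x."""
    return v - (v @ x) * x

# Extrinsic baseline: sample an ambient Gaussian vector and project it
# to the tangent space at each point of the sphere.
rng = np.random.default_rng(4)
pts = rng.standard_normal((10, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
fields = np.stack([tangent_project(x, rng.standard_normal(3)) for x in pts])
assert np.allclose(np.einsum('ij,ij->i', fields, pts), 0.0)  # tangency
```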
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on maps implementing neural networks of practical interest (a finite-difference illustration of the induced pullback metric follows below).
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
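The pullback-metric illustration mentioned above: pulling the Euclidean metric of the output space back through a smooth network map f gives G(x) = J(x)^T J(x), which is rank-deficient (hence "singular") whenever f reduces dimension. The sketch estimates J by central finite differences on a tiny untrained MLP; everything here is a hedged toy, not the paper's formal setup.

```python
import numpy as np

def mlp(x, W1, W2):
    """A tiny smooth map R^4 -> R^2 standing in for a trained network."""
    return np.tanh(x @ W1) @ W2

def pullback_metric(f, x, eps=1e-6):
    """Pull back the Euclidean metric of the output space through f:
    G(x) = J(x)^T J(x), with J estimated by central finite differences.
    When f reduces dimension, G is rank-deficient -- the degenerate
    ('singular') metrics the paper's framework is built to handle."""
    n = x.size
    J = np.stack([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                  for e in np.eye(n)], axis=1)
    return J.T @ J

rng = np.random.default_rng(5)
W1, W2 = rng.standard_normal((4, 8)), rng.standard_normal((8, 2))
G = pullback_metric(lambda x: mlp(x, W1, W2), rng.standard_normal(4))
print(np.linalg.matrix_rank(G))  # at most 2 < 4: a degenerate metric
```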
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
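A minimal sketch of the projected-kernel recipe, specialized to the unit sphere: scale a scalar kernel by the tangent-space projections at its two inputs, so that Gaussian process samples become tangent vector fields. This is our simplified rendering of the general construction (the kernel choice and constants are arbitrary here).

```python
import numpy as np

def projected_kernel(x, y, lengthscale=1.0):
    """Matrix-valued kernel on the unit sphere built by the projected-
    kernel recipe: multiply a scalar kernel by the tangent-space
    projections P_x = I - x x^T at each input, so that GP samples are
    tangent vector fields. (Sketch specialized to S^2 inside R^3.)"""
    k = np.exp(-np.sum((x - y) ** 2) / (2 * lengthscale ** 2))
    P_x = np.eye(3) - np.outer(x, x)
    P_y = np.eye(3) - np.outer(y, y)
    return k * P_x @ P_y

x = np.array([0.0, 0.0, 1.0])
y = np.array([1.0, 0.0, 0.0])
K = projected_kernel(x, y)
assert np.allclose(x @ K, 0.0) and np.allclose(K @ y, 0.0)  # tangency
```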
- A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z)
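For context on orthogonal RNNs, a standard parametrization keeps the recurrent matrix orthogonal by exponentiating a skew-symmetric matrix, so the linear part of the transition preserves the hidden-state norm. The sketch below is this common construction in miniature (untrained, names ours), not the paper's divergence-free-field model, which generalizes beyond it.

```python
import numpy as np
from scipy.linalg import expm

def orthogonal_recurrent_step(h, x, A, W_in):
    """One step of an orthogonal RNN: the recurrent matrix is
    expm(A - A^T), which is always orthogonal, so the linear part of
    the transition preserves the hidden-state norm."""
    W = expm(A - A.T)                 # skew-symmetric -> orthogonal
    return np.tanh(W @ h + W_in @ x)

rng = np.random.default_rng(6)
A, W_in = rng.standard_normal((16, 16)), rng.standard_normal((16, 4))
h = rng.standard_normal(16)
h = orthogonal_recurrent_step(h, rng.standard_normal(4), A, W_in)
W = expm(A - A.T)
assert np.allclose(W.T @ W, np.eye(16))  # orthogonality check
```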
- Nested Grassmannians for Dimensionality Reduction with Applications [7.106986689736826]
We propose a novel framework for constructing a nested sequence of homogeneous Riemannian manifolds.
We focus on applying the proposed framework to the Grassmann manifold, giving rise to the nested Grassmannians (NG).
Specifically, each planar (2D) shape can be represented as a point in the complex projective space, which is a complex Grassmann manifold (this encoding is illustrated below).
With the proposed NG structure, we develop algorithms for the supervised and unsupervised dimensionality reduction problems, respectively.
arXiv Detail & Related papers (2020-10-27T20:09:12Z)
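The planar-shape encoding cited in the last entry is easy to demonstrate: write k landmarks as complex numbers, center to remove translation, and normalize to remove scale; the remaining unit-phase ambiguity is exactly the projective quotient, landing in CP^(k-2). A hedged sketch (Kendall-style shape coordinates, our naming):

```python
import numpy as np

def planar_shape_to_cp(landmarks):
    """Send k planar landmarks (a k x 2 real array) to a representative
    of a point in complex projective space CP^(k-2): encode points as
    complex numbers, remove translation by centering, remove scale by
    normalizing; the leftover unit-modulus phase is the projective
    ambiguity."""
    z = landmarks[:, 0] + 1j * landmarks[:, 1]
    z = z - z.mean()                  # quotient out translation
    return z / np.linalg.norm(z)      # quotient out scale (phase remains)

# Two rotated/scaled copies of a triangle map to the same CP point,
# i.e., they agree up to a unit-modulus phase factor.
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
z1, z2 = planar_shape_to_cp(tri), planar_shape_to_cp(2.0 * tri @ R.T)
phase = z2[0] / z1[0]
assert np.allclose(z2, phase * z1) and np.isclose(abs(phase), 1.0)
```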
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.