Neural Networks on Symmetric Spaces of Noncompact Type
- URL: http://arxiv.org/abs/2601.01097v1
- Date: Sat, 03 Jan 2026 07:26:39 GMT
- Title: Neural Networks on Symmetric Spaces of Noncompact Type
- Authors: Xuan Son Nguyen, Shuo Yang, Aymeric Histace
- Abstract summary: We propose a novel approach for developing neural networks on symmetric spaces of noncompact type. Our approach is validated on challenging benchmarks for image classification, electroencephalogram (EEG) signal classification, image generation, and natural language inference.
- Score: 19.41181017140696
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent works have demonstrated promising performance of neural networks on hyperbolic spaces and symmetric positive definite (SPD) manifolds. These spaces belong to a family of Riemannian manifolds referred to as symmetric spaces of noncompact type. In this paper, we propose a novel approach for developing neural networks on such spaces. Our approach relies on a unified formulation of the distance from a point to a hyperplane on the considered spaces. We show that some existing formulations of the point-to-hyperplane distance can be recovered by our approach under specific settings. Furthermore, we derive a closed-form expression for the point-to-hyperplane distance in higher-rank symmetric spaces of noncompact type equipped with G-invariant Riemannian metrics. The derived distance then serves as a tool to design fully-connected (FC) layers and an attention mechanism for neural networks on the considered spaces.
Our approach is validated on challenging benchmarks for image classification, electroencephalogram (EEG) signal classification, image generation, and natural language inference.
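For concreteness, here is a minimal NumPy sketch of one special case that, according to the abstract, the unified formulation recovers: the point-to-hyperplane distance on the Poincaré ball from Ganea et al. (2018). The function names, curvature convention, and toy values below are our own illustration, not code from the paper.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball with curvature -c."""
    xy, x2, y2 = np.dot(x, y), np.dot(x, x), np.dot(y, y)
    num = (1.0 + 2.0 * c * xy + c * y2) * x + (1.0 - c * x2) * y
    den = 1.0 + 2.0 * c * xy + (c ** 2) * x2 * y2
    return num / den

def poincare_point_to_hyperplane(x, p, a, c=1.0):
    """Distance from x to the hyperbolic hyperplane through p with
    normal direction a (Ganea et al., 2018)."""
    z = mobius_add(-p, x, c)                      # translate p to the origin
    num = 2.0 * np.sqrt(c) * abs(np.dot(z, a))
    den = (1.0 - c * np.dot(z, z)) * np.linalg.norm(a)
    return np.arcsinh(num / den) / np.sqrt(c)

# Toy usage: a point, a hyperplane base point, and a normal direction.
x = np.array([0.3, 0.1])
p = np.array([0.0, 0.2])
a = np.array([1.0, 0.0])
print(poincare_point_to_hyperplane(x, p, a))      # a scalar distance
```

In this reading, a hyperbolic FC layer stacks one such learned hyperplane per output neuron and uses the resulting distances as pre-activation scores; the abstract assigns the analogous role to the closed-form distance it derives for higher-rank symmetric spaces.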
Related papers
- Multivariate Time Series Forecasting with Hybrid Euclidean-SPD Manifold Graph Neural Networks [31.893767537160258]
We propose a graph neural network-based model that captures data geometry within a hybrid Euclidean-Riemannian framework. HSMGNN achieves up to a 13.8 percent improvement over state-of-the-art baselines in forecasting accuracy.
arXiv Detail & Related papers (2025-12-16T02:42:03Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Non-Euclidean Spatial Graph Neural Network [13.569970309961777]
A novel message-passing-based neural network is proposed to combine graph topology and spatial geometry.
We theoretically guarantee that the learned representations are provably invariant to important symmetries such as rotation or translation.
arXiv Detail & Related papers (2023-12-17T20:21:33Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- On-Manifold Projected Gradient Descent [0.0]
This work provides a computable, direct, and mathematically rigorous approximation to the differential geometry of the class manifold for high-dimensional data.
These tools are applied to the setting of neural network image classifiers, where we generate novel, on-manifold data samples.
arXiv Detail & Related papers (2023-08-23T17:50:50Z)
- Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z)
- Fully-Connected Network on Noncompact Symmetric Space and Ridgelet Transform based on Helgason-Fourier Analysis [10.05944106581306]
We present a fully-connected network and its associated ridgelet transform on the noncompact symmetric space.
The ridgelet transform is an analysis operator of a depth-2 continuous network spanned by neurons.
Thanks to the coordinate-free reformulation, the nonlinear activation function is revealed to play the role of a wavelet function.
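For orientation, the Euclidean prototype of this construction (our recap of the standard ridgelet setting, not the paper's Helgason-Fourier version) represents a depth-2 continuous network and its analysis operator as
$S[\gamma](x) = \int_{\mathbb{R}^{m}\times\mathbb{R}} \gamma(a,b)\,\sigma(\langle a, x\rangle - b)\,\mathrm{d}a\,\mathrm{d}b$ and $R[f](a,b) = \int_{\mathbb{R}^{m}} f(x)\,\overline{\psi(\langle a, x\rangle - b)}\,\mathrm{d}x$,
where $\gamma$ is a weight distribution over neurons $(a,b)$, $\sigma$ is the activation, and $\psi$ is a wavelet-like dual of $\sigma$; this is the sense in which the activation acts as a wavelet.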
arXiv Detail & Related papers (2022-03-03T10:45:53Z)
- A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes [78.120734120667]
We build the preimage of a point in the output manifold in the input space.
We focus, for simplicity, on the case of neural network maps from $n$-dimensional to $(n-1)$-dimensional real spaces.
arXiv Detail & Related papers (2021-12-17T11:47:45Z)
- Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design [8.250374560598493]
Hyperbolic neural networks have recently gained popularity due to their ability to represent hierarchical data sets effectively and efficiently.
The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space.
We present a novel fully hyperbolic neural network that uses the concept of projections (embeddings), followed by intrinsic aggregation and a nonlinearity, all within the hyperbolic space.
arXiv Detail & Related papers (2021-12-03T03:20:27Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
- Neural Operator: Graph Kernel Network for Partial Differential Equations [57.90284928158383]
The goal of this work is to generalize neural networks so that they can learn mappings between infinite-dimensional spaces (operators).
We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators.
Experiments confirm that the proposed graph kernel network has the desired properties and shows competitive performance compared to state-of-the-art solvers.
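Concretely, the kernel-integral layer described above takes roughly the form
$v_{t+1}(x) = \sigma\big(W v_t(x) + \int_{D} \kappa_{\phi}(x, y, a(x), a(y))\, v_t(y)\,\mathrm{d}y\big)$,
where $v_t$ is the hidden representation, $W$ a pointwise linear map, $\kappa_{\phi}$ a learned kernel, and $a$ the input coefficient field (our paraphrase of the cited paper; symbol names are ours). On a graph, the integral is approximated by an average over neighboring nodes.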
arXiv Detail & Related papers (2020-03-07T01:56:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences.