Geometry-Preserving Neural Architectures on Manifolds with Boundary
- URL: http://arxiv.org/abs/2602.03082v1
- Date: Tue, 03 Feb 2026 04:09:39 GMT
- Title: Geometry-Preserving Neural Architectures on Manifolds with Boundary
- Authors: Karthik Elamvazhuthi, Shiba Biswal, Kian Rosenblum, Arushi Katyal, Tianli Qu, Grady Ma, Rishi Sonthalia
- Abstract summary: We propose a class of geometry-aware architectures that interleave geometric updates between layers. We establish universal approximation results for constrained neural ODEs. Experiments on dynamics over S^2 and SO(3), and diffusion on S^{d-1}-valued features demonstrate exact feasibility for analytic updates.
- Score: 3.6352834408416412
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Preserving geometric structure is important in learning. We propose a unified class of geometry-aware architectures that interleave geometric updates between layers, where both projection layers and intrinsic exponential map updates arise as discretizations of projected dynamical systems on manifolds (with or without boundary). Within this framework, we establish universal approximation results for constrained neural ODEs. We also analyze architectures that enforce geometry only at the output, proving a separate universal approximation property that enables direct comparison to interleaved designs. When the constraint set is unknown, we learn projections via small-time heat-kernel limits, showing that diffusion/flow-matching can be used as data-based projections. Experiments on dynamics over S^2 and SO(3), and diffusion on S^{d-1}-valued features demonstrate exact feasibility for analytic updates and strong performance for learned projections.
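The following is a minimal sketch (not the authors' code) of the interleaving idea on S^2: an unconstrained layer update in ambient R^3 is followed by a geometric update that returns the state to the manifold, either by metric projection or by an intrinsic exponential-map step with the vector field projected onto the tangent space. The block classes, the two-layer MLP vector field, and the step size are illustrative assumptions, not the paper's architecture.

```python
# Minimal sketch (illustrative assumption, not the authors' code) of a
# projection-interleaved neural ODE step on the unit sphere S^2.
import torch
import torch.nn as nn


class SphereProjectedEulerBlock(nn.Module):
    """One explicit Euler step x <- Proj_{S^2}(x + h * f_theta(x))."""

    def __init__(self, hidden: int = 64, step: float = 0.1):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(3, hidden), nn.Tanh(), nn.Linear(hidden, 3))
        self.step = step

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Unconstrained ambient update (one layer of the neural-ODE discretization).
        y = x + self.step * self.f(x)
        # Geometric update: closest-point projection back onto the unit sphere.
        return y / y.norm(dim=-1, keepdim=True).clamp_min(1e-12)


class SphereExpEulerBlock(SphereProjectedEulerBlock):
    """Intrinsic variant: project the vector field onto the tangent space at x
    and move along the exponential map of S^2 instead of projecting afterwards."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        v = self.step * self.f(x)
        # Tangential component of the update at x (x is assumed unit-norm).
        v_t = v - (v * x).sum(dim=-1, keepdim=True) * x
        theta = v_t.norm(dim=-1, keepdim=True).clamp_min(1e-12)
        # exp_x(v_t) = cos(|v_t|) x + sin(|v_t|) v_t / |v_t| on the unit sphere.
        return torch.cos(theta) * x + torch.sin(theta) * v_t / theta


if __name__ == "__main__":
    x = torch.nn.functional.normalize(torch.randn(8, 3), dim=-1)
    for block in (SphereProjectedEulerBlock(), SphereExpEulerBlock()):
        out = x
        for _ in range(5):  # interleave a geometric update after every layer
            out = block(out)
        print(type(block).__name__, "max deviation from S^2:",
              (out.norm(dim=-1) - 1.0).abs().max().item())
```

Stacking either block amounts to one explicit Euler discretization per layer of a projected dynamical system, so the iterate lies exactly on the sphere after every layer, which is the feasibility property the abstract refers to.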
Related papers
- The Neural Differential Manifold: An Architecture with Explicit Geometric Structure [8.201374511929538]
This paper introduces the Neural Differential Manifold (NDM), a novel neural network architecture that explicitly incorporates geometric structure into its fundamental design. We analyze the theoretical advantages of this approach, including its potential for more efficient optimization, enhanced continual learning, and applications in scientific discovery and controllable generative modeling.
arXiv Detail & Related papers (2025-10-29T02:24:27Z) - Geometry-Aware Spiking Graph Neural Network [24.920334588995072]
We propose a Geometry-Aware Spiking Graph Neural Network (GSG) that unifies spike-based neural dynamics with adaptive representation learning. Experiments on multiple benchmarks show that GSG achieves superior accuracy, robustness, and energy efficiency compared to both Euclidean SNNs and manifold-based GNNs.
arXiv Detail & Related papers (2025-08-09T02:52:38Z) - Learning Latent Graph Geometry via Fixed-Point Schrödinger-Type Activation: A Theoretical Study [1.1745324895296467]
We develop a unified theoretical framework for neural architectures with internal representations evolving as stationary states of dissipative Schrödinger-type dynamics on learned latent graphs. We prove existence, uniqueness, and smooth dependence of equilibria, and show that the dynamics are equivalent under the Bloch map to norm-preserving Landau-Lifshitz flows. The resulting model class provides a compact, geometrically interpretable, and analytically tractable foundation for learning latent graph geometry via fixed-point Schrödinger-type activations.
arXiv Detail & Related papers (2025-07-27T00:35:15Z) - Random Sparse Lifts: Construction, Analysis and Convergence of finite sparse networks [17.487761710665968]
We present a framework to define a large class of neural networks for which, by construction, training by gradient flow provably reaches arbitrarily low loss when the number of parameters grows.
arXiv Detail & Related papers (2025-01-10T12:52:00Z) - Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z) - Geometric Trajectory Diffusion Models [58.853975433383326]
Generative models have shown great promise in generating 3D geometric systems.
Existing approaches only operate on static structures, neglecting the fact that physical systems are always dynamic in nature.
We propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
arXiv Detail & Related papers (2024-10-16T20:36:41Z) - E(n) Equivariant Topological Neural Networks [10.603892843083173]
Graph neural networks excel at modeling pairwise interactions, but they cannot flexibly accommodate higher-order interactions and features. Topological deep learning (TDL) has emerged recently as a promising tool for addressing this issue. This paper introduces E(n)-Equivariant Topological Neural Networks (ETNNs). ETNNs incorporate geometric node features while respecting rotation, reflection, and translation.
arXiv Detail & Related papers (2024-05-24T10:55:38Z) - Equivariant Neural Operator Learning with Graphon Convolution [12.059797539633506]
We propose a general architecture that combines a coefficient learning scheme with a residual operator layer for learning mappings between continuous functions in 3D Euclidean space.
arXiv Detail & Related papers (2023-11-17T23:28:22Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Inter-layer Transition in Neural Architecture Search [89.00449751022771]
The dependency between the architecture weights of connected edges is explicitly modeled in this paper.
Experiments on five benchmarks confirm the value of modeling inter-layer dependency and demonstrate the proposed method outperforms state-of-the-art methods.
arXiv Detail & Related papers (2020-11-30T03:33:52Z) - Mix Dimension in Poincaré Geometry for 3D Skeleton-based Action Recognition [57.98278794950759]
Graph Convolutional Networks (GCNs) have already demonstrated their powerful ability to model irregular data.
We present a novel spatial-temporal GCN architecture defined via Poincaré geometry.
We evaluate our method on two of the largest current 3D datasets.
arXiv Detail & Related papers (2020-07-30T18:23:18Z) - Understanding Graph Neural Networks with Generalized Geometric Scattering Transforms [67.88675386638043]
The scattering transform is a multilayered wavelet-based deep learning architecture that acts as a model of convolutional neural networks.
We introduce windowed and non-windowed geometric scattering transforms for graphs based upon a very general class of asymmetric wavelets.
We show that these asymmetric graph scattering transforms have many of the same theoretical guarantees as their symmetric counterparts.
arXiv Detail & Related papers (2019-11-14T17:23:06Z)
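As a companion to the last entry, here is a minimal sketch of an untrained geometric scattering transform on a graph, built from the standard lazy-random-walk diffusion wavelets. The asymmetric-wavelet generalization analyzed in that paper is not reproduced here; the wavelet scales, moment orders, and toy graph are illustrative assumptions.

```python
# Minimal sketch (standard construction, not the cited paper's code) of untrained
# geometric scattering on a graph: diffusion wavelets from a lazy random walk,
# cascaded with modulus nonlinearities, then aggregated into graph-level moments.
import numpy as np


def lazy_walk(adj: np.ndarray) -> np.ndarray:
    """P = 0.5 * (I + A D^{-1}), the lazy random-walk diffusion operator."""
    deg = adj.sum(axis=0)
    return 0.5 * (np.eye(len(adj)) + adj / np.maximum(deg, 1e-12))


def diffusion_wavelets(P: np.ndarray, num_scales: int):
    """Psi_0 = I - P and Psi_j = P^{2^{j-1}} - P^{2^j} for j = 1..num_scales."""
    powers = [np.linalg.matrix_power(P, 2 ** j) for j in range(num_scales + 1)]
    wavelets = [np.eye(len(P)) - P]
    wavelets += [powers[j - 1] - powers[j] for j in range(1, num_scales + 1)]
    return wavelets


def scattering_moments(adj: np.ndarray, x: np.ndarray, num_scales: int = 3, q_max: int = 2):
    """Zeroth-, first-, and second-order non-windowed scattering coefficients of x."""
    wavelets = diffusion_wavelets(lazy_walk(adj), num_scales)
    feats = [np.sum(np.abs(x) ** q) for q in range(1, q_max + 1)]          # zeroth order
    for Wj in wavelets:                                                    # first order
        u = np.abs(Wj @ x)
        feats += [np.sum(u ** q) for q in range(1, q_max + 1)]
        for Wk in wavelets:                                                # second order (all scale pairs, for simplicity)
            feats += [np.sum(np.abs(Wk @ u) ** q) for q in range(1, q_max + 1)]
    return np.array(feats)


if __name__ == "__main__":
    # Toy 4-cycle graph with a random node signal.
    A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
    print(scattering_moments(A, np.random.default_rng(0).normal(size=4)).shape)
```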