RiemannGL: Riemannian Geometry Changes Graph Deep Learning
- URL: http://arxiv.org/abs/2602.10982v1
- Date: Wed, 11 Feb 2026 16:10:53 GMT
- Title: RiemannGL: Riemannian Geometry Changes Graph Deep Learning
- Authors: Li Sun, Qiqi Wan, Suyang Zhou, Zhenhao Huang, Philip S. Yu
- Abstract summary: Graphs are ubiquitous, and learning on graphs has become a cornerstone in artificial intelligence and data mining communities. This paper argues that Riemannian geometry provides a principled and necessary foundation for graph representation learning. We contend that the central mission of Riemannian graph learning is to endow graph neural networks with intrinsic manifold structures.
- Score: 42.90386246551942
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs are ubiquitous, and learning on graphs has become a cornerstone in artificial intelligence and data mining communities. Unlike pixel grids in images or sequential structures in language, graphs exhibit a typical non-Euclidean structure with complex interactions among the objects. This paper argues that Riemannian geometry provides a principled and necessary foundation for graph representation learning, and that Riemannian graph learning should be viewed as a unifying paradigm rather than a collection of isolated techniques. While recent studies have explored the integration of graph learning and Riemannian geometry, most existing approaches are limited to a narrow class of manifolds, particularly hyperbolic spaces, and often adopt extrinsic manifold formulations. We contend that the central mission of Riemannian graph learning is to endow graph neural networks with intrinsic manifold structures, which remains underexplored. To advance this perspective, we identify key conceptual and methodological gaps in existing approaches and outline a structured research agenda along three dimensions: manifold type, neural architecture, and learning paradigm. We further discuss open challenges, theoretical foundations, and promising directions that are critical for unlocking the full potential of Riemannian graph learning. This paper aims to provide a coherent viewpoint and to stimulate broader exploration of Riemannian geometry as a foundational framework for future graph learning research.
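To make the notion of intrinsic manifold structure concrete, here is a minimal sketch of two standard hyperbolic-space operations (geodesic distance in the Poincaré ball and the exponential map at the origin) that hyperbolic graph learning models build on. These are textbook formulas for the curvature -1 ball, not code from the paper:

```python
import math

def poincare_dist(x, y):
    """Geodesic distance between two points in the Poincare ball
    (curvature -1), given as coordinate lists with Euclidean norm < 1."""
    sq = lambda v: sum(c * c for c in v)
    diff_sq = sq([a - b for a, b in zip(x, y)])
    denom = (1.0 - sq(x)) * (1.0 - sq(y))
    return math.acosh(1.0 + 2.0 * diff_sq / denom)

def exp_map_origin(v):
    """Exponential map at the origin: send a tangent vector v to the
    point of the ball at geodesic distance ||v|| along v's direction."""
    n = math.sqrt(sum(c * c for c in v))
    if n == 0.0:
        return list(v)
    return [math.tanh(n / 2.0) * c / n for c in v]
```

A point mapped from a tangent vector of length 1 lies at hyperbolic distance 1 from the origin even though its Euclidean norm is only tanh(0.5) ≈ 0.46; this exponential growth of volume toward the boundary is what makes hyperbolic space a natural fit for tree-like graphs.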
Related papers
- Multi-Domain Riemannian Graph Gluing for Building Graph Foundation Models [43.64910777659052]
Multi-domain graph pre-training integrates knowledge from diverse domains to enhance performance in the target domains. Existing solutions often fall short of answering a fundamental question: how is knowledge integrated or transferred across domains? We present the GraphGlue framework, which supports batched pre-training with EMA prototyping and provides a transferability measure based on geometric consistency.
arXiv Detail & Related papers (2026-02-28T12:22:19Z) - A Remedy for Over-Squashing in Graph Learning via Forman-Ricci Curvature based Graph-to-Hypergraph Structural Lifting [0.0]
We propose a structural lifting strategy using Forman-Ricci curvature, which defines an edge-based network characteristic. Curvature reveals local and global properties of a graph, such as a network's backbones. Our approach provides a remedy to the problem of information distortion in message passing across long distances and graph bottlenecks.
arXiv Detail & Related papers (2025-08-15T10:46:27Z) - RiemannGFM: Learning a Graph Foundation Model from Riemannian Geometry [19.299795173943476]
Graph neural networks excel at learning from graph data, an omnipresent non-Euclidean structure, but often lack generalization capacity. Recent efforts have been made to leverage Large Language Models. A key innovation is the discovery of a simple yet effective structural vocabulary of trees and cycles.
arXiv Detail & Related papers (2025-02-05T15:06:09Z) - Foundations and Frontiers of Graph Learning Theory [81.39078977407719]
Recent advancements in graph learning have revolutionized the way to understand and analyze data with complex structures.
Graph Neural Networks (GNNs), i.e., neural network architectures designed for learning graph representations, have become a popular paradigm.
This article provides a comprehensive summary of the theoretical foundations and breakthroughs concerning the approximation and learning behaviors intrinsic to prevalent graph learning models.
arXiv Detail & Related papers (2024-07-03T14:07:41Z) - Graph Foundation Models: Concepts, Opportunities and Challenges [66.37994863159861]
Foundation models have emerged as critical components in a variety of artificial intelligence applications. The capabilities of foundation models in generalization and adaptation motivate graph machine learning researchers to discuss the potential of developing a new graph learning paradigm. This article introduces the concept of Graph Foundation Models (GFMs), and offers an exhaustive explanation of their key characteristics and underlying technologies.
arXiv Detail & Related papers (2023-10-18T09:31:21Z) - Contrastive Graph Clustering in Curvature Spaces [74.03252813800334]
We present a novel end-to-end contrastive graph clustering model named CONGREGATE.
To support geometric clustering, we construct a theoretically grounded Heterogeneous Curvature Space.
We then train the graph clusters by an augmentation-free reweighted contrastive approach.
arXiv Detail & Related papers (2023-05-05T14:04:52Z) - Hyperbolic Graph Neural Networks: A Review of Methods and Applications [61.49208407567829]
This survey paper provides a comprehensive review of the rapidly evolving field of Hyperbolic Graph Learning (HGL). We systematically categorize and analyze existing methods, dividing them into (1) hyperbolic graph embedding-based techniques, (2) graph neural network-based hyperbolic models, and (3) emerging paradigms. We extensively discuss diverse applications of HGL across multiple domains, including recommender systems, knowledge graphs, bioinformatics, and other relevant scenarios.
arXiv Detail & Related papers (2022-02-28T15:08:48Z) - A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, and eventually focus on maps implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z) - A Self-supervised Mixed-curvature Graph Neural Network [76.3790248465522]
We present a novel Self-supervised Mixed-curvature Graph Neural Network (SelfMGNN).
We show that SelfMGNN captures the complicated graph structures in reality and outperforms state-of-the-art baselines.
arXiv Detail & Related papers (2021-12-10T08:56:55Z)
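Two of the related papers above use Forman-Ricci curvature as an edge-based network characteristic. For an unweighted graph with no higher-order cells attached, the Forman curvature of an edge reduces to a simple combinatorial formula; the sketch below illustrates that formula and is not taken from either paper's implementation:

```python
def forman_curvature(adj, u, v):
    """Forman-Ricci curvature of edge (u, v) in an unweighted graph
    with no 2-cells: F(e) = 4 - deg(u) - deg(v).
    `adj` maps each node to the list of its neighbors."""
    return 4 - len(adj[u]) - len(adj[v])

# A 4-cycle is 'flat': every edge has curvature 4 - 2 - 2 = 0.
cycle = {0: [3, 1], 1: [0, 2], 2: [1, 3], 3: [2, 0]}

# Edges incident to a hub are negatively curved, e.g. 4 - 4 - 1 = -1 here.
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
```

Highly negative edge curvature flags the structural bottlenecks associated with over-squashing, which is the signal that curvature-based lifting and rewiring methods exploit.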
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.