Multi-Domain Riemannian Graph Gluing for Building Graph Foundation Models
- URL: http://arxiv.org/abs/2603.00618v1
- Date: Sat, 28 Feb 2026 12:22:19 GMT
- Title: Multi-Domain Riemannian Graph Gluing for Building Graph Foundation Models
- Authors: Li Sun, Zhenhao Huang, Silei Chen, Lanxu Yang, Junda Ye, Sen Su, Philip S. Yu,
- Abstract summary: Multi-domain graph pre-training integrates knowledge from diverse domains to enhance performance in the target domains. Existing solutions often fall short of answering a fundamental question: how is knowledge integrated or transferred across domains? We present the GraphGlue framework, which supports batched pre-training with EMA prototyping and provides a transferability measure based on geometric consistency.
- Score: 43.64910777659052
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multi-domain graph pre-training integrates knowledge from diverse domains to enhance performance in the target domains, which is crucial for building graph foundation models. Despite initial success, existing solutions often fall short of answering a fundamental question: how is knowledge integrated or transferred across domains? This theoretical limitation motivates us to rethink the consistency and transferability between model pre-training and domain adaptation. In this paper, we propose a fresh Riemannian geometry perspective, whose core idea is to merge any graph dataset into a unified, smooth Riemannian manifold, enabling a systematic understanding of knowledge integration and transfer. To achieve this, our key contribution is the theoretical establishment of neural manifold gluing, which first characterizes local geometry using an adaptive orthogonal frame and then "glues" the local pieces together into a coherent whole. Building on this theory, we present the GraphGlue framework, which supports batched pre-training with EMA prototyping and provides a transferability measure based on geometric consistency. Extensive experiments demonstrate its superior performance across diverse graph domains. Moreover, we empirically validate GraphGlue's geometric scaling law, showing that larger numbers of datasets improve model transferability by producing a smoother manifold. Code is available at https://github.com/RiemannGraph/GraphGlue.
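The abstract mentions batched pre-training with EMA prototyping. As a minimal illustrative sketch (the function name, momentum value, and update rule here are assumptions, not the paper's actual implementation), a domain prototype can be maintained as an exponential moving average of per-batch mean embeddings:

```python
import numpy as np

def ema_update(prototype, batch_embeddings, momentum=0.99):
    """Update a domain prototype as an exponential moving average
    of the current batch's mean embedding (hypothetical sketch)."""
    batch_mean = batch_embeddings.mean(axis=0)
    return momentum * prototype + (1.0 - momentum) * batch_mean

rng = np.random.default_rng(0)
proto = np.zeros(8)                        # initial 8-dim domain prototype
for _ in range(100):                       # stream of pre-training batches
    batch = rng.normal(loc=1.0, size=(32, 8))  # 32 node embeddings
    proto = ema_update(proto, batch)
# the prototype drifts smoothly toward the domain's mean embedding
```

The EMA keeps the prototype stable across noisy batches, which is what makes batched multi-domain pre-training with shared prototypes feasible.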
Related papers
- G-reasoner: Foundation Models for Unified Reasoning over Graph-structured Knowledge [88.82814893945077]
Large language models (LLMs) excel at complex reasoning but remain limited by static and incomplete parametric knowledge. Recent graph-enhanced RAG (GraphRAG) attempts to bridge this gap by constructing tailored graphs and enabling LLMs to reason on them. G-reasoner is a unified framework that integrates graph and language foundation models for reasoning over diverse graph-structured knowledge.
arXiv Detail & Related papers (2025-09-29T04:38:12Z) - A Remedy for Over-Squashing in Graph Learning via Forman-Ricci Curvature based Graph-to-Hypergraph Structural Lifting [0.0]
We propose a structural lifting strategy using Forman-Ricci curvature, which defines an edge-based network characteristic. Curvature reveals local and global properties of a graph, such as a network's backbones. Our approach provides a remedy to the problem of information distortion in message passing across long distances and graph bottlenecks.
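The summary above describes an edge-based curvature characteristic. In its simplest combinatorial form for an unweighted, undirected graph (ignoring higher-order contributions), the Forman-Ricci curvature of an edge (u, v) is F(u, v) = 4 - deg(u) - deg(v); a minimal sketch, not the paper's full lifting strategy:

```python
# Combinatorial Forman-Ricci curvature of an edge in an unweighted,
# undirected graph: F(u, v) = 4 - deg(u) - deg(v).
def forman_curvature(adj, u, v):
    """Edge curvature from the degrees of the edge's endpoints."""
    return 4 - len(adj[u]) - len(adj[v])

# Toy graph: a triangle (0, 1, 2) with a pendant vertex 3 attached at 0.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1}, 3: {0}}
print(forman_curvature(adj, 0, 1))  # 4 - 3 - 2 = -1
print(forman_curvature(adj, 0, 3))  # 4 - 3 - 1 = 0
```

Edges between high-degree nodes get more negative curvature, which is why curvature can flag a network's backbones and bottlenecks.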
arXiv Detail & Related papers (2025-08-15T10:46:27Z) - Adaptive Riemannian Graph Neural Networks [29.859977834688625]
We introduce a novel framework that learns a continuous and anisotropic metric tensor field over the graph. It allows each node to determine its optimal local geometry, enabling the model to fluidly adapt to the graph's structural landscape. Our method demonstrates superior performance on both homophilic and heterophilic benchmark datasets.
arXiv Detail & Related papers (2025-08-04T16:55:02Z) - RiemannGFM: Learning a Graph Foundation Model from Riemannian Geometry [19.299795173943476]
Graph neural networks excel at learning graph data, the omnipresent non-Euclidean structure, but often lack the generalization capacity. Recent efforts have been made to leverage Large Language Models. Key innovation is the discovery of a simple yet effective structural vocabulary of trees and cycles.
arXiv Detail & Related papers (2025-02-05T15:06:09Z) - Multi-Domain Graph Foundation Models: Robust Knowledge Transfer via Topology Alignment [9.215549756572976]
Real-world graphs are often sparse and prone to noisy connections and adversarial attacks. We propose the Multi-Domain Graph Foundation Model (MDGFM), a unified framework that aligns and leverages cross-domain topological information. By aligning topologies, MDGFM not only improves multi-domain pre-training but also enables robust knowledge transfer to unseen domains.
arXiv Detail & Related papers (2025-02-04T05:09:23Z) - FMGNN: Fused Manifold Graph Neural Network [102.61136611255593]
Graph representation learning has been widely studied and demonstrated effectiveness in various graph tasks.
We propose the Fused Manifold Graph Neural Network (FMGNN), a novel GNN architecture that embeds graphs into different manifolds during training.
Our experiments demonstrate that FMGNN yields superior performance over strong baselines on the benchmarks of node classification and link prediction tasks.
arXiv Detail & Related papers (2023-04-03T15:38:53Z) - Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z) - Hyperbolic Graph Neural Networks: A Review of Methods and Applications [61.49208407567829]
This survey paper provides a comprehensive review of the rapidly evolving field of Hyperbolic Graph Learning (HGL). We systematically categorize and analyze existing methods, dividing them into (1) hyperbolic graph embedding-based techniques, (2) graph neural network-based hyperbolic models, and (3) emerging paradigms. We extensively discuss diverse applications of HGL across multiple domains, including recommender systems, knowledge graphs, bioinformatics, and other relevant scenarios.
arXiv Detail & Related papers (2022-02-28T15:08:48Z) - Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs)
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
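The affine skip connection described above sums the output of a graph convolution with a fully connected (affine) layer applied to the raw node features. A minimal sketch, assuming random placeholder weights and a toy normalized adjacency (not the paper's trained model):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out = 5, 4, 3
X = rng.normal(size=(n, d_in))           # node feature matrix
A = np.eye(n)                            # toy normalized adjacency
W_conv = rng.normal(size=(d_in, d_out))  # graph convolution weights
W_skip = rng.normal(size=(d_in, d_out))  # affine skip weights
b_skip = np.zeros(d_out)                 # affine skip bias

H = A @ X @ W_conv                       # plain graph convolution
H_affine = H + (X @ W_skip + b_skip)     # add the affine skip connection
```

The skip path lets each node retain an affine transform of its own features regardless of what the convolution aggregates from neighbors, which is the building block the summary combines "with any graph convolution operator".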
arXiv Detail & Related papers (2020-04-06T13:25:46Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.