Turning Tabular Foundation Models into Graph Foundation Models
- URL: http://arxiv.org/abs/2508.20906v2
- Date: Tue, 23 Sep 2025 17:49:14 GMT
- Title: Turning Tabular Foundation Models into Graph Foundation Models
- Authors: Dmitry Eremeev, Gleb Bazhenov, Oleg Platonov, Artem Babenko, Liudmila Prokhorenkova
- Abstract summary: We propose G2T-FM, a framework for turning tabular foundation models into graph foundation models. G2T-FM augments the original node features with neighborhood feature aggregation, adds structural embeddings, and then applies a TFM to the constructed node representations. Our model achieves strong results, significantly outperforming publicly available GFMs and performing competitively with, and often better than, well-tuned GNNs trained from scratch.
- Score: 27.47522328312435
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While foundation models have revolutionized such fields as natural language processing and computer vision, their potential in graph machine learning remains largely unexplored. One of the key challenges in designing graph foundation models (GFMs) is handling diverse node features that can vary across different graph datasets. While many works on GFMs have focused exclusively on text-attributed graphs, the problem of handling arbitrary features of other types in GFMs has not been fully addressed. However, this problem is not unique to the graph domain, as it also arises in the field of machine learning for tabular data. In this work, motivated by the recent success of tabular foundation models (TFMs) like TabPFNv2 or LimiX, we propose G2T-FM, a simple framework for turning tabular foundation models into graph foundation models. Specifically, G2T-FM augments the original node features with neighborhood feature aggregation, adds structural embeddings, and then applies a TFM to the constructed node representations. Even in a fully in-context regime, our model achieves strong results, significantly outperforming publicly available GFMs and performing competitively with, and often better than, well-tuned GNNs trained from scratch. Moreover, after finetuning, G2T-FM surpasses well-tuned GNN baselines. In particular, when combined with LimiX, G2T-FM often outperforms the best GNN by a significant margin. In summary, our paper reveals the potential of a previously overlooked direction of utilizing tabular foundation models for graph machine learning tasks.
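The pipeline the abstract describes (aggregate neighbor features, append structural embeddings, then hand the resulting table to a TFM) can be sketched in a few lines. This is a minimal illustration, not the paper's exact design: the random-projection structural embedding is an assumed stand-in for whatever embeddings G2T-FM uses, and logistic regression substitutes for a real TFM such as TabPFNv2.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def build_node_table(X, edges, n_struct=4, seed=0):
    """G2T-FM-style node representations: original features,
    mean-aggregated neighbor features, and simple structural
    embeddings (here: node degree plus a random projection of
    each node's adjacency row)."""
    n = X.shape[0]
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    deg = A.sum(axis=1, keepdims=True)
    # Mean aggregation of neighbor features (zero for isolated nodes)
    agg = A @ X / np.maximum(deg, 1.0)
    # Illustrative structural embedding: degree + random projection
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((n, n_struct)) / np.sqrt(n)
    struct = np.hstack([deg, A @ P])
    return np.hstack([X, agg, struct])

# Toy graph: two chains whose node features come from different distributions
X = np.vstack([np.random.default_rng(1).normal(0, 1, (10, 3)),
               np.random.default_rng(2).normal(3, 1, (10, 3))])
edges = [(i, i + 1) for i in range(9)] + [(i, i + 1) for i in range(10, 19)]
y = np.array([0] * 10 + [1] * 10)

T = build_node_table(X, edges)
# A TFM (e.g., TabPFNv2 or LimiX) would consume T here; logistic
# regression is used only as a lightweight, runnable stand-in.
clf = LogisticRegression(max_iter=1000).fit(T[:15], y[:15])
score = clf.score(T[15:], y[15:])
```

The key design point this mirrors is that all graph structure is flattened into extra columns of an ordinary feature table, so any tabular model can be applied to nodes unchanged.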
Related papers
- Tabular Foundation Models are Strong Graph Anomaly Detectors [18.257503243010436]
Graph anomaly detection (GAD) aims to identify abnormal nodes that deviate from the majority. Existing GAD methods follow a "one model per dataset" paradigm. This calls for a foundation model that enables a "one-for-all" GAD solution.
arXiv Detail & Related papers (2026-01-24T04:19:45Z) - GILT: An LLM-Free, Tuning-Free Graph Foundational Model for In-Context Learning [50.40400074353263]
Graph Neural Networks (GNNs) are powerful tools for processing relational data but often struggle to generalize to unseen graphs. We introduce the Graph In-context Learning Transformer (GILT), a framework built on an LLM-free and tuning-free architecture.
arXiv Detail & Related papers (2025-10-06T08:09:15Z) - GraphPFN: A Prior-Data Fitted Graph Foundation Model [27.47522328312435]
Foundation models pretrained on large-scale datasets have transformed such fields as natural language processing and computer vision. We propose GraphPFN: a prior-data fitted network for node-level prediction. On diverse real-world graph datasets with up to 50,000 nodes, GraphPFN shows strong in-context learning performance and state-of-the-art results after finetuning.
arXiv Detail & Related papers (2025-09-25T19:47:49Z) - Bringing Graphs to the Table: Zero-shot Node Classification via Tabular Foundation Models [13.832068659130705]
We introduce TAG, a graph learning approach that first converts a graph into a table via feature and structural encoders, applies multiple TFMs to diversely subsampled tables, and then aggregates their outputs through ensemble selection. Experiments on 28 real-world datasets demonstrate that TAG consistently improves upon task-specific GNNs and state-of-the-art GFMs.
arXiv Detail & Related papers (2025-09-08T18:48:26Z) - Scalable Graph Generative Modeling via Substructure Sequences [50.32639806800683]
We introduce the Generative Graph Pattern Machine (G2PM), a generative Transformer pre-training framework for graphs. G2PM represents graph instances (nodes, edges, or entire graphs) as sequences of substructures. It employs generative pre-training over the sequences to learn generalizable and transferable representations.
arXiv Detail & Related papers (2025-05-22T02:16:34Z) - GFM-RAG: Graph Foundation Model for Retrieval Augmented Generation [83.72561905487447]
We introduce GFM-RAG, a novel graph foundation model (GFM) for retrieval augmented generation. GFM-RAG is powered by an innovative graph neural network that reasons over graph structure to capture complex query-knowledge relationships. It achieves state-of-the-art performance while maintaining efficiency and alignment with neural scaling laws.
arXiv Detail & Related papers (2025-02-03T07:04:29Z) - GFT: Graph Foundation Model with Transferable Tree Vocabulary [52.17804507458509]
We propose a cross-task, cross-domain graph foundation model named GFT, short for Graph Foundation model with transferable Tree vocabulary.
By treating computation trees as tokens within the transferable vocabulary, GFT improves model generalization and reduces the risk of negative transfer.
The theoretical analyses and extensive experimental studies have demonstrated the transferability of computation trees and shown the effectiveness of GFT across diverse tasks and domains in graph learning.
arXiv Detail & Related papers (2024-11-09T05:14:30Z) - LangGFM: A Large Language Model Alone Can be a Powerful Graph Foundation Model [27.047809869136458]
Graph foundation models (GFMs) have recently gained significant attention.
Current research tends to focus on specific subsets of graph learning tasks.
We propose GFMBench-a systematic and comprehensive benchmark comprising 26 datasets.
We also introduce LangGFM, a novel GFM that relies entirely on large language models.
arXiv Detail & Related papers (2024-10-19T03:27:19Z) - Position: Graph Foundation Models are Already Here [53.737868336014735]
Graph Foundation Models (GFMs) are emerging as a significant research topic in the graph domain.
We propose a novel perspective for GFM development by advocating for a "graph vocabulary".
This perspective can potentially advance the future GFM design in line with the neural scaling laws.
arXiv Detail & Related papers (2024-02-03T17:24:36Z) - Graph Foundation Models: Concepts, Opportunities and Challenges [66.37994863159861]
Foundation models have emerged as critical components in a variety of artificial intelligence applications. The capabilities of foundation models in generalization and adaptation motivate graph machine learning researchers to discuss the potential of developing a new graph learning paradigm. This article introduces the concept of Graph Foundation Models (GFMs) and offers an exhaustive explanation of their key characteristics and underlying technologies.
arXiv Detail & Related papers (2023-10-18T09:31:21Z) - FIMP: Foundation Model-Informed Message Passing for Graph Neural Networks [36.648927429221466]
Foundation Model-Informed Message Passing (FIMP) is a Graph Neural Network (GNN) message-passing framework.
We show that the self-attention layers of foundation models can effectively be repurposed on graphs to perform cross-node attention-based message-passing.
arXiv Detail & Related papers (2022-10-17T23:44:30Z) - Graph Generative Model for Benchmarking Graph Neural Networks [73.11514658000547]
We introduce a novel graph generative model that learns and reproduces the distribution of real-world graphs in a privacy-controlled way.
Our model can successfully generate privacy-controlled, synthetic substitutes of large-scale real-world graphs that can be effectively used to benchmark GNN models.
arXiv Detail & Related papers (2022-07-10T06:42:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.