Bringing Graphs to the Table: Zero-shot Node Classification via Tabular Foundation Models
- URL: http://arxiv.org/abs/2509.07143v2
- Date: Wed, 08 Oct 2025 19:38:17 GMT
- Title: Bringing Graphs to the Table: Zero-shot Node Classification via Tabular Foundation Models
- Authors: Adrian Hayler, Xingyue Huang, İsmail İlkan Ceylan, Michael Bronstein, Ben Finkelshtein
- Abstract summary: We introduce TAG, a graph learning approach that first converts a graph into a table via feature and structural encoders, applies multiple TFMs to diversely subsampled tables, and then aggregates their outputs through ensemble selection. Experiments on 28 real-world datasets demonstrate that TAG consistently improves upon task-specific GNNs and state-of-the-art GFMs.
- Score: 13.832068659130705
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph foundation models (GFMs) have recently emerged as a promising paradigm for achieving broad generalization across various graph data. However, existing GFMs are often trained on datasets that may not fully reflect real-world graphs, limiting their generalization performance. In contrast, tabular foundation models (TFMs) not only excel at classical tabular prediction tasks but have also shown strong applicability in other domains such as time series forecasting, natural language processing, and computer vision. Motivated by this, we take an alternative view to the standard perspective of GFMs and reformulate node classification as a tabular problem. In this reformulation, each node is represented as a row with feature, structure, and label information as columns, enabling TFMs to directly perform zero-shot node classification via in-context learning. In this work, we introduce TAG, a tabular approach for graph learning that first converts a graph into a table via feature and structural encoders, applies multiple TFMs to diversely subsampled tables, and then aggregates their outputs through ensemble selection. Experiments on 28 real-world datasets demonstrate that TAG consistently improves upon task-specific GNNs and state-of-the-art GFMs, highlighting the potential of the tabular reformulation for scalable and generalizable graph learning.
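The node-as-row reformulation described in the abstract can be sketched as follows. The actual feature and structural encoders used by TAG are not specified here, so the degree and mean-neighbor-feature columns below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def graph_to_table(features, edges, labels):
    """Tabularize a graph: one row per node, with feature,
    structure, and label columns (illustrative encoders only)."""
    n, d = features.shape
    adj = np.zeros((n, n), dtype=bool)
    for u, v in edges:
        adj[u, v] = adj[v, u] = True  # treat the graph as undirected

    degree = adj.sum(axis=1, keepdims=True).astype(float)
    # Toy structural encoder: mean of neighbor features
    # (zero vector for isolated nodes).
    neigh_mean = np.zeros_like(features, dtype=float)
    for i in range(n):
        nbrs = np.flatnonzero(adj[i])
        if nbrs.size:
            neigh_mean[i] = features[nbrs].mean(axis=0)

    # Row layout: [own features | degree | neighbor-mean features | label]
    return np.hstack([features.astype(float), degree, neigh_mean,
                      labels.reshape(-1, 1).astype(float)])

# Toy graph: 4 nodes with 2 features each, connected as a path 0-1-2-3
X = np.array([[1., 0.], [0., 1.], [1., 1.], [0., 0.]])
E = [(0, 1), (1, 2), (2, 3)]
y = np.array([0, 1, 1, 0])
table = graph_to_table(X, E, y)
print(table.shape)  # (4, 6): 2 feature + 1 degree + 2 structure + 1 label columns
```

Given such a table, a TFM could consume the rows whose labels are observed as in-context examples and predict the label column for the remaining rows zero-shot, which is the reformulation the abstract relies on.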
Related papers
- Can TabPFN Compete with GNNs for Node Classification via Graph Tabularization? [31.75541440214278]
We introduce TabPFN-GN, which transforms graph data into tabular features by extracting node attributes and structural properties. Our experiments on 12 benchmark datasets reveal that TabPFN-GN achieves competitive performance with GNNs on homophilous graphs and consistently outperforms them on heterophilous graphs.
arXiv Detail & Related papers (2025-12-09T16:51:30Z) - GILT: An LLM-Free, Tuning-Free Graph Foundational Model for In-Context Learning [50.40400074353263]
Graph Neural Networks (GNNs) are powerful tools for processing relational data but often struggle to generalize to unseen graphs. We introduce the Graph In-context Learning Transformer (GILT), a framework built on an LLM-free and tuning-free architecture.
arXiv Detail & Related papers (2025-10-06T08:09:15Z) - Turning Tabular Foundation Models into Graph Foundation Models [27.47522328312435]
We propose a simple graph foundation model that employs TabPFNv2 as a backbone. G2T-FM augments the original node features with neighborhood feature aggregation, adds structural embeddings, and then applies TabPFNv2 to the constructed node representations.
arXiv Detail & Related papers (2025-08-28T15:36:37Z) - BigCharts-R1: Enhanced Chart Reasoning with Visual Reinforcement Finetuning [51.472854950300416]
We propose BigCharts, a dataset creation pipeline that generates visually diverse chart images. Unlike purely synthetic datasets, BigCharts incorporates real-world data, ensuring authenticity and visual diversity. By introducing novel reward signals specifically designed for chart reasoning, our approach enhances model robustness and generalization.
arXiv Detail & Related papers (2025-08-13T13:39:17Z) - Deep Learning for School Dropout Detection: A Comparison of Tabular and Graph-Based Models for Predicting At-Risk Students [0.2029906424353094]
Student dropout is a significant challenge in educational systems worldwide. Graph Neural Networks (GNNs) offer a potential advantage by capturing complex relationships inherent in student data if structured as graphs.
arXiv Detail & Related papers (2025-08-09T01:19:32Z) - From Features to Structure: Task-Aware Graph Construction for Relational and Tabular Learning with GNNs [6.0757501646966965]
We introduce auGraph, a unified framework for task-aware graph augmentation. auGraph enhances base graph structures by selectively promoting attributes into nodes. It preserves the original data schema while injecting task-relevant structural signal.
arXiv Detail & Related papers (2025-06-02T20:42:53Z) - Towards Graph Foundation Models: Learning Generalities Across Graphs via Task-Trees [50.78679002846741]
We propose a novel approach to cross-task generalization in graphs via task-trees. We show that pretraining a graph neural network (GNN) on diverse task-trees with a reconstruction objective induces transferable knowledge. This enables efficient adaptation to downstream tasks with minimal fine-tuning.
arXiv Detail & Related papers (2024-12-21T02:07:43Z) - GFT: Graph Foundation Model with Transferable Tree Vocabulary [52.17804507458509]
We propose a cross-task, cross-domain graph foundation model named GFT, short for Graph Foundation model with transferable Tree vocabulary.
By treating computation trees as tokens within the transferable vocabulary, GFT improves model generalization and reduces the risk of negative transfer.
The theoretical analyses and extensive experimental studies have demonstrated the transferability of computation trees and shown the effectiveness of GFT across diverse tasks and domains in graph learning.
arXiv Detail & Related papers (2024-11-09T05:14:30Z) - LangGFM: A Large Language Model Alone Can be a Powerful Graph Foundation Model [27.047809869136458]
Graph foundation models (GFMs) have recently gained significant attention.
Current research tends to focus on specific subsets of graph learning tasks.
We propose GFMBench-a systematic and comprehensive benchmark comprising 26 datasets.
We also introduce LangGFM, a novel GFM that relies entirely on large language models.
arXiv Detail & Related papers (2024-10-19T03:27:19Z) - Position: Graph Foundation Models are Already Here [53.737868336014735]
Graph Foundation Models (GFMs) are emerging as a significant research topic in the graph domain.
We propose a novel perspective for GFM development by advocating for a "graph vocabulary".
This perspective can potentially advance the future GFM design in line with the neural scaling laws.
arXiv Detail & Related papers (2024-02-03T17:24:36Z) - GraphGLOW: Universal and Generalizable Structure Learning for Graph
Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z) - A Robust Stacking Framework for Training Deep Graph Models with
Multifaceted Node Features [61.92791503017341]
Graph Neural Networks (GNNs) with numerical node features and graph structure as inputs have demonstrated superior performance on various supervised learning tasks with graph data.
The best models for such data types in most standard supervised learning settings with IID (non-graph) data are not easily incorporated into a GNN.
Here we propose a robust stacking framework that fuses graph-aware propagation with arbitrary models intended for IID data.
arXiv Detail & Related papers (2022-06-16T22:46:33Z) - Retrieving Complex Tables with Multi-Granular Graph Representation
Learning [20.72341939868327]
The task of natural language table retrieval seeks to retrieve semantically relevant tables based on natural language queries.
Existing learning systems treat tables as plain text based on the assumption that tables are structured as dataframes.
We propose Graph-based Table Retrieval (GTR), a generalizable NLTR framework with multi-granular graph representation learning.
arXiv Detail & Related papers (2021-05-04T20:19:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.