Modeling Hypergraph Using Large Language Models
- URL: http://arxiv.org/abs/2510.11728v1
- Date: Thu, 09 Oct 2025 04:23:16 GMT
- Title: Modeling Hypergraph Using Large Language Models
- Authors: Bingqiao Gu, Jiale Zeng, Xingqin Qi, Dong Li
- Abstract summary: Hypergraphs are used in higher-order clustering, hypergraph neural networks and computer vision. Yet, compared to traditional pairwise graphs, real hypergraph datasets remain scarce in both scale and diversity. We introduce HyperLLM, a novel LLM-driven hypergraph generator that simulates the formation and evolution of hypergraphs through a multi-agent collaboration.
- Score: 4.58199980642666
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Due to the advantages of hypergraphs in modeling high-order relationships in complex systems, they have been applied to higher-order clustering, hypergraph neural networks and computer vision. These applications rely heavily on access to high-quality, large-scale real-world hypergraph data. Yet, compared to traditional pairwise graphs, real hypergraph datasets remain scarce in both scale and diversity. This shortage significantly limits the development and evaluation of advanced hypergraph learning algorithms. Therefore, how to quickly generate large-scale hypergraphs that conform to the characteristics of real networks is a crucial task that has not received sufficient attention. Motivated by recent advances in large language models (LLMs), particularly their capabilities in semantic reasoning, structured generation, and simulating human behavior, we investigate whether LLMs can facilitate hypergraph generation from a fundamentally new perspective. We introduce HyperLLM, a novel LLM-driven hypergraph generator that simulates the formation and evolution of hypergraphs through a multi-agent collaboration. The framework integrates prompts and structural feedback mechanisms to ensure that the generated hypergraphs reflect key real-world patterns. Extensive experiments across diverse datasets demonstrate that HyperLLM achieves superior fidelity to structural and temporal hypergraph patterns, while requiring minimal statistical priors. Our findings suggest that LLM-based frameworks offer a promising new direction for hypergraph modeling.
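The abstract describes a generate-and-check loop: agents propose hyperedges, and a structural feedback mechanism keeps the growing hypergraph consistent with real-world patterns. The sketch below is only an illustration of that loop shape, not the paper's method: a random proposer stands in for the LLM agents, and a simple degree cap stands in for the paper's structural feedback; all function names are invented for this example.

```python
import random
from collections import Counter

# Illustrative sketch only: a random "agent" stands in for the paper's LLM
# agents, and a degree cap stands in for its structural feedback mechanism.

def agent_propose_hyperedge(nodes, max_size=4):
    """Stand-in for an LLM agent proposing a hyperedge (a set of nodes)."""
    size = random.randint(2, max_size)
    return frozenset(random.sample(nodes, size))

def degree_distribution(hyperedges):
    """Node degree in a hypergraph = number of hyperedges containing the node."""
    degrees = Counter()
    for edge in hyperedges:
        degrees.update(edge)
    return degrees

def generate_hypergraph(n_nodes=20, n_edges=25, max_degree=6, seed=0):
    """Grow a hypergraph edge by edge, rejecting proposals that violate
    a structural constraint (here: no node may exceed max_degree)."""
    random.seed(seed)
    nodes = list(range(n_nodes))
    hyperedges = set()
    while len(hyperedges) < n_edges:
        edge = agent_propose_hyperedge(nodes)
        degrees = degree_distribution(hyperedges)
        # Feedback step: accept only proposals that keep the structure valid.
        if all(degrees[v] < max_degree for v in edge):
            hyperedges.add(edge)
    return hyperedges

edges = generate_hypergraph()
print(len(edges))
print(max(degree_distribution(edges).values()) <= 6)
```

In the actual framework the proposal step is driven by prompted LLM agents and the feedback compares richer structural and temporal statistics against reference patterns; the accept/reject skeleton is the same.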
Related papers
- Community and hyperedge inference in multiple hypergraphs [9.782518418521175]
We show how to utilize interconnections between multiple hypergraphs to synthesize integrated information from multiple higher-order systems. We propose a model based on the block model, which integrates information from multiple hypergraphs to reveal latent high-order structures. Our work provides a practical and flexible tool for analyzing multiple hypergraphs, greatly advancing the understanding of the organization in real-world high-order systems.
arXiv Detail & Related papers (2025-05-08T05:52:41Z) - Enhancing the Utility of Higher-Order Information in Relational Learning [0.9899763598214121]
We evaluate the effectiveness of hypergraph-level and graph-level architectures in relational learning. We propose hypergraph-level encodings based on classical hypergraph characteristics. Our theoretical analysis shows that hypergraph-level encodings provably increase the representational power of message-passing graph neural networks.
arXiv Detail & Related papers (2025-02-13T18:28:17Z) - LLM-Based Multi-Agent Systems are Scalable Graph Generative Models [73.28294528654885]
GraphAgent-Generator (GAG) is a novel simulation-based framework for dynamic, text-attributed social graph generation. GAG simulates the temporal node and edge generation processes for zero-shot social graph generation. The resulting graphs exhibit adherence to seven key macroscopic network properties, achieving an 11% improvement in microscopic graph structure metrics.
arXiv Detail & Related papers (2024-10-13T12:57:08Z) - Hyper-YOLO: When Visual Object Detection Meets Hypergraph Computation [74.65906322148997]
We introduce a new object detection method that integrates hypergraph computations to capture the complex high-order correlations among visual features.
Hyper-YOLO significantly outperforms the advanced YOLOv8-N and YOLOv9-T with 12% and 9% $\text{AP}^{val}$ improvements, respectively.
arXiv Detail & Related papers (2024-08-09T01:21:15Z) - Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - Hypergraph-MLP: Learning on Hypergraphs without Message Passing [41.43504601820411]
Many hypergraph neural networks leverage message passing over hypergraph structures to enhance node representation learning. We propose an alternative approach where we integrate the information about hypergraph structures into training supervision without explicit message passing. Specifically, we introduce Hypergraph-MLP, a novel learning framework for hypergraph-structured data.
arXiv Detail & Related papers (2023-12-15T13:30:04Z) - Sheaf Hypergraph Networks [20.91851106383122]
We introduce cellular sheaves for hypergraphs, a mathematical construction that adds extra structure to the conventional hypergraph. Drawing inspiration from existing Laplacians in the literature, we develop two unique formulations of sheaf hypergraph Laplacians. We employ these sheaf hypergraph Laplacians to design two categories of models: Sheaf Hypergraph Neural Networks and Sheaf Hypergraph Convolutional Networks.
arXiv Detail & Related papers (2023-09-29T10:25:43Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply the contrastive learning approach from images/graphs (we refer to it as HyperGCL) to improve generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.