HYGENE: A Diffusion-based Hypergraph Generation Method
- URL: http://arxiv.org/abs/2408.16457v2
- Date: Mon, 21 Oct 2024 08:47:29 GMT
- Title: HYGENE: A Diffusion-based Hypergraph Generation Method
- Authors: Dorian Gailhard, Enzo Tartaglione, Lirida Naviner, Jhony H. Giraldo
- Abstract summary: We introduce a diffusion-based Hypergraph Generation (HYGENE) method that addresses the challenges of hypergraph generation through a progressive local expansion approach.
Experiments demonstrate the effectiveness of HYGENE, showing its ability to closely mimic a variety of properties in hypergraphs.
- Score: 6.997955138726617
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hypergraphs are powerful mathematical structures that can model complex, high-order relationships in various domains, including social networks, bioinformatics, and recommender systems. However, generating realistic and diverse hypergraphs remains challenging due to their inherent complexity and lack of effective generative models. In this paper, we introduce a diffusion-based Hypergraph Generation (HYGENE) method that addresses these challenges through a progressive local expansion approach. HYGENE works on the bipartite representation of hypergraphs, starting with a single pair of connected nodes and iteratively expanding it to form the target hypergraph. At each step, nodes and hyperedges are added in a localized manner using a denoising diffusion process, which allows for the construction of the global structure before refining local details. Our experiments demonstrated the effectiveness of HYGENE, proving its ability to closely mimic a variety of properties in hypergraphs. To the best of our knowledge, this is the first attempt to employ deep learning models for hypergraph generation, and our work aims to lay the groundwork for future research in this area.
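To make the bipartite view concrete, here is a minimal Python sketch (not the authors' code; names such as BipartiteHypergraph and expand_step are illustrative assumptions) of how a hypergraph can be stored as a node-hyperedge incidence structure and grown coarse-to-fine from a single connected pair, with the learned denoising-diffusion model abstracted as a callback.

```python
# Minimal sketch (not the HYGENE codebase): the bipartite view of a hypergraph.
# Nodes form the left part, hyperedges the right part, and bipartite edges
# encode node-hyperedge incidence.

from dataclasses import dataclass, field


@dataclass
class BipartiteHypergraph:
    num_nodes: int = 0
    num_hyperedges: int = 0
    # incidence contains a pair (v, e) whenever node v belongs to hyperedge e
    incidence: set = field(default_factory=set)

    @classmethod
    def from_hyperedges(cls, hyperedges):
        """Build the bipartite representation from a list of node sets."""
        nodes = sorted({v for e in hyperedges for v in e})
        index = {v: i for i, v in enumerate(nodes)}
        g = cls(num_nodes=len(nodes), num_hyperedges=len(hyperedges))
        for e_id, e in enumerate(hyperedges):
            for v in e:
                g.incidence.add((index[v], e_id))
        return g


def generate(expand_step, num_levels: int) -> BipartiteHypergraph:
    """Coarse-to-fine generation loop.

    Starts from a single connected node/hyperedge pair and repeatedly calls
    `expand_step`, a placeholder for the learned denoising model that locally
    adds nodes and hyperedges and rewires incidences at each level.
    """
    g = BipartiteHypergraph(num_nodes=1, num_hyperedges=1, incidence={(0, 0)})
    for level in range(num_levels):
        g = expand_step(g, level)  # global structure first, local details later
    return g
```

For example, BipartiteHypergraph.from_hyperedges([{0, 1, 2}, {1, 2, 3}]) yields a bipartite graph with 4 node vertices, 2 hyperedge vertices, and 6 incidence edges.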
Related papers
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - Sheaf Hypergraph Networks [10.785525697855498]
We introduce cellular sheaves for hypergraphs, a mathematical construction that adds extra structure to the conventional hypergraph.
Drawing inspiration from existing Laplacians in the literature, we develop two unique formulations of sheaf hypergraph Laplacians.
We employ these sheaf hypergraph Laplacians to design two categories of models: Sheaf Hypergraph Neural Networks and Sheaf Hypergraph Convolutional Networks.
arXiv Detail & Related papers (2023-09-29T10:25:43Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply the contrastive learning approach from images/graphs (referred to as HyperGCL) to improve the generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
arXiv Detail & Related papers (2021-09-22T12:24:02Z) - Learning Multi-Granular Hypergraphs for Video-Based Person Re-Identification [110.52328716130022]
Video-based person re-identification (re-ID) is an important research topic in computer vision.
We propose a novel graph-based framework, namely Multi-Granular Hypergraph (MGH), to achieve better representational capabilities.
MGH achieves 90.0% top-1 accuracy on MARS, outperforming state-of-the-art schemes.
arXiv Detail & Related papers (2021-04-30T11:20:02Z) - Generative hypergraph clustering: from blockmodels to modularity [26.99290024958576]
We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, illustrated by an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
arXiv Detail & Related papers (2021-01-24T00:25:22Z) - HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs (a generic sketch of this two-level pattern appears after this list).
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
arXiv Detail & Related papers (2020-10-09T13:28:06Z)
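As a reference point for the two-level strategy mentioned above, the sketch below shows generic hypergraph message passing in plain NumPy: node features are first aggregated into each hyperedge, then hyperedge messages are aggregated back to the incident nodes. The mean aggregators are stand-ins, not HyperSAGE's actual aggregation functions.

```python
# Illustrative sketch of generic two-level message passing on a hypergraph
# (the pattern HyperSAGE builds on); mean pooling stands in for the paper's
# actual aggregation functions.

import numpy as np


def two_level_message_passing(X, hyperedges):
    """One propagation step: nodes -> hyperedges -> nodes.

    X          : (num_nodes, dim) float array of node features
    hyperedges : list of lists of node indices
    Returns updated node features of the same shape.
    """
    num_nodes, _ = X.shape
    # Level 1: each hyperedge aggregates the features of its member nodes.
    edge_msgs = [X[list(e)].mean(axis=0) for e in hyperedges]

    # Level 2: each node aggregates messages from its incident hyperedges.
    out = np.zeros_like(X)
    counts = np.zeros(num_nodes)
    for msg, e in zip(edge_msgs, hyperedges):
        for v in e:
            out[v] += msg
            counts[v] += 1
    counts = np.maximum(counts, 1)  # isolated nodes keep zero features
    return out / counts[:, None]
```

A learned layer would typically add weight matrices and nonlinearities at both levels; this skeleton only shows the propagation pattern.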