Sheaf Hypergraph Networks
- URL: http://arxiv.org/abs/2309.17116v1
- Date: Fri, 29 Sep 2023 10:25:43 GMT
- Title: Sheaf Hypergraph Networks
- Authors: Iulia Duta, Giulia Cassarà, Fabrizio Silvestri, Pietro Liò
- Abstract summary: We introduce cellular sheaves for hypergraphs, a mathematical construction that adds extra structure to the conventional hypergraph.
Drawing inspiration from existing Laplacians in the literature, we develop two unique formulations of sheaf hypergraph Laplacians.
We employ these sheaf hypergraph Laplacians to design two categories of models: Sheaf Hypergraph Neural Networks and Sheaf Hypergraph Convolutional Networks.
- Score: 10.785525697855498
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Higher-order relations are widespread in nature, with numerous phenomena
involving complex interactions that extend beyond simple pairwise connections.
As a result, advancements in higher-order processing can accelerate the growth
of various fields requiring structured data. Current approaches typically
represent these interactions using hypergraphs. We enhance this representation
by introducing cellular sheaves for hypergraphs, a mathematical construction
that adds extra structure to the conventional hypergraph while maintaining
its local, higher-order connectivity. Drawing inspiration from existing
Laplacians in the literature, we develop two unique formulations of sheaf
hypergraph Laplacians: linear and non-linear. Our theoretical analysis
demonstrates that incorporating sheaves into the hypergraph Laplacian provides
a more expressive inductive bias than standard hypergraph diffusion, creating a
powerful instrument for effectively modelling complex data structures. We
employ these sheaf hypergraph Laplacians to design two categories of models:
Sheaf Hypergraph Neural Networks and Sheaf Hypergraph Convolutional Networks.
These models generalize classical Hypergraph Networks often found in the
literature. Through extensive experimentation, we show that this generalization
significantly improves performance, achieving top results on multiple benchmark
datasets for hypergraph node classification.
Related papers
- SPHINX: Structural Prediction using Hypergraph Inference Network [19.853413818941608]
We introduce Structural Prediction using Hypergraph Inference Network (SPHINX), a model that learns to infer a latent hypergraph structure in an unsupervised way.
We show that recent advances in k-subset sampling provide a suitable tool for producing discrete hypergraph structures (a sketch of such a sampling step appears after this list).
The resulting model can generate the higher-order structure necessary for any modern hypergraph neural network.
arXiv Detail & Related papers (2024-10-04T07:49:57Z) - HYGENE: A Diffusion-based Hypergraph Generation Method [6.997955138726617]
We introduce a diffusion-based Hypergraph Generation (HYGENE) method that addresses the challenges of hypergraph generation through a progressive local expansion approach.
Experiments demonstrate the effectiveness of HYGENE, showing its ability to closely mimic a variety of hypergraph properties.
arXiv Detail & Related papers (2024-08-29T11:45:01Z) - Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings (see the minimal energy-descent sketch after this list).
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply the contrastive learning approach from images/graphs (we refer to it as HyperGCL) to improve the generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z) - Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operator.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z) - Message Passing Neural Networks for Hypergraphs [6.999112784624749]
We present the first message-passing-based graph neural network capable of processing hypergraph-structured data.
We show that the proposed model defines a design space for neural network models for hypergraphs, thus generalizing existing models for hypergraphs (a minimal two-phase message-passing layer is sketched after this list).
arXiv Detail & Related papers (2022-03-31T12:38:22Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both edge-dependent vertex weight (EDVW) and edge-independent vertex weight (EIVW) hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs.
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
arXiv Detail & Related papers (2020-10-09T13:28:06Z)
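A sketch of the k-subset sampling step referenced in the SPHINX entry above: one common way to make "pick k nodes to form a hyperedge" differentiable is a Gumbel-perturbed successive-softmax relaxation. The function below is a generic illustration under that assumption, not SPHINX's actual sampling procedure, and all names are invented for the example.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def relaxed_k_subset(logits, k, tau=0.5, rng=None):
    """Draw a relaxed (soft) k-hot membership vector over candidate nodes.

    Gumbel noise makes the draw stochastic; k rounds of a damped softmax each
    spread roughly one unit of mass, giving a differentiable surrogate for
    sampling a k-subset of nodes as one latent hyperedge.
    """
    rng = rng or np.random.default_rng()
    alpha = (logits + rng.gumbel(size=logits.shape)) / tau
    khot = np.zeros_like(alpha)
    for _ in range(k):
        p = softmax(alpha)
        khot += p                                        # accumulate soft membership
        alpha = alpha + np.log1p(-p.clip(max=1 - 1e-6))  # damp nodes that were already picked
    hard = np.zeros_like(khot)
    hard[np.argsort(-khot)[:k]] = 1.0                    # discrete hyperedge: top-k nodes
    return khot, hard

# Toy usage: membership logits for 6 candidate nodes, one latent hyperedge of size 3.
rng = np.random.default_rng(0)
soft_membership, hyperedge = relaxed_k_subset(rng.normal(size=6), k=3, rng=rng)
```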
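For the hypergraph energy-function entry, the "minimizers as node embeddings" idea can be illustrated with a deliberately simple quadratic energy minimized by gradient descent. The energy, its hand-derived gradient, and the hyperparameters below are illustrative assumptions; the paper's parameterized energy family and its bilevel training are considerably more general.

```python
import numpy as np

def hypergraph_energy_descent(x0, hyperedges, lam=0.5, lr=0.05, steps=200):
    """Minimise a simple hypergraph-regularised energy by gradient descent:

        E(X) = ||X - X0||_F^2 + lam * sum_e sum_{u<v in e} ||x_u - x_v||^2

    The approximate minimiser plays the role of node embeddings: it stays
    close to the input features X0 while being smooth within each hyperedge.
    """
    x = x0.copy()
    for _ in range(steps):
        grad = 2.0 * (x - x0)                                 # gradient of the data-fitting term
        for e in hyperedges:
            for v in e:
                for u in e:
                    if u != v:
                        grad[v] += 2.0 * lam * (x[v] - x[u])  # gradient of the smoothness term
        x -= lr * grad
    return x

# Toy usage: raw features X0 become hyperedge-smoothed embeddings after descent.
rng = np.random.default_rng(1)
x0 = rng.normal(size=(5, 3))
hyperedges = [[0, 1, 2], [2, 3, 4]]
embeddings = hypergraph_energy_descent(x0, hyperedges)
```

Unrolling a few such descent steps inside a network and training the energy's parameters end to end is one way to read the bilevel optimization mentioned in that entry.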
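The message-passing entry describes a design space for hypergraph networks; a minimal instance is the two-phase layer below, which first aggregates node features into each hyperedge and then aggregates hyperedge features back into each node. The mean aggregation, ReLU non-linearity, and weight shapes are illustrative choices, not the authors' exact parameterization.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def hypergraph_mp_layer(x, hyperedges, W_edge, W_node):
    """One two-phase message-passing layer on a hypergraph:
    1) node -> hyperedge: each hyperedge averages and transforms its nodes' features,
    2) hyperedge -> node: each node averages and transforms its incident hyperedges' features.
    """
    n = x.shape[0]
    # Phase 1: node -> hyperedge
    edge_feats = np.stack([relu(x[e].mean(axis=0) @ W_edge) for e in hyperedges])
    # Phase 2: hyperedge -> node
    out = np.zeros((n, W_edge.shape[1]))
    counts = np.zeros(n)
    for e_idx, e in enumerate(hyperedges):
        for v in e:
            out[v] += edge_feats[e_idx]
            counts[v] += 1
    return relu((out / np.maximum(counts, 1)[:, None]) @ W_node)

# Toy usage with random weights: 5 nodes, 4 input features, 8 hidden units.
rng = np.random.default_rng(2)
x = rng.normal(size=(5, 4))
hyperedges = [[0, 1, 2], [2, 3, 4], [0, 4]]
W_edge = rng.normal(size=(4, 8))
W_node = rng.normal(size=(8, 8))
h = hypergraph_mp_layer(x, hyperedges, W_edge, W_node)
```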