Self-Supervised Pretraining for Heterogeneous Hypergraph Neural Networks
- URL: http://arxiv.org/abs/2311.11368v1
- Date: Sun, 19 Nov 2023 16:34:56 GMT
- Title: Self-Supervised Pretraining for Heterogeneous Hypergraph Neural Networks
- Authors: Abdalgader Abubaker, Takanori Maehara, Madhav Nimishakavi, Vassilis
Plachouras
- Abstract summary: We present a novel self-supervised pretraining framework for heterogeneous HyperGNNs.
Our method is able to effectively capture higher-order relations among entities in the data in a self-supervised manner.
Our experiments show that our proposed framework consistently outperforms state-of-the-art baselines in various downstream tasks.
- Score: 9.987252149421982
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, pretraining methods for Graph Neural Networks (GNNs) have been
successful at learning effective representations from unlabeled graph data.
However, most of these methods rely on pairwise relations in the graph and do
not capture the underlying higher-order relations between entities. Hypergraphs
are versatile and expressive structures that can effectively model higher-order
relationships among entities in the data. Despite the efforts to adapt GNNs to
hypergraphs (HyperGNN), there are currently no fully self-supervised
pretraining methods for HyperGNN on heterogeneous hypergraphs. In this paper,
we present SPHH, a novel self-supervised pretraining framework for
heterogeneous HyperGNNs. Our method is able to effectively capture higher-order
relations among entities in the data in a self-supervised manner. SPHH
consists of two self-supervised pretraining tasks that aim to simultaneously
learn both local and global representations of the entities in the hypergraph
by using informative representations derived from the hypergraph structure.
Overall, our work presents a significant advancement in the field of
self-supervised pretraining of HyperGNNs, and has the potential to improve the
performance of various graph-based downstream tasks, such as node classification
and link prediction, once they are mapped to a hypergraph configuration. Our
experiments on two real-world benchmarks using four different HyperGNN models
show that our proposed SPHH framework consistently outperforms state-of-the-art
baselines in various downstream tasks. The results demonstrate that SPHH
improves the performance of different HyperGNN models across downstream tasks,
regardless of their architecture or complexity, which highlights the robustness
of our framework.
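The abstract does not spell out the two pretraining tasks, but the overall shape of a combined local-plus-global self-supervised objective can be sketched as follows. Every function here (the context-reconstruction loss, the margin-style contrast, the mixing weight `alpha`) is a generic, hypothetical stand-in, not the actual SPHH tasks:

```python
# Generic two-task self-supervised objective over a hypergraph: a local
# loss (reconstruct a node from its hyperedge context) plus a global loss
# (score true hyperedge membership against a corrupted membership).
# Purely illustrative stand-ins for SPHH's unspecified tasks.
import random

def mean(vectors):
    n, d = len(vectors), len(vectors[0])
    return [sum(v[i] for v in vectors) / n for i in range(d)]

def sqdist(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b))

def local_loss(x, hyperedges):
    # each node should be reconstructable from its hyperedge neighbours
    total = 0.0
    for e in hyperedges:
        for v in e:
            ctx = [x[u] for u in e if u != v]
            if ctx:
                total += sqdist(x[v], mean(ctx))
    return total

def global_loss(x, hyperedges, nodes, rng):
    # a member node should sit closer to its hyperedge's centroid than a
    # randomly drawn (corrupted) node does: a simple margin contrast
    total = 0.0
    for e in hyperedges:
        centre = mean([x[v] for v in e])
        for v in e:
            fake = rng.choice(nodes)
            total += max(0.0, 1.0 + sqdist(x[v], centre) - sqdist(x[fake], centre))
    return total

def pretraining_loss(x, hyperedges, alpha=0.5, seed=0):
    nodes = list(x)
    rng = random.Random(seed)
    return alpha * local_loss(x, hyperedges) + (1 - alpha) * global_loss(x, hyperedges, nodes, rng)

x = {0: [0.0, 0.0], 1: [1.0, 0.0], 2: [0.0, 1.0]}
edges = [[0, 1], [1, 2]]
loss = pretraining_loss(x, edges)
```

In practice the embedding table `x` would be the output of a HyperGNN encoder and the combined loss would be minimized over its parameters; the plain dictionary here merely stands in for those outputs.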
Related papers
- Federated Hypergraph Learning with Hyperedge Completion [6.295242666794106]
Hypergraph neural networks enhance conventional graph neural networks by capturing high-order relationships among nodes.
We propose FedHGN, a novel algorithm for federated hypergraph learning.
Our algorithm utilizes subgraphs of a hypergraph stored on distributed devices to train local HGNN models in a federated manner.
arXiv Detail & Related papers (2024-08-09T16:31:41Z)
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
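As a toy illustration of the general idea (not the paper's parameterized energy family), node embeddings can be obtained by running gradient descent on a simple hypergraph-regularized energy E(X) = ||X - F||^2 + lam * sum_e sum_{{u,v} in e} ||x_u - x_v||^2, whose minimizer balances fidelity to the input features F against smoothness within each hyperedge:

```python
# Toy hypergraph-regularized energy minimization: the minimizer of
#   E(X) = ||X - F||^2 + lam * sum over hyperedges e, pairs {u, v} in e
#                              of ||x_u - x_v||^2
# serves as a node embedding. Plain gradient descent; an illustrative
# instance of the idea, not the paper's expressive energy family.
def energy_embeddings(F, hyperedges, lam=0.5, lr=0.05, steps=200):
    n, d = len(F), len(F[0])
    X = [row[:] for row in F]  # initialize at the input features
    for _ in range(steps):
        # gradient of the fidelity term ||X - F||^2
        grad = [[2.0 * (X[i][k] - F[i][k]) for k in range(d)] for i in range(n)]
        # gradient of the pairwise smoothness term within each hyperedge
        for e in hyperedges:
            for u in e:
                for v in e:
                    if u != v:
                        for k in range(d):
                            grad[u][k] += 2.0 * lam * (X[u][k] - X[v][k])
        for i in range(n):
            for k in range(d):
                X[i][k] -= lr * grad[i][k]
    return X

# Two nodes with features 0 and 1 sharing one hyperedge are pulled toward
# each other while staying anchored to their features.
X = energy_embeddings([[0.0], [1.0]], [(0, 1)], lam=1.0)
```

With lam=1.0 this quadratic energy has the closed-form minimizer (1/3, 2/3) for the two-node example, which the descent loop recovers to high precision.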
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operator.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Learnable Hypergraph Laplacian for Hypergraph Learning [34.28748027233654]
HyperGraph Convolutional Neural Networks (HGCNNs) have demonstrated their potential in modeling high-order relations preserved in graph structured data.
We propose the first learning-based method tailored for constructing an adaptive hypergraph structure, termed HypERgrAph Laplacian aDaptor (HERALD).
HERALD adaptively optimizes the adjacency relationship between hypernodes and hyperedges in an end-to-end manner, so that a task-aware hypergraph is learned.
arXiv Detail & Related papers (2021-06-12T02:07:07Z)
- UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks [8.777765815864367]
Hypergraph, an expressive structure with flexibility to model the higher-order correlations among entities, has recently attracted increasing attention from various research domains.
We propose UniGNN, a unified framework for interpreting the message passing process in graph and hypergraph neural networks.
Experiments have been conducted to demonstrate the effectiveness of UniGNN on multiple real-world datasets.
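The two-stage message-passing pattern that UniGNN unifies (aggregate node features into each hyperedge, then aggregate incident hyperedge messages back into each node) can be sketched in a few lines; mean aggregation here is an illustrative choice, since UniGNN covers a family of aggregators:

```python
# Minimal sketch of two-stage hypergraph message passing:
#   stage 1: each hyperedge message = aggregate of its member nodes
#   stage 2: each node update = aggregate of its incident hyperedge messages
# Mean aggregation throughout; an illustrative instance, not the
# authors' implementation.
def mean(vectors):
    n, d = len(vectors), len(vectors[0])
    return [sum(v[i] for v in vectors) / n for i in range(d)]

def hypergraph_layer(x, hyperedges):
    """x: dict node -> feature vector; hyperedges: list of node sets."""
    # Stage 1: node -> hyperedge aggregation
    h = [mean([x[v] for v in e]) for e in hyperedges]
    # Stage 2: hyperedge -> node aggregation
    out = {}
    for v, feat in x.items():
        msgs = [h[i] for i, e in enumerate(hyperedges) if v in e]
        out[v] = mean(msgs) if msgs else feat  # isolated nodes keep features
    return out

x = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
edges = [{0, 1}, {1, 2}]
y = hypergraph_layer(x, edges)
```

Stacking such layers (with learnable transforms between the two stages) recovers graph message passing as the special case where every hyperedge has exactly two members.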
arXiv Detail & Related papers (2021-05-03T15:48:34Z)
- Residual Enhanced Multi-Hypergraph Neural Network [26.42547421121713]
HyperGraph Neural Network (HGNN) is the de facto method for hypergraph representation learning.
We propose the Residual enhanced Multi-Hypergraph Neural Network, which can fuse multi-modal information from each hypergraph effectively.
arXiv Detail & Related papers (2021-05-02T14:53:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.