Hypergraph Pre-training with Graph Neural Networks
- URL: http://arxiv.org/abs/2105.10862v1
- Date: Sun, 23 May 2021 06:33:57 GMT
- Title: Hypergraph Pre-training with Graph Neural Networks
- Authors: Boxin Du, Changhe Yuan, Robert Barton, Tal Neiman, Hanghang Tong
- Abstract summary: This paper presents an end-to-end, bi-level pre-training strategy with Graph Neural Networks for hypergraphs.
The proposed framework named HyperGene bears three distinctive advantages.
It can ingest labeling information when available but, more importantly, it is designed primarily in a self-supervised fashion, which significantly broadens its applicability.
- Score: 30.768860573214102
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the prevalence of hypergraphs in a variety of high-impact
applications, there are relatively few works on hypergraph representation
learning, most of which primarily focus on hyperlink prediction, often
restricted to the transductive learning setting. Among others, a major hurdle
for effective hypergraph representation learning lies in the label scarcity of
nodes and/or hyperedges. To address this issue, this paper presents an
end-to-end, bi-level pre-training strategy with Graph Neural Networks for
hypergraphs. The proposed framework named HyperGene bears three distinctive
advantages. First, it is capable of ingesting labeling information when
available but, more importantly, it is designed primarily in a self-supervised
fashion, which significantly broadens its applicability. Second, at the heart of
the proposed HyperGene are two carefully designed pretexts, one on the node
level and the other on the hyperedge level, which enable us to encode both the
local and the global context in a mutually complementary way. Third, the
proposed framework can work in both transductive and inductive settings. When
applying the two proposed pretexts in tandem, it can accelerate the adaptation
of the knowledge from the pre-trained model to downstream applications in the
transductive setting, thanks to the bi-level nature of the proposed method. The
extensive experimental results demonstrate that: (1) HyperGene achieves up to
5.69% improvements in hyperedge classification, and (2) improves pre-training
efficiency by up to 42.80% on average.
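The abstract describes the bi-level design but not the exact pretext objectives. As a minimal sketch of the general idea — two self-supervised tasks, one at the node level and one at the hyperedge level, optimized jointly over shared parameters — the following toy example uses hypothetical objectives (node-hyperedge membership prediction and hyperedge-overlap prediction) on a small made-up incidence matrix; none of these specific choices are taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hypergraph: incidence matrix H (nodes x hyperedges), H[v, e] = 1
# iff node v belongs to hyperedge e. Values are illustrative only.
H = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 0, 1],
              [0, 0, 1]], dtype=float)
n_nodes, n_edges = H.shape
dim = 4  # embedding dimension

def bilevel_loss(W):
    """Sum of two hypothetical pretext losses over shared node embeddings W."""
    Z_v = W                                    # node embeddings
    Z_e = (H.T @ W) / H.sum(axis=0)[:, None]   # hyperedge = mean of its members

    # Node-level pretext: predict node-hyperedge membership from dot products.
    p = 1.0 / (1.0 + np.exp(-(Z_v @ Z_e.T)))
    l_node = -np.mean(H * np.log(p + 1e-9) + (1 - H) * np.log(1 - p + 1e-9))

    # Hyperedge-level pretext: predict whether two hyperedges share a node.
    overlap = ((H.T @ H) > 0).astype(float)
    q = 1.0 / (1.0 + np.exp(-(Z_e @ Z_e.T)))
    l_edge = -np.mean(overlap * np.log(q + 1e-9)
                      + (1 - overlap) * np.log(1 - q + 1e-9))
    return l_node + l_edge

def num_grad(f, W, eps=1e-5):
    """Central-difference gradient (fine at toy scale; a GNN would use autodiff)."""
    g = np.zeros_like(W)
    for i in range(W.size):
        step = np.zeros(W.size)
        step[i] = eps
        step = step.reshape(W.shape)
        g.flat[i] = (f(W + step) - f(W - step)) / (2 * eps)
    return g

W = rng.normal(scale=0.1, size=(n_nodes, dim))
loss_before = bilevel_loss(W)
for _ in range(100):  # joint "pre-training" on both pretexts
    W -= 0.5 * num_grad(bilevel_loss, W)
loss_after = bilevel_loss(W)
print(f"pretext loss: {loss_before:.3f} -> {loss_after:.3f}")
```

The point of the sketch is only the structure: both pretexts share the same embedding parameters, so minimizing their sum encodes node-level (local) and hyperedge-level (global) context at once.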
Related papers
- HYGENE: A Diffusion-based Hypergraph Generation Method [6.997955138726617]
We introduce a diffusion-based Hypergraph Generation (HYGENE) method that addresses challenges through a progressive local expansion approach.
Experiments demonstrated the effectiveness of HYGENE, proving its ability to closely mimic a variety of properties in hypergraphs.
arXiv Detail & Related papers (2024-08-29T11:45:01Z)
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT)
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Self-Supervised Pretraining for Heterogeneous Hypergraph Neural Networks [9.987252149421982]
We present a novel self-supervised pretraining framework for heterogeneous HyperGNNs.
Our method effectively captures higher-order relations among entities in the data in a self-supervised manner.
Our experiments show that our proposed framework consistently outperforms state-of-the-art baselines in various downstream tasks.
arXiv Detail & Related papers (2023-11-19T16:34:56Z)
- Enhancing Hyperedge Prediction with Context-Aware Self-Supervised Learning [64.46188414653204]
We propose a novel hyperedge prediction framework (CASH).
CASH employs (1) context-aware node aggregation to capture complex relations among the nodes in each hyperedge, addressing (C1), and (2) self-supervised contrastive learning in the context of hyperedge prediction to enhance hypergraph representations, addressing (C2).
Experiments on six real-world hypergraphs reveal that CASH consistently outperforms all competing methods in terms of the accuracy in hyperedge prediction.
arXiv Detail & Related papers (2023-09-11T20:06:00Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
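The core idea of this blurb — define a hypergraph-regularized energy and use its minimizers as node embeddings — can be sketched with a deliberately simple energy: a feature-fit term plus a penalty on within-hyperedge variance. The paper's actual family of energies is richer and parameterized; this is only an illustration of the mechanism.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy incidence matrix (nodes x hyperedges) and input node features.
H = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 0, 1],
              [0, 0, 1]], dtype=float)
X = rng.normal(size=(H.shape[0], 2))
lam = 1.0  # regularization strength (hypothetical)

def energy(Z):
    """Fit to input features + within-hyperedge variance penalty."""
    fit = np.sum((Z - X) ** 2)
    smooth = sum(np.sum((Z[H[:, e] > 0] - Z[H[:, e] > 0].mean(axis=0)) ** 2)
                 for e in range(H.shape[1]))
    return fit + lam * smooth

def grad(Z):
    # Gradient of the centered quadratic per hyperedge is 2 * (Z_v - mean),
    # since the deviations from the mean sum to zero within each hyperedge.
    g = 2.0 * (Z - X)
    for e in range(H.shape[1]):
        idx = H[:, e] > 0
        g[idx] += lam * 2.0 * (Z[idx] - Z[idx].mean(axis=0))
    return g

# The (approximate) minimizer of the energy serves as the node embedding.
Z = X.copy()
for _ in range(300):
    Z -= 0.05 * grad(Z)
print(f"energy: {energy(X):.3f} -> {energy(Z):.3f}")
```

Unrolling such a descent loop and parameterizing the energy is what connects this viewpoint to message-passing GNN layers.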
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- DualHGNN: A Dual Hypergraph Neural Network for Semi-Supervised Node Classification based on Multi-View Learning and Density Awareness [3.698434507617248]
Graph-based semi-supervised node classification has become a state-of-the-art approach in many applications of high research value and significance.
This paper proposes the Dual Hypergraph Neural Network (DualHGNN), a new dual connection model integrating both hypergraph structure learning and hypergraph representation learning simultaneously in a unified architecture.
arXiv Detail & Related papers (2023-06-07T07:40:04Z)
- Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply the contrastive learning approach used for images and graphs to improve the generalizability of hypergraph neural networks; we refer to this approach as HyperGCL.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
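As a rough illustration of contrastive learning over augmented hypergraph views: the augmentation below is simple membership dropout and the encoder a fixed linear map, both made up for this sketch — the paper fabricates richer hyperedge-aware augmentations and additionally learns a generative one end-to-end.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy incidence matrix (nodes x hyperedges).
H = (rng.random((8, 5)) < 0.4).astype(float)

def augment(H, drop=0.2):
    """One simple made-up augmentation: randomly drop node-hyperedge memberships."""
    return H * (rng.random(H.shape) > drop)

W = rng.normal(size=(H.shape[1], 4))  # stand-in encoder weights

def encode(Hv):
    Z = Hv @ W
    return Z / (np.linalg.norm(Z, axis=1, keepdims=True) + 1e-9)

# InfoNCE-style objective: each node should match itself across the two views.
Z1, Z2 = encode(augment(H)), encode(augment(H))
tau = 0.5  # temperature (hypothetical value)
logits = (Z1 @ Z2.T) / tau
log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_p))
print(f"contrastive loss: {loss:.3f}")
```

Minimizing this loss with respect to the encoder parameters pulls the two views of each node together while pushing apart views of different nodes.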
arXiv Detail & Related papers (2022-10-07T20:12:20Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle hypergraphs with edge-dependent (EDVW) and edge-independent (EIVW) vertex weights.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
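The mechanism described — an attention weight tied to each node's degree centrality — might look roughly like the following. The scalar parameters would be learned in the real framework; here they are fixed, and the aggregation scheme is a guess for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy incidence matrix (nodes x hyperedges) and node features.
H = (rng.random((6, 4)) < 0.5).astype(float)
H[H.sum(axis=1) == 0, 0] = 1.0  # ensure every node is in at least one hyperedge
X = rng.normal(size=(6, 3))

deg = H.sum(axis=1)              # node degree centrality
a, b = 0.5, 0.0                  # would be learned; fixed for illustration
alpha = 1.0 / (1.0 + np.exp(-(a * deg + b)))  # per-node attention weight

# Degree-attended node -> hyperedge aggregation.
weights = H * alpha[:, None]
E = (weights.T @ X) / (weights.sum(axis=0)[:, None] + 1e-9)
print(E.shape)  # (4, 3)
```

Making the aggregation weights a learned function of degree centrality is what lets the model adapt its message passing to the data and task.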
arXiv Detail & Related papers (2021-09-22T12:24:02Z)
- HNHN: Hypergraph Networks with Hyperedge Neurons [90.15253035487314]
HNHN is a hypergraph convolution network with nonlinear activation functions applied to both hypernodes and hyperedges.
We demonstrate improved performance of HNHN in both classification accuracy and speed on real-world datasets when compared to state-of-the-art methods.
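The one-line description — nonlinear activations applied to both hypernode and hyperedge representations — corresponds to a two-stage layer along these lines. The normalizations and weight shapes here are simplified guesses, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(4)

H = np.array([[1, 1, 0],        # incidence matrix: nodes x hyperedges
              [1, 0, 1],
              [0, 1, 1],
              [0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))     # input node features

relu = lambda t: np.maximum(t, 0.0)
W_e = rng.normal(scale=0.5, size=(3, 3))  # hyperedge transform
W_v = rng.normal(scale=0.5, size=(3, 3))  # node transform

# 1) Hyperedge update: aggregate member nodes, transform, apply nonlinearity.
E = relu(((H.T @ X) / H.sum(axis=0)[:, None]) @ W_e)
# 2) Node update: aggregate incident hyperedges, transform, apply nonlinearity.
X_out = relu(((H @ E) / H.sum(axis=1)[:, None]) @ W_v)
print(X_out.shape)  # (4, 3)
```

Applying a nonlinearity at both stages is what distinguishes this from schemes where the hyperedge step is a purely linear aggregation.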
arXiv Detail & Related papers (2020-06-22T14:08:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.