UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks
- URL: http://arxiv.org/abs/2105.00956v1
- Date: Mon, 3 May 2021 15:48:34 GMT
- Title: UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks
- Authors: Jing Huang, Jie Yang
- Abstract summary: Hypergraphs, expressive structures with the flexibility to model higher-order correlations among entities, have recently attracted increasing attention from various research domains.
We propose UniGNN, a unified framework for interpreting the message passing process in graph and hypergraph neural networks.
Experiments have been conducted to demonstrate the effectiveness of UniGNN on multiple real-world datasets.
- Score: 8.777765815864367
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Hypergraphs, expressive structures with the flexibility to model
higher-order correlations among entities, have recently attracted increasing
attention from various research domains. Despite the success of Graph Neural
Networks (GNNs) for graph representation learning, how to adapt powerful GNN
variants directly to hypergraphs remains a challenging problem. In this
paper, we propose UniGNN, a unified framework for interpreting the message
passing process in graph and hypergraph neural networks, which can generalize
GNN models to hypergraphs. In this framework, meticulously designed
architectures aiming to deepen GNNs can also be incorporated into hypergraphs
with minimal effort. Extensive experiments demonstrate the effectiveness of
UniGNN on multiple real-world datasets, where it outperforms state-of-the-art
approaches by a large margin. In particular, on the DBLP dataset, we increase
the accuracy from 77.4% to 88.8% in the semi-supervised hypernode
classification task. We further prove that the proposed message-passing-based
UniGNN models are at most as powerful as the 1-dimensional Generalized
Weisfeiler-Leman (1-GWL) algorithm in distinguishing non-isomorphic
hypergraphs. Our code is available at https://github.com/OneForward/UniGNN.
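The abstract's two-stage view of hypergraph message passing (aggregate vertex features into each hyperedge, then aggregate incident hyperedge features back into each vertex) can be sketched in a few lines. The mean aggregator, degree normalization, and ReLU below are illustrative choices, not necessarily the exact operators from the paper:

```python
# A minimal NumPy sketch of UniGNN-style two-stage message passing:
# vertex -> hyperedge, then hyperedge -> vertex.
import numpy as np

def unignn_layer(X, hyperedges, W):
    """X: (n, d) vertex features; hyperedges: list of vertex-index lists;
    W: (d, d_out) learnable weights."""
    n = X.shape[0]
    # Stage 1: aggregate vertex features into each hyperedge (mean as phi_1).
    H = np.stack([X[list(e)].mean(axis=0) for e in hyperedges])
    # Stage 2: each vertex aggregates its incident hyperedge features (phi_2).
    out = np.zeros((n, W.shape[1]))
    deg = np.zeros(n)
    for k, e in enumerate(hyperedges):
        for i in e:
            out[i] += H[k] @ W
            deg[i] += 1
    out /= np.maximum(deg, 1)[:, None]      # degree normalization
    return np.maximum(out, 0)               # ReLU nonlinearity

X = np.random.randn(5, 8)                   # 5 vertices, 8 features
edges = [[0, 1, 2], [2, 3], [1, 3, 4]]      # three hyperedges
print(unignn_layer(X, edges, np.random.randn(8, 8)).shape)  # (5, 8)
```

Swapping different aggregators into the two stages is what lets the framework carry existing GNN designs, including deeper architectures, over to hypergraphs.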
Related papers
- Self-Supervised Pretraining for Heterogeneous Hypergraph Neural Networks [9.987252149421982]
We present a novel self-supervised pretraining framework for heterogeneous HyperGNNs.
Our method effectively captures higher-order relations among entities in the data in a self-supervised manner.
Our experiments show that our proposed framework consistently outperforms state-of-the-art baselines in various downstream tasks.
arXiv Detail & Related papers (2023-11-19T16:34:56Z)
- MAG-GNN: Reinforcement Learning Boosted Graph Neural Network [68.60884768323739]
A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity and achieved great success.
This expressivity, however, comes at the cost of efficiency, since all possible subgraphs must be enumerated.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem.
arXiv Detail & Related papers (2023-10-29T20:32:21Z)
- UniG-Encoder: A Universal Feature Encoder for Graph and Hypergraph Node Classification [6.977634174845066]
UniG-Encoder is a universal feature encoder designed for both graph and hypergraph representation learning.
The architecture starts with a forward transformation of the topological relationships of connected nodes into edge or hyperedge features.
The encoded node embeddings are then derived from the reversed transformation, described by the transpose of the projection matrix.
arXiv Detail & Related papers (2023-08-03T09:32:50Z)
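The UniG-Encoder summary above describes a forward projection of node features onto (hyper)edges and a reverse mapping through the transpose of the same projection matrix. A minimal sketch of that round trip, assuming a row-normalized incidence matrix as the projection (an illustrative choice, not necessarily the paper's construction):

```python
# Forward/reverse projection sketch: nodes -> edges via P, back via P^T.
import numpy as np

def unig_encoder(X, hyperedges, W):
    n, m = X.shape[0], len(hyperedges)
    P = np.zeros((m, n))
    for k, e in enumerate(hyperedges):
        P[k, list(e)] = 1.0 / len(e)        # row-normalized incidence
    E = P @ X                                # forward: nodes -> edge features
    E = np.maximum(E @ W, 0)                 # transform edge features
    return P.T @ E                           # reverse: edges -> node embeddings

X = np.random.randn(5, 8)
edges = [[0, 1, 2], [2, 3], [1, 3, 4]]
print(unig_encoder(X, edges, np.random.randn(8, 16)).shape)  # (5, 16)
```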
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework that passes messages through high-order outer products of features.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
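For the THNN entry above, "high-order outer product" message passing can be illustrated on a 3-uniform hypergraph: each vertex of a hyperedge receives a message built from the outer product of the other two vertices' features. Flattening the outer product and contracting with a single weight matrix is a simplifying assumption; the paper's actual parameterization may differ:

```python
# Outer-product message passing on a 3-uniform hypergraph (toy sketch).
import numpy as np

def thnn_layer(X, triples, W):
    """X: (n, d); triples: list of 3-vertex hyperedges; W: (d*d, d_out)."""
    n, d = X.shape
    out = np.zeros((n, W.shape[1]))
    for (i, j, k) in triples:
        # each vertex gets a message from the other two vertices' outer product
        out[i] += np.outer(X[j], X[k]).ravel() @ W
        out[j] += np.outer(X[i], X[k]).ravel() @ W
        out[k] += np.outer(X[i], X[j]).ravel() @ W
    return np.maximum(out, 0)

X = np.random.randn(5, 4)
print(thnn_layer(X, [(0, 1, 2), (2, 3, 4)], np.random.randn(16, 8)).shape)
```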
- Geodesic Graph Neural Network for Efficient Graph Representation Learning [34.047527874184134]
We propose an efficient GNN framework called Geodesic GNN (GDGNN).
It injects conditional relationships between nodes into the model without labeling.
Conditioned on the geodesic representations, GDGNN is able to generate node, link, and graph representations that carry much richer structural information than plain GNNs.
arXiv Detail & Related papers (2022-10-06T02:02:35Z)
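A hedged sketch of the geodesic conditioning described for GDGNN above: a link representation is built from the endpoint embeddings plus a pooled embedding of the nodes on a shortest path between them. BFS path recovery and mean pooling are illustrative choices, and the endpoints are assumed connected:

```python
# Condition a link representation on nodes along a geodesic (toy sketch).
import numpy as np
from collections import deque

def shortest_path(adj, u, v):
    """BFS shortest path in an adjacency-list graph; assumes v is reachable."""
    prev = {u: None}
    q = deque([u])
    while q:
        x = q.popleft()
        if x == v:
            break
        for y in adj[x]:
            if y not in prev:
                prev[y] = x
                q.append(y)
    path, x = [], v
    while x is not None:
        path.append(x)
        x = prev[x]
    return path[::-1]

def geodesic_link_repr(H, adj, u, v):
    """H: (n, d) node embeddings from any GNN; concatenate endpoint
    embeddings with a mean-pooled embedding of the geodesic nodes."""
    path = shortest_path(adj, u, v)
    return np.concatenate([H[u], H[v], H[path].mean(axis=0)])

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
H = np.random.randn(4, 8)
print(geodesic_link_repr(H, adj, 0, 3).shape)  # (24,)
```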
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle edge-dependent vertex weight (EDVW) and edge-independent vertex weight (EIVW) hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
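For context on the GHSC entry above, the classic hypergraph spectral convolution takes the form D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2} X Theta; GHSC generalizes this family to weighted (EDVW/EIVW) settings. The sketch below implements only the standard operator, not GHSC's exact construction:

```python
# Standard HGNN-style hypergraph spectral convolution (toy sketch).
import numpy as np

def hypergraph_conv(X, H, w, Theta):
    """X: (n, d); H: (n, m) incidence matrix; w: (m,) hyperedge weights."""
    Dv = (H * w).sum(axis=1)                 # vertex degrees
    De = H.sum(axis=0)                       # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(Dv, 1e-12)))
    A = Dv_inv_sqrt @ H @ np.diag(w) @ np.diag(1.0 / De) @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0)

H = np.array([[1, 0], [1, 1], [0, 1], [0, 1]], dtype=float)  # 4 nodes, 2 edges
X = np.random.randn(4, 8)
print(hypergraph_conv(X, H, np.ones(2), np.random.randn(8, 8)).shape)
```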
- Residual Enhanced Multi-Hypergraph Neural Network [26.42547421121713]
HyperGraph Neural Network (HGNN) is the de facto method for hypergraph representation learning.
We propose the Residual enhanced Multi-Hypergraph Neural Network, which can fuse multi-modal information from each hypergraph effectively.
arXiv Detail & Related papers (2021-05-02T14:53:32Z)
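The residual multi-hypergraph fusion described above can be sketched as a per-modality convolution followed by fusion and an identity residual; mean fusion and the precomputed per-modality operators below are assumptions for illustration:

```python
# Residual fusion over multiple hypergraphs, one per modality (toy sketch).
import numpy as np

def multi_hypergraph_layer(X, operators, Theta):
    """X: (n, d); operators: list of (n, n) precomputed hypergraph
    convolution operators, one per modality; Theta: (d, d) weights."""
    fused = np.mean([A @ X @ Theta for A in operators], axis=0)
    return X + np.maximum(fused, 0)          # residual connection

n, d = 6, 8
ops = [np.eye(n) for _ in range(3)]          # placeholder operators
X = np.random.randn(n, d)
print(multi_hypergraph_layer(X, ops, np.random.randn(d, d)).shape)
```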
- A Unified Lottery Ticket Hypothesis for Graph Neural Networks [82.31087406264437]
We present a unified GNN sparsification (UGS) framework that simultaneously prunes the graph adjacency matrix and the model weights.
We further generalize the popular lottery ticket hypothesis to GNNs for the first time by defining a graph lottery ticket (GLT) as a pair of a core sub-dataset and a sparse sub-network.
arXiv Detail & Related papers (2021-02-12T21:52:43Z)
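The UGS summary above describes jointly sparsifying the graph and the model. Here is a schematic sketch of one magnitude-pruning round over an adjacency mask and a weight mask; the pruning fractions are illustrative:

```python
# One round of UGS-style joint magnitude pruning (toy sketch).
import numpy as np

def prune_mask(scores, frac):
    """Zero out the lowest-|score| fraction of entries."""
    thresh = np.quantile(np.abs(scores), frac)
    return (np.abs(scores) >= thresh).astype(float)

A = np.random.rand(5, 5)                      # dense adjacency (toy)
W = np.random.randn(8, 8)                     # GNN weight matrix (toy)
m_A = prune_mask(A, frac=0.2)                 # drop 20% of edges
m_W = prune_mask(W, frac=0.2)                 # drop 20% of weights
A_sparse, W_sparse = A * m_A, W * m_W         # a candidate "graph lottery ticket"
print(m_A.mean(), m_W.mean())                 # kept fractions
```

In the full method, the masked graph and network are then retrained, and pruning is iterated to find tickets that match the dense model's accuracy.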
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient because they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
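Distance Encoding, summarized above, augments node features with distances to a target node set so that otherwise-identical receptive fields become distinguishable. A small sketch using truncated BFS shortest-path distances (the distance cap is a standard but illustrative choice):

```python
# Append truncated shortest-path-distance features to node inputs.
import numpy as np
from collections import deque

def spd_from(adj, src, n, cap=3):
    dist = np.full(n, cap)                    # distances truncated at cap
    dist[src] = 0
    q = deque([src])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if dist[y] > dist[x] + 1:
                dist[y] = dist[x] + 1
                q.append(y)
    return dist

def distance_encode(X, adj, target_set, cap=3):
    n = X.shape[0]
    D = np.stack([spd_from(adj, s, n, cap) for s in target_set], axis=1)
    return np.concatenate([X, D], axis=1)     # append DE to raw features

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
X = np.random.randn(4, 8)
print(distance_encode(X, adj, target_set=[0, 3]).shape)  # (4, 10)
```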
- GPT-GNN: Generative Pre-Training of Graph Neural Networks [93.35945182085948]
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data.
We present the GPT-GNN framework to initialize GNNs by generative pre-training.
We show that GPT-GNN significantly outperforms state-of-the-art GNN models without pre-training by up to 9.1% across various downstream tasks.
arXiv Detail & Related papers (2020-06-27T20:12:33Z)
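GPT-GNN, summarized above, pre-trains a GNN with a generative objective over graph structure and attributes. The sketch below shows only a simplified attribute-reconstruction flavor of this idea, with a linear one-step propagation standing in for a real GNN encoder:

```python
# Simplified generative pre-training signal: mask and reconstruct attributes.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))                   # node attributes
A = (rng.random((6, 6)) < 0.4).astype(float)  # toy adjacency
A = np.maximum(A, A.T)                        # make it undirected

mask = rng.random(6) < 0.5                    # nodes whose attributes are hidden
mask[0] = True                                # ensure at least one masked node
X_in = X.copy()
X_in[mask] = 0.0                              # hide masked attributes

W = rng.normal(size=(8, 8))
H = np.maximum(A @ X_in @ W, 0)               # one toy propagation step
W_dec = rng.normal(size=(8, 8))
X_hat = H @ W_dec                             # decode attributes

loss = ((X_hat[mask] - X[mask]) ** 2).mean()  # pre-training objective
print(float(loss))
```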