Hypergraph Convolutional Networks via Equivalency between Hypergraphs
and Undirected Graphs
- URL: http://arxiv.org/abs/2203.16939v1
- Date: Thu, 31 Mar 2022 10:46:47 GMT
- Title: Hypergraph Convolutional Networks via Equivalency between Hypergraphs
and Undirected Graphs
- Authors: Jiying Zhang, Fuyang Li, Xi Xiao, Tingyang Xu, Yu Rong, Junzhou Huang
and Yatao Bian
- Abstract summary: We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs. Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
- Score: 59.71134113268709
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As a powerful tool for modeling complex relationships, hypergraphs are
gaining popularity in the graph learning community. However, commonly used
frameworks in deep hypergraph learning focus on hypergraphs with
\textit{edge-independent vertex weights} (EIVWs), without considering
hypergraphs with \textit{edge-dependent vertex weights} (EDVWs), which have more
modeling power. To address this, in this paper we present General
Hypergraph Spectral Convolution (GHSC), a general learning framework that not
only handles both EDVW and EIVW hypergraphs but, more importantly, enables
the explicit, theoretically grounded use of existing powerful Graph Convolutional
Neural Networks (GCNNs), largely easing the design of Hypergraph Neural
Networks. In this framework, the graph Laplacian of a given undirected GCNN
is replaced with a unified hypergraph Laplacian that incorporates vertex weight
information from a random walk perspective by equating our defined generalized
hypergraphs with simple undirected graphs. Extensive experiments from various
domains, including social network analysis, visual object classification, and
protein learning, demonstrate that the proposed framework achieves
state-of-the-art performance.
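The EDVW random walk underlying this kind of hypergraph Laplacian can be made concrete with a small sketch. The following is an illustrative numpy example only, not the paper's exact operator: the toy matrices `R` (edge-dependent vertex weights) and `w` (hyperedge weights) are made-up data, and the walk shown (vertex picks an incident hyperedge proportionally to its weight, then a vertex within it proportionally to its edge-dependent weight) is the standard EDVW walk from the hypergraph literature.

```python
import numpy as np

# Toy hypergraph: 4 vertices, 2 hyperedges.
# R[e, v] = edge-dependent vertex weight gamma_e(v) (0 if v not in e).
R = np.array([
    [1.0, 2.0, 1.0, 0.0],   # hyperedge e0 contains v0, v1, v2
    [0.0, 1.0, 1.0, 3.0],   # hyperedge e1 contains v1, v2, v3
])
w = np.array([1.0, 2.0])     # hyperedge weights w(e)

H = (R > 0).astype(float)    # plain 0/1 incidence matrix, shape (edges, vertices)

# Random-walk transition matrix: from vertex v, pick an incident hyperedge e
# with probability proportional to w(e), then a vertex u in e with probability
# proportional to gamma_e(u).
d_v = H.T @ w                # vertex degrees: sum of w(e) over incident edges
delta = R.sum(axis=1)        # hyperedge "volume": sum of gamma_e(u) over u in e
P = (H.T * (w / delta)) @ R / d_v[:, None]

# Random-walk hypergraph Laplacian; GHSC-style frameworks plug an operator
# like this into a GCNN in place of the ordinary graph Laplacian.
L_rw = np.eye(P.shape[0]) - P
```

When all `gamma_e(v)` within a hyperedge are equal, the walk reduces to the usual EIVW case, which is why a single operator of this form can cover both settings.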
Related papers
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose HyperGraph Transformer (HyperGT), a novel hypergraph learning framework.
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework built on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
arXiv Detail & Related papers (2021-09-22T12:24:02Z) - UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks [8.777765815864367]
Hypergraph, an expressive structure with flexibility to model the higher-order correlations among entities, has recently attracted increasing attention from various research domains.
We propose UniGNN, a unified framework for interpreting the message passing process in graph and hypergraph neural networks.
Experiments have been conducted to demonstrate the effectiveness of UniGNN on multiple real-world datasets.
arXiv Detail & Related papers (2021-05-03T15:48:34Z) - NetVec: A Scalable Hypergraph Embedding System [1.8979377273990425]
We introduce NetVec, a novel framework for scalable unsupervised hypergraph embedding.
We show that NetVec can be coupled with any graph embedding algorithm to produce embeddings of hypergraphs with millions of nodes and hyperedges in a few minutes.
arXiv Detail & Related papers (2021-03-09T18:06:56Z) - HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs.
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
arXiv Detail & Related papers (2020-10-09T13:28:06Z) - HNHN: Hypergraph Networks with Hyperedge Neurons [90.15253035487314]
HNHN is a hypergraph convolution network with nonlinear activation functions applied to both hypernodes and hyperedges.
We demonstrate improved performance of HNHN in both classification accuracy and speed on real-world datasets when compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-06-22T14:08:32Z) - Iterative Deep Graph Learning for Graph Neural Networks: Better and
Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z) - Line Hypergraph Convolution Network: Applying Graph Convolution for
Hypergraphs [18.7475578342125]
We propose a novel technique to apply graph convolution on hypergraphs with variable hyperedge sizes.
We use the classical concept of the line graph of a hypergraph, for the first time in the hypergraph learning literature.
arXiv Detail & Related papers (2020-02-09T16:05:17Z)
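Several of the frameworks listed above (e.g. UniGNN, HyperSAGE, HNHN) share a common two-stage vertex-to-hyperedge-to-vertex message-passing pattern. The following numpy sketch illustrates only that shared skeleton with mean aggregation and made-up toy data; it is not any listed paper's exact layer, which would add learned weight matrices, nonlinearities, and attention.

```python
import numpy as np

# Toy hypergraph: 4 vertices, 2 hyperedges. H[v, e] = 1 iff vertex v is in hyperedge e.
H = np.array([
    [1, 0],
    [1, 1],
    [1, 1],
    [0, 1],
], dtype=float)
X = np.arange(12, dtype=float).reshape(4, 3)  # vertex features

# Stage 1: each hyperedge aggregates its member vertices (mean over edge size).
edge_feat = (H.T @ X) / H.sum(axis=0)[:, None]

# Stage 2: each vertex aggregates its incident hyperedges (mean over degree).
# A real layer would apply a learned linear map and nonlinearity here.
X_new = (H @ edge_feat) / H.sum(axis=1)[:, None]
```

Because the same two-stage update reduces to ordinary neighbor aggregation when every hyperedge has exactly two vertices, frameworks built on this pattern can treat graph neural networks as a special case.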
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.