Residual Enhanced Multi-Hypergraph Neural Network
- URL: http://arxiv.org/abs/2105.00490v1
- Date: Sun, 2 May 2021 14:53:32 GMT
- Title: Residual Enhanced Multi-Hypergraph Neural Network
- Authors: Jing Huang, Xiaolin Huang and Jie Yang
- Abstract summary: HyperGraph Neural Network (HGNN) is the de-facto method for hypergraph representation learning.
We propose the Residual enhanced Multi-Hypergraph Neural Network, which can fuse multi-modal information from each hypergraph effectively.
- Score: 26.42547421121713
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Hypergraphs are a generalized data structure of graphs to model higher-order
correlations among entities, which have been successfully adopted into various
research domains. Meanwhile, HyperGraph Neural Network (HGNN) is currently the
de-facto method for hypergraph representation learning. However, HGNN aims at
single hypergraph learning and uses a pre-concatenation approach when
confronting multi-modal datasets, which leads to sub-optimal exploitation of
the inter-correlations of multi-modal hypergraphs. HGNN also suffers from the
over-smoothing issue: its performance drops significantly as more layers
are stacked. To resolve these issues, we propose the Residual enhanced
Multi-Hypergraph Neural Network, which can not only fuse multi-modal
information from each hypergraph effectively, but also circumvent the
over-smoothing issue associated with HGNN. We conduct experiments on two 3D
benchmarks, the NTU and the ModelNet40 datasets, and compare against multiple
state-of-the-art methods. Experimental results demonstrate that both the
residual hypergraph convolutions and the multi-fusion architecture can improve
the performance of the base model and the combined model achieves a new
state-of-the-art. Code is available at
\url{https://github.com/OneForward/ResMHGNN}.
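The two ideas in the abstract, residual hypergraph convolution and multi-hypergraph fusion, can be illustrated with a minimal NumPy sketch of the standard HGNN propagation rule X' = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta, extended with an identity skip connection and a simple averaging fusion. The function names, uniform hyperedge weights, ReLU nonlinearity, and averaging scheme are illustrative assumptions, not the authors' exact ResMHGNN implementation (see the linked repository for that).

```python
import numpy as np

def residual_hypergraph_conv(X, H, Theta, residual=True):
    """One hypergraph convolution layer (HGNN propagation rule)
    with an optional identity skip connection.

    X     : (n, d) node features
    H     : (n, e) incidence matrix (H[v, e] = 1 if node v is in hyperedge e)
    Theta : (d, d) weight matrix (learnable in a real model)
    """
    n, e = H.shape
    W = np.eye(e)                          # uniform hyperedge weights (assumption)
    Dv = H @ W @ np.ones(e)                # vertex degrees
    De = H.T @ np.ones(n)                  # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    # Normalized propagation operator: Dv^-1/2 H W De^-1 H^T Dv^-1/2
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    out = np.maximum(A @ X @ Theta, 0.0)   # ReLU nonlinearity
    if residual:
        out = out + X                      # skip connection counters over-smoothing
    return out

def fuse_multi_hypergraphs(X, Hs, Theta):
    """Fuse several modality-specific hypergraphs by averaging their
    residual-convolution outputs (an illustrative fusion scheme, not
    necessarily the paper's multi-fusion architecture)."""
    outs = [residual_hypergraph_conv(X, H, Theta) for H in Hs]
    return sum(outs) / len(outs)
```

Because the ReLU output is non-negative, the residual layer's output differs from the plain layer's output by exactly X, which makes the skip connection easy to check on toy data.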
Related papers
- Scalable Weibull Graph Attention Autoencoder for Modeling Document Networks [50.42343781348247]
We develop a graph Poisson factor analysis (GPFA) which provides analytic conditional posteriors to improve the inference accuracy.
We also extend GPFA to a multi-stochastic-layer version named graph Poisson gamma belief network (GPGBN) to capture the hierarchical document relationships at multiple semantic levels.
Our models can extract high-quality hierarchical latent document representations and achieve promising performance on various graph analytic tasks.
arXiv Detail & Related papers (2024-10-13T02:22:14Z) - Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, the HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - Self-Supervised Pretraining for Heterogeneous Hypergraph Neural Networks [9.987252149421982]
We present a novel self-supervised pretraining framework for heterogeneous HyperGNNs.
Our method is able to effectively capture higher-order relations among entities in the data in a self-supervised manner.
Our experiments show that our proposed framework consistently outperforms state-of-the-art baselines in various downstream tasks.
arXiv Detail & Related papers (2023-11-19T16:34:56Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - DualHGNN: A Dual Hypergraph Neural Network for Semi-Supervised Node Classification based on Multi-View Learning and Density Awareness [3.698434507617248]
Graph-based semi-supervised node classification has become a state-of-the-art approach in many applications of high research value and significance.
This paper proposes the Dual Hypergraph Neural Network (DualHGNN), a new dual connection model integrating both hypergraph structure learning and hypergraph representation learning simultaneously in a unified architecture.
arXiv Detail & Related papers (2023-06-07T07:40:04Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework built on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operators.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z) - Preventing Over-Smoothing for Hypergraph Neural Networks [0.0]
We show that the performance of hypergraph neural networks does not improve as the number of layers increases.
We develop a new deep hypergraph convolutional network called Deep-HGCN, which can maintain node representation in deep layers.
arXiv Detail & Related papers (2022-03-31T16:33:31Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - UniGNN: a Unified Framework for Graph and Hypergraph Neural Networks [8.777765815864367]
Hypergraph, an expressive structure with flexibility to model the higher-order correlations among entities, has recently attracted increasing attention from various research domains.
We propose UniGNN, a unified framework for interpreting the message passing process in graph and hypergraph neural networks.
Experiments have been conducted to demonstrate the effectiveness of UniGNN on multiple real-world datasets.
arXiv Detail & Related papers (2021-05-03T15:48:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.