HyperMagNet: A Magnetic Laplacian based Hypergraph Neural Network
- URL: http://arxiv.org/abs/2402.09676v1
- Date: Thu, 15 Feb 2024 03:05:45 GMT
- Title: HyperMagNet: A Magnetic Laplacian based Hypergraph Neural Network
- Authors: Tatyana Benko, Martin Buck, Ilya Amburg, Stephen J. Young, Sinan G.
Aksoy
- Abstract summary: We propose an alternative approach to hypergraph neural networks in which the hypergraph is represented as a non-reversible Markov chain.
We use this Markov chain to construct a complex Hermitian Laplacian matrix - the magnetic Laplacian - which serves as the input to our proposed hypergraph neural network.
- Score: 0.16874375111244327
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In data science, hypergraphs are natural models for data exhibiting multi-way
relations, whereas graphs capture only pairwise ones. Nonetheless, many proposed
hypergraph neural networks effectively reduce hypergraphs to undirected graphs
via symmetrized matrix representations, potentially losing important
information. We propose an alternative approach to hypergraph neural networks
in which the hypergraph is represented as a non-reversible Markov chain. We use
this Markov chain to construct a complex Hermitian Laplacian matrix - the
magnetic Laplacian - which serves as the input to our proposed hypergraph
neural network. We study HyperMagNet for the task of node classification, and
demonstrate its effectiveness over graph-reduction based hypergraph neural
networks.
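As a concrete illustration of the construction the abstract describes, the sketch below builds a vertex-level random walk from a hypergraph incidence matrix and turns its (in general asymmetric) transition matrix into a complex Hermitian magnetic Laplacian. This is an assumption-laden sketch, not the paper's exact method: the unweighted two-step walk, the charge parameter `q`, and all function names are illustrative, and the paper obtains non-reversibility via edge-dependent vertex weights rather than the plain walk used here.

```python
import numpy as np

def hypergraph_random_walk(H):
    """Transition matrix of a vertex-level random walk on a hypergraph.

    H: (n_nodes, n_edges) binary incidence matrix. A step picks an incident
    hyperedge uniformly, then a member node of that edge uniformly. (With
    edge-dependent vertex weights, as in the paper, the resulting chain is
    non-reversible in general; this unweighted version is only a stand-in.)
    """
    d_v = H.sum(axis=1)                         # node degrees
    d_e = H.sum(axis=0)                         # hyperedge sizes
    return (H / d_v[:, None]) @ (H.T / d_e[:, None])

def magnetic_laplacian(P, q=0.25):
    """Complex Hermitian (magnetic) Laplacian from an asymmetric matrix P.

    The charge parameter q controls how directional information enters the
    complex phase; this follows the standard magnetic-Laplacian recipe and
    may differ in details from the paper's construction.
    """
    W = 0.5 * (P + P.T)                         # symmetrized weights
    theta = 2 * np.pi * q * (P - P.T)           # antisymmetric phase
    Hc = W * np.exp(1j * theta)                 # Hermitian adjacency
    D = np.diag(W.sum(axis=1))
    return D - Hc

# toy hypergraph: 4 nodes, 2 hyperedges {0, 1, 2} and {2, 3}
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
L = magnetic_laplacian(hypergraph_random_walk(H))
assert np.allclose(L, L.conj().T)               # Hermitian by construction
eigvals = np.linalg.eigvalsh(L)                 # real spectrum, usable by a spectral GNN
```

Because the magnetic Laplacian is Hermitian, its spectrum is real, which is what lets it play the same role in a spectral convolution that the ordinary symmetric Laplacian plays for undirected graphs, while the complex phases retain the walk's directional information.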
Related papers
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
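The energy-minimization view above can be sketched in a few lines: take node embeddings to be the minimizer of a data-fit term plus a hypergraph smoothness regularizer, found by gradient descent. The clique-expansion-style Laplacian, the quadratic energy, and the hyperparameters below are illustrative assumptions; the paper proposes a more expressive parameterized family of energies.

```python
import numpy as np

def hypergraph_energy_embedding(X0, H, lam=1.0, lr=0.1, steps=200):
    """Minimize ||X - X0||^2 + lam * tr(X^T L X) by gradient descent.

    X0: (n_nodes, d) input features; H: (n_nodes, n_edges) incidence matrix.
    L is a clique-expansion-style hypergraph Laplacian, used here only as a
    simple stand-in for the paper's parameterized regularizers.
    """
    d_e = H.sum(axis=0)
    A = (H / d_e) @ H.T                     # edge-size-normalized adjacency
    Lap = np.diag(A.sum(axis=1)) - A        # combinatorial Laplacian (PSD)
    X = X0.copy()
    for _ in range(steps):
        grad = 2.0 * (X - X0) + 2.0 * lam * (Lap @ X)
        X -= lr * grad                      # descend the convex energy
    return X

# toy usage: 4 nodes, hyperedges {0, 1, 2} and {2, 3}, scalar features
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
X0 = np.array([[0.0], [1.0], [2.0], [3.0]])
X = hypergraph_energy_embedding(X0, H)      # smoothed node embeddings
```

The minimizer trades fidelity to the input features against smoothness over hyperedges, which is the sense in which energy minimizers "serve as node embeddings".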
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
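The Gumbel-Softmax trick mentioned above is a standard building block worth seeing concretely: it yields a differentiable relaxation of sampling from a categorical distribution. The sketch below shows only this generic trick; NodeFormer's contribution is combining it with a random-feature kernel approximation so the all-pair attention matrix is never materialized, which is not reproduced here.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Differentiable sample from a categorical distribution.

    Adding Gumbel(0,1) noise to the logits and taking a temperature-scaled
    softmax gives a relaxed one-hot vector; as tau -> 0 it approaches a hard
    argmax sample. Names and defaults here are illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = rng.gumbel(size=logits.shape)           # Gumbel(0,1) noise
    y = (logits + g) / tau
    y = y - y.max(axis=-1, keepdims=True)       # numerically stable softmax
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)

p = gumbel_softmax(np.array([1.0, 2.0, 3.0]), tau=0.5,
                   rng=np.random.default_rng(0))   # sums to 1, near one-hot
```

Because the relaxed sample is a smooth function of the logits, gradients flow through the (discrete-looking) structure-sampling step during training.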
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
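To make the high-order message passing concrete, the sketch below contracts the adjacency tensor of a 3-uniform hypergraph against products of neighbor features, so each node's message depends jointly on its co-members in a hyperedge rather than on pairwise terms. This is a loose illustration only: the elementwise product stands in for THNN's full outer-product construction, and the normalization and nonlinearity placement are assumptions.

```python
import numpy as np
from itertools import permutations

def high_order_layer(T, X, W):
    """One high-order message-passing step for a 3-uniform hypergraph.

    T: (n, n, n) adjacency tensor with T[i, j, k] = 1 iff {i, j, k} is a
    hyperedge; X: (n, d) node features; W: (d, d_out) weights.
    messages[i] = sum_{j,k} T[i,j,k] * (X[j] * X[k]), an elementwise stand-in
    for the outer-product pooling used by THNN.
    """
    msgs = np.einsum('ijk,jd,kd->id', T, X, X)
    return np.tanh(msgs @ W)

# toy 3-uniform hypergraph with a single hyperedge {0, 1, 2}
T = np.zeros((3, 3, 3))
for i, j, k in permutations(range(3)):
    T[i, j, k] = 1.0
X = np.random.default_rng(0).normal(size=(3, 2))
out = high_order_layer(T, X, np.eye(2))   # shape (3, 2)
```

The key point the tensor view captures is that a triple interaction cannot in general be factored into pairwise edges, which is exactly the information a clique-expansion-based model discards.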
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs
and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
arXiv Detail & Related papers (2021-09-22T12:24:02Z) - NetVec: A Scalable Hypergraph Embedding System [1.8979377273990425]
We introduce NetVec, a novel framework for scalable unsupervised hypergraph embedding.
We show that NetVec can be coupled with any graph embedding algorithm to produce embeddings of hypergraphs with millions of nodes and hyperedges in a few minutes.
arXiv Detail & Related papers (2021-03-09T18:06:56Z) - Noise-robust classification with hypergraph neural network [4.003697389752555]
This paper presents a novel version of the hypergraph neural network method.
The accuracies of five methods, including the proposed one, are evaluated and compared.
Experimental results show that the hypergraph neural network methods achieve the best performance as the noise level increases.
arXiv Detail & Related papers (2021-02-03T08:34:53Z) - Line Hypergraph Convolution Network: Applying Graph Convolution for
Hypergraphs [18.7475578342125]
We propose a novel technique to apply graph convolution on hypergraphs with variable hyperedge sizes.
We use the classical concept of the line graph of a hypergraph for the first time in the hypergraph learning literature.
arXiv Detail & Related papers (2020-02-09T16:05:17Z)
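The line-graph idea in the last entry is simple enough to sketch: build an ordinary graph with one node per hyperedge and connect two hyperedges whenever they share a vertex, after which any standard graph convolution applies. Weighting edges by intersection size is one common choice and an assumption here; the paper's exact weighting may differ.

```python
import numpy as np

def hypergraph_line_graph(H):
    """Adjacency matrix of the line graph of a hypergraph.

    H: (n_nodes, n_edges) binary incidence matrix. Entry (e, f) of the
    result counts the vertices shared by hyperedges e and f (zero on the
    diagonal), so two hyperedges are adjacent iff they overlap.
    """
    overlap = H.T @ H                 # (n_edges, n_edges) intersection sizes
    A = overlap.copy()
    np.fill_diagonal(A, 0)            # drop self-loops
    return A

# toy hypergraph: edges e0={0,1}, e1={1,2}, e2={2,3}
H = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
A = hypergraph_line_graph(H)          # e0-e1 and e1-e2 adjacent; e0-e2 not
```

This reduction handles variable hyperedge sizes for free, since a hyperedge of any size becomes a single line-graph node.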
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.