A Unified View Between Tensor Hypergraph Neural Networks And Signal
Denoising
- URL: http://arxiv.org/abs/2309.08385v1
- Date: Fri, 15 Sep 2023 13:19:31 GMT
- Title: A Unified View Between Tensor Hypergraph Neural Networks And Signal
Denoising
- Authors: Fuli Wang, Karelia Pena-Pena, Wei Qian, Gonzalo R. Arce
- Abstract summary: The tensor-hypergraph convolutional network (T-HGCN) has emerged as a powerful architecture for preserving higher-order interactions on hypergraphs, and this work shows an equivalence between a hypergraph signal denoising (HyperGSD) problem and T-HGCN.
We further design a tensor-hypergraph iterative network (T-HGIN) based on the HyperGSD problem, which takes advantage of a multi-step updating scheme in every single layer.
- Score: 7.083679120873857
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hypergraph Neural Networks (HyperGNNs) and hypergraph signal denoising
(HyperGSD) are two fundamental topics in higher-order network modeling.
Understanding the connection between these two domains is particularly useful
for designing novel HyperGNNs from a HyperGSD perspective, and vice versa. In
particular, the tensor-hypergraph convolutional network (T-HGCN) has emerged as
a powerful architecture for preserving higher-order interactions on
hypergraphs, and this work shows an equivalence relation between a HyperGSD
problem and the T-HGCN. Inspired by this intriguing result, we further design a
tensor-hypergraph iterative network (T-HGIN) based on the HyperGSD problem,
which takes advantage of a multi-step updating scheme in every single layer.
Numerical experiments are conducted to show the promising applications of the
proposed T-HGIN approach.
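The abstract's key idea, that a HyperGSD objective solved by iterative updates yields a multi-step layer, can be illustrated concretely. The paper works with a tensor (t-product) formulation; the sketch below is a simplified matrix-based analogue only, and the names `hypergraph_laplacian` and `hgsd_iterations`, the unit hyperedge weights, and the gradient-descent solver are our assumptions, not the authors' exact construction. It denoises a node signal by minimizing a fidelity term plus a hypergraph smoothness regularizer, with several update steps playing the role of one T-HGIN-style layer.

```python
import numpy as np

def hypergraph_laplacian(H):
    """Normalized hypergraph Laplacian L = I - Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}.

    H is a |V| x |E| incidence matrix; unit hyperedge weights are assumed.
    """
    dv = H.sum(axis=1)                                  # node degrees
    de = H.sum(axis=0)                                  # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(de, 1e-12))
    theta = Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    return np.eye(H.shape[0]) - theta

def hgsd_iterations(X0, L, lam=1.0, step=0.1, K=10):
    """Gradient descent on  f(X) = ||X - X0||_F^2 + lam * tr(X^T L X).

    Running K small steps inside one layer mirrors the multi-step
    updating scheme the abstract attributes to T-HGIN.
    """
    X = X0.copy()
    for _ in range(K):
        grad = 2.0 * (X - X0) + 2.0 * lam * (L @ X)
        X = X - step * grad
    return X
```

Each iteration trades off staying close to the observed signal `X0` against smoothness across hyperedges; a single-step truncation of this scheme corresponds to a convolution-style layer, which is the flavor of equivalence the paper establishes.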
Related papers
- DPHGNN: A Dual Perspective Hypergraph Neural Networks [15.079509975815572]
We propose DPHGNN, a novel dual-perspective HGNN that introduces equivariant operator learning to capture lower-order semantics.
We benchmark DPHGNN over eight benchmark hypergraph datasets for the semi-supervised hypernode classification task.
DPHGNN was deployed by our partner e-commerce company for the Return-to-Origin (RTO) prediction task, which shows 7% higher macro F1-Score than the best baseline.
arXiv Detail & Related papers (2024-05-26T16:08:55Z)
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based textbfTensorized textbfHypergraph textbfNeural textbfNetwork (THNN)
THNN is faithful hypergraph modeling framework through high-order outer product feature passing message.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply contrastive learning, as used for images and graphs (we refer to it as HyperGCL), to improve the generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z)
- Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operators.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z)
- Preventing Over-Smoothing for Hypergraph Neural Networks [0.0]
We show that the performance of hypergraph neural networks does not improve as the number of layers increases.
We develop a new deep hypergraph convolutional network called Deep-HGCN, which can maintain node representation in deep layers.
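The exact Deep-HGCN update rule is not given in this listing; as a hedged illustration of how a deep hypergraph layer can maintain node representations, the following sketches a generic initial-residual propagation step, a common remedy for over-smoothing in deep graph models. The names `deep_layer`, `theta`, and `alpha` are our own illustrative choices, not taken from the paper.

```python
import numpy as np

def deep_layer(X, X0, theta, W, alpha=0.1):
    """One propagation step with an initial-residual connection.

    X     : current node features
    X0    : input (layer-0) features, re-injected to resist over-smoothing
    theta : hypergraph propagation operator (e.g. a normalized incidence product)
    W     : learnable weight matrix
    alpha : fraction of initial features mixed back in each layer
    """
    P = theta @ X                        # smooth features over the hypergraph
    Hmix = (1.0 - alpha) * P + alpha * X0  # initial residual keeps signal distinct
    return np.maximum(Hmix @ W, 0.0)     # ReLU nonlinearity
```

Because every layer mixes back a fraction `alpha` of the original features, stacking many such layers cannot collapse all node representations to a constant, which is the failure mode the summary describes.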
arXiv Detail & Related papers (2022-03-31T16:33:31Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- Residual Enhanced Multi-Hypergraph Neural Network [26.42547421121713]
HyperGraph Neural Network (HGNN) is the de-facto method for hypergraph representation learning.
We propose the Residual enhanced Multi-Hypergraph Neural Network, which can fuse multi-modal information from each hypergraph effectively.
arXiv Detail & Related papers (2021-05-02T14:53:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.