HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs
- URL: http://arxiv.org/abs/2010.04558v1
- Date: Fri, 9 Oct 2020 13:28:06 GMT
- Title: HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs
- Authors: Devanshu Arya, Deepak K. Gupta, Stevan Rudinac and Marcel Worring
- Abstract summary: We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs.
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
- Score: 24.737560790401314
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs are the most ubiquitous form of structured data representation used in
machine learning. They model, however, only pairwise relations between nodes
and are not designed for encoding the higher-order relations found in many
real-world datasets. To model such complex relations, hypergraphs have proven
to be a natural representation. Learning the node representations in a
hypergraph is more complex than in a graph as it involves information
propagation at two levels: within every hyperedge and across the hyperedges.
Most current approaches first transform a hypergraph structure to a graph for
use in existing geometric deep learning algorithms. This transformation leads
to information loss, and sub-optimal exploitation of the hypergraph's
expressive power. We present HyperSAGE, a novel hypergraph learning framework
that uses a two-level neural message passing strategy to accurately and
efficiently propagate information through hypergraphs. The flexible design of
HyperSAGE facilitates different ways of aggregating neighborhood information.
Unlike the majority of related work which is transductive, our approach,
inspired by the popular GraphSAGE method, is inductive. Thus, it can also be
used on previously unseen nodes, facilitating deployment in problems such as
evolving or partially observed hypergraphs. Through extensive experimentation,
we show that HyperSAGE outperforms state-of-the-art hypergraph learning methods
on representative benchmark datasets. We also demonstrate that the higher
expressive power of HyperSAGE makes it more stable in learning node
representations as compared to the alternatives.
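The two-level aggregation described above can be sketched in a few lines of NumPy. This is a minimal illustration of the idea, not the exact HyperSAGE update rule: it assumes simple mean aggregators at both levels and a self-plus-neighborhood combine step with L2 normalization, GraphSAGE-style, while the paper's flexible design admits other, generalized aggregation functions.

```python
import numpy as np

def two_level_message_passing(X, hyperedges):
    """One round of two-level aggregation on a hypergraph (sketch).

    Level 1 aggregates node features within each hyperedge; level 2
    aggregates the resulting hyperedge messages at each node.
    X: (num_nodes, dim) feature matrix; hyperedges: list of node-index lists.
    """
    num_nodes, _ = X.shape
    # Level 1: intra-hyperedge aggregation (mean over member nodes).
    edge_msgs = [X[list(e)].mean(axis=0) for e in hyperedges]
    # Level 2: inter-hyperedge aggregation at each node.
    agg = np.zeros_like(X)
    counts = np.zeros(num_nodes)
    for e, msg in zip(hyperedges, edge_msgs):
        for v in e:
            agg[v] += msg
            counts[v] += 1
    counts[counts == 0] = 1.0  # isolated nodes keep a zero message
    agg = agg / counts[:, None]
    # Combine self features with the aggregated message and L2-normalize.
    out = X + agg
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    norms[norms == 0] = 1.0
    return out / norms
```

Because the update never materializes a clique-expanded graph, no hyperedge structure is lost, and it applies unchanged to nodes unseen at training time.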
Related papers
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose HyperGraph Transformer (HyperGT), a novel hypergraph learning framework.

HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Learning from Heterogeneity: A Dynamic Learning Framework for Hypergraphs [22.64740740462169]
We propose a hypergraph learning framework named LFH that is capable of dynamic hyperedge construction and attentive embedding update.
To evaluate the effectiveness of our proposed framework, we conduct comprehensive experiments on several popular datasets.
arXiv Detail & Related papers (2023-07-07T06:26:44Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
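The idea of taking energy minimizers as node embeddings can be illustrated with a toy example. This sketch is not the paper's parameterized family of energy functions: it assumes a simple energy made of a data-fit term plus a hypergraph smoothness term that pulls each node toward the mean of every hyperedge containing it, minimized by plain gradient descent.

```python
import numpy as np

def energy_embeddings(X0, hyperedges, lam=1.0, lr=0.1, steps=200):
    """Node embeddings as approximate minimizers of a toy hypergraph energy.

    Energy (illustrative): ||Y - X0||^2 + lam * sum over hyperedges e of
    sum_{v in e} ||y_v - mean(e)||^2, with the per-edge mean treated as
    fixed within each gradient step (a common simplification).
    """
    Y = X0.astype(float).copy()
    for _ in range(steps):
        grad = 2.0 * (Y - X0)  # gradient of the data-fit term
        for e in hyperedges:
            idx = list(e)
            mean = Y[idx].mean(axis=0)
            grad[idx] += 2.0 * lam * (Y[idx] - mean)  # smoothness term
        Y -= lr * grad
    return Y
```

Nodes that share a hyperedge are pulled toward each other, while the data-fit term keeps each embedding anchored near its input features.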
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework that passes messages via high-order outer products of node features.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
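The high-order outer-product interaction at the heart of this entry can be shown in miniature. This is an illustrative sketch of the general idea, not THNN's exact adjacency-tensor operator: it assumes a hyperedge message formed as the tensor (outer) product of its member nodes' feature vectors.

```python
import numpy as np

def outer_product_message(member_features):
    """Tensor (outer) product of member-node feature vectors for one
    hyperedge. For a k-node hyperedge with d-dimensional features, the
    result is a k-way tensor of shape (d, d, ..., d), capturing all
    multiplicative interactions among the members' features.
    """
    msg = member_features[0]
    for f in member_features[1:]:
        msg = np.tensordot(msg, f, axes=0)  # axes=0 gives the outer product
    return msg
```

Unlike a sum or mean, this message grows with the hyperedge's arity, which is why such models pair it with tensor decompositions to keep cost manageable.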
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply the contrastive learning approach from images/graphs (we refer to it as HyperGCL) to improve generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z)
- All the World's a (Hyper)Graph: A Data Drama [55.144729234861316]
Hyperbard is a dataset of diverse data representations from Shakespeare's plays.
Our representations range from simple graphs capturing character co-occurrence in single scenes to hypergraphs encoding complex communication settings.
As an homage to our data source, and asserting that science can also be art, we present all our points in the form of a play.
arXiv Detail & Related papers (2022-06-16T14:51:28Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
arXiv Detail & Related papers (2021-09-22T12:24:02Z)
- Learnable Hypergraph Laplacian for Hypergraph Learning [34.28748027233654]
HyperGraph Convolutional Neural Networks (HGCNNs) have demonstrated their potential in modeling high-order relations preserved in graph structured data.
We propose the first learning-based method tailored for constructing adaptive hypergraph structure, termed HypERgrAph Laplacian aDaptor (HERALD).
HERALD adaptively optimizes the adjacency relationship between hypernodes and hyperedges in an end-to-end manner, so that a task-aware hypergraph is learned.
arXiv Detail & Related papers (2021-06-12T02:07:07Z)
- Learning over Families of Sets -- Hypergraph Representation Learning for Higher Order Tasks [12.28143554382742]
We develop a hypergraph neural network to learn provably expressive representations of variable sized hyperedges.
We evaluate performance on multiple real-world hypergraph datasets and demonstrate consistent, significant improvement in accuracy, over state-of-the-art models.
arXiv Detail & Related papers (2021-01-19T18:37:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.