Expressive Higher-Order Link Prediction through Hypergraph Symmetry Breaking
- URL: http://arxiv.org/abs/2402.11339v1
- Date: Sat, 17 Feb 2024 17:13:41 GMT
- Title: Expressive Higher-Order Link Prediction through Hypergraph Symmetry Breaking
- Authors: Simon Zhang, Cheng Xin, Tamal K. Dey
- Abstract summary: Higher-order link prediction is the task of predicting the existence of a missing hyperedge in a hypergraph.
A hyperedge representation learned for higher order link prediction is fully expressive when it does not lose distinguishing power up to an isomorphism.
We devise a preprocessing algorithm that can identify certain regular subhypergraphs exhibiting symmetry.
- Score: 0.25322020135765466
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A hypergraph consists of a set of nodes along with a collection of subsets of
the nodes called hyperedges. Higher-order link prediction is the task of
predicting the existence of a missing hyperedge in a hypergraph. A hyperedge
representation learned for higher order link prediction is fully expressive
when it does not lose distinguishing power up to an isomorphism. Many existing
hypergraph representation learners are bounded in expressive power by the
Generalized Weisfeiler Lehman-1 (GWL-1) algorithm, a generalization of the
Weisfeiler Lehman-1 algorithm. However, GWL-1 has limited expressive power. In
fact, induced subhypergraphs with identical GWL-1 valued nodes are
indistinguishable. Furthermore, message passing on hypergraphs can already be
computationally expensive, especially in terms of GPU memory. To address these
limitations, we devise a preprocessing algorithm that can identify certain
regular subhypergraphs exhibiting symmetry. Our preprocessing algorithm runs
once, with complexity linear in the size of the input hypergraph. During training, we
randomly replace subhypergraphs identified by the algorithm with covering
hyperedges to break symmetry. We show that our method improves the expressivity
of GWL-1. Our extensive experiments also demonstrate the effectiveness of our
approach for higher-order link prediction on both graph and hypergraph datasets
with negligible change in computation.
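To make the limitation and the fix concrete, below is a minimal, illustrative sketch (not the paper's exact algorithms): a WL-1-style color refinement generalized to hypergraphs, together with the symmetry-breaking idea of replacing a detected symmetric subhypergraph with a single covering hyperedge. The function names and the toy 3-cycle example are our own assumptions.

```python
# Illustrative sketch only; `gwl1_style_colors` and `break_symmetry` are our
# names, not the paper's algorithms.
from collections import Counter

def gwl1_style_colors(num_nodes, hyperedges, rounds=3):
    """Refine node colors from the multiset of incident hyperedge colors."""
    color = {v: 0 for v in range(num_nodes)}
    for _ in range(rounds):
        edge_color = {
            e: hash(tuple(sorted(Counter(color[v] for v in e).items())))
            for e in hyperedges
        }
        color = {
            v: hash((color[v], tuple(sorted(edge_color[e] for e in hyperedges if v in e))))
            for v in range(num_nodes)
        }
    return color

def break_symmetry(hyperedges, symmetric_block):
    """Replace the hyperedges of a detected symmetric subhypergraph with one
    covering hyperedge over its nodes (training would do such replacements
    randomly)."""
    block = set(symmetric_block)
    kept = [e for e in hyperedges if not e.issubset(block)]
    return kept + [frozenset(block)]

# All nodes of this 3-cycle receive identical colors, so a GWL-1-bounded model
# cannot tell its potential hyperedges apart; covering it breaks the symmetry.
cycle = [frozenset({0, 1}), frozenset({1, 2}), frozenset({2, 0})]
print(gwl1_style_colors(3, cycle))
print(break_symmetry(cycle, {0, 1, 2}))
```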
Related papers
- Enhancing Hyperedge Prediction with Context-Aware Self-Supervised Learning [64.46188414653204]
We propose a novel hyperedge prediction framework (CASH).
CASH employs (1) context-aware node aggregation to capture complex relations among nodes in each hyperedge for (C1) and (2) self-supervised contrastive learning in the context of hyperedge prediction to enhance hypergraph representations for (C2).
Experiments on six real-world hypergraphs reveal that CASH consistently outperforms all competing methods in terms of the accuracy in hyperedge prediction.
arXiv Detail & Related papers (2023-09-11T20:06:00Z)
- Hypergraph Structure Inference From Data Under Smoothness Prior [46.568839316694515]
We propose a method to infer the probability for each potential hyperedge without labelled data as supervision.
We use this prior to derive the relation between the hypergraph structure and the node features via probabilistic modelling.
Experiments on both synthetic and real-world data demonstrate that our method can learn meaningful hypergraph structures from data more efficiently than existing hypergraph structure inference methods.
arXiv Detail & Related papers (2023-08-27T18:28:58Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
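As an illustration of the energy-minimization view summarized above, here is a toy, unparameterized hypergraph-regularized energy whose gradient-descent minimizer can be read off as node embeddings; this is a sketch under our own assumptions, not the parameterized family or bilevel scheme from the paper.

```python
# Toy energy: data fidelity plus a penalty on feature disagreement within each
# hyperedge. Not the paper's energy family or solver.
import numpy as np

def energy(Y, X, hyperedges, lam=1.0):
    """E(Y) = ||Y - X||^2 + lam * sum_e sum_{v in e} ||Y[v] - mean_e(Y)||^2."""
    value = np.sum((Y - X) ** 2)
    for e in hyperedges:
        idx = list(e)
        value += lam * np.sum((Y[idx] - Y[idx].mean(axis=0)) ** 2)
    return value

def embed_by_energy_min(X, hyperedges, lam=1.0, lr=0.05, steps=500):
    """Plain gradient descent; the gradient of this quadratic energy has the
    closed form 2*(Y - X) + 2*lam*(Y[v] - mean of Y over v's hyperedge)."""
    Y = X.copy()
    for _ in range(steps):
        grad = 2.0 * (Y - X)
        for e in hyperedges:
            idx = list(e)
            grad[idx] += 2.0 * lam * (Y[idx] - Y[idx].mean(axis=0))
        Y -= lr * grad
    return Y

X = np.random.default_rng(0).normal(size=(5, 2))      # 5 nodes, 2-dim features
edges = [{0, 1, 2}, {2, 3, 4}]
Y = embed_by_energy_min(X, edges)
print(round(energy(X, X, edges), 3), "->", round(energy(Y, X, edges), 3))
```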
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
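To make the outer-product message-passing idea concrete, the following is a rough numpy sketch for a 3-uniform hypergraph; it is not the THNN architecture itself, and the function name and weight shapes are our own assumptions.

```python
# Rough sketch of outer-product feature message passing on a 3-uniform
# hypergraph; an illustration, not THNN.
import numpy as np

def outer_product_layer(x, hyperedges, W):
    """For each node i in a hyperedge {i, j, k}, the message is the flattened
    outer product of the other two members' features; messages are summed,
    linearly projected, and passed through a nonlinearity."""
    n, d = x.shape
    out = np.zeros((n, W.shape[1]))
    for (a, b, c) in hyperedges:                       # hyperedges as triples
        for i, j, k in ((a, b, c), (b, a, c), (c, a, b)):
            out[i] += np.outer(x[j], x[k]).reshape(-1) @ W   # d*d -> out_dim
    return np.tanh(out)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                            # 4 nodes, 3-dim features
W = rng.normal(size=(9, 3))                            # projects d*d = 9 to 3
print(outer_product_layer(x, [(0, 1, 2), (1, 2, 3)], W).shape)   # (4, 3)
```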
- Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply the contrastive learning approach from images/graphs (we refer to it as HyperGCL) to improve generalizability of hypergraph neural networks.
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
- HyperSF: Spectral Hypergraph Coarsening via Flow-based Local Clustering [9.438207505148947]
We propose an efficient spectral hypergraph coarsening scheme (HyperSF) for preserving the original spectral (structural) properties of hypergraphs.
Our results show that the proposed hypergraph coarsening algorithm can significantly improve the multi-way conductance of hypergraph clustering.
arXiv Detail & Related papers (2021-08-17T22:20:23Z)
- Hyperedge Prediction using Tensor Eigenvalue Decomposition [21.673771194165276]
Link prediction in graphs is studied by modeling the dyadic interactions between two nodes.
Such interactions can be modeled using a hypergraph, which is a generalization of a graph where a hyperedge can connect more than two nodes.
We utilize the tensor-based representation of hypergraphs and propose a novel interpretation of the tensor eigenvectors.
arXiv Detail & Related papers (2021-02-06T01:55:04Z)
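The tensor view can be illustrated with a small sketch, under our own assumptions: build the order-3 adjacency tensor of a 3-uniform hypergraph and run a crude higher-order power iteration toward a dominant Z-eigenvector. The paper's actual decomposition, eigenvector interpretation, and scoring of candidate hyperedges are not reproduced here.

```python
# Adjacency tensor of a 3-uniform hypergraph plus a crude higher-order power
# iteration; an illustrative sketch, not the paper's method.
import itertools
import numpy as np

def adjacency_tensor(num_nodes, hyperedges):
    """T[i, j, k] = 1 for every permutation of each 3-node hyperedge."""
    T = np.zeros((num_nodes,) * 3)
    for e in hyperedges:
        for i, j, k in itertools.permutations(e):
            T[i, j, k] = 1.0
    return T

def dominant_z_eigenvector(T, iters=200):
    """Power iteration x <- normalize(T x x); convergence is not guaranteed in
    general, but it behaves well on this small nonnegative example."""
    x = np.ones(T.shape[0]) / np.sqrt(T.shape[0])
    for _ in range(iters):
        y = np.einsum('ijk,j,k->i', T, x, x)
        x = y / np.linalg.norm(y)
    return x

T = adjacency_tensor(4, [(0, 1, 2), (1, 2, 3), (0, 2, 3)])
print(dominant_z_eigenvector(T).round(3))
```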
- Hypergraph Partitioning using Tensor Eigenvalue Decomposition [19.01626581411011]
We propose a novel approach for the partitioning of k-uniform hypergraphs.
Most of the existing methods work by reducing the hypergraph to a graph followed by applying standard graph partitioning algorithms.
We overcome this issue by utilizing the tensor-based representation of hypergraphs.
arXiv Detail & Related papers (2020-11-16T01:55:43Z)
- HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs.
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
arXiv Detail & Related papers (2020-10-09T13:28:06Z)
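A minimal sketch of the two-level pattern (node-to-hyperedge, then hyperedge-to-node aggregation) is given below; the aggregation choices and names are our own simplifications, not HyperSAGE's exact update rules.

```python
# Two-level message passing: hyperedges aggregate their members, then nodes
# aggregate their incident hyperedges. A simplified sketch, not HyperSAGE.
import numpy as np

def two_level_message_pass(x, hyperedges, W):
    """Level 1: each hyperedge takes the mean of its members' features.
    Level 2: each node takes the mean of its incident hyperedges' features,
    then applies a shared linear map and a nonlinearity."""
    n, d = x.shape
    edge_feat = np.stack([x[list(e)].mean(axis=0) for e in hyperedges])
    out = np.zeros((n, d))
    count = np.zeros(n)
    for ei, e in enumerate(hyperedges):
        for v in e:
            out[v] += edge_feat[ei]
            count[v] += 1
    out /= np.maximum(count, 1)[:, None]      # mean over incident hyperedges
    return np.tanh(out @ W)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 4))
W = rng.normal(size=(4, 4))
print(two_level_message_pass(x, [{0, 1, 2}, {2, 3, 4}], W).shape)  # (5, 4)
```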
- Semi-supervised Hypergraph Node Classification on Hypergraph Line Expansion [7.933465724913661]
We propose a new hypergraph formulation named the line expansion (LE) for hypergraph learning.
The proposed line expansion makes existing graph learning algorithms compatible with the higher-order structure.
We evaluate the proposed line expansion on five hypergraph datasets; the results show that our method beats SOTA baselines by a significant margin.
arXiv Detail & Related papers (2020-05-11T03:02:21Z)
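As a rough illustration of how a line-expansion-style reformulation lets ordinary graph machinery operate on a hypergraph, here is a sketch assuming the incidence-pair construction (each vertex-hyperedge pair becomes a node, and two pairs are adjacent when they share the vertex or the hyperedge); consult the paper for the exact LE definition and how results are projected back to the original vertices.

```python
# Incidence-pair construction of a line-expansion-style graph; a sketch under
# our own assumptions, not necessarily the paper's exact LE.
import itertools

def line_expansion(hyperedges):
    pairs = [(v, ei) for ei, e in enumerate(hyperedges) for v in sorted(e)]
    edges = set()
    for p, q in itertools.combinations(range(len(pairs)), 2):
        (v1, e1), (v2, e2) = pairs[p], pairs[q]
        if v1 == v2 or e1 == e2:
            edges.add((p, q))
    return pairs, sorted(edges)

pairs, le_edges = line_expansion([{0, 1, 2}, {1, 2, 3}])
print(len(pairs), "line nodes,", len(le_edges), "line edges")   # 6 and 8
```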