Hyperedge Prediction using Tensor Eigenvalue Decomposition
- URL: http://arxiv.org/abs/2102.04986v1
- Date: Sat, 6 Feb 2021 01:55:04 GMT
- Title: Hyperedge Prediction using Tensor Eigenvalue Decomposition
- Authors: Deepak Maurya, Balaraman Ravindran
- Abstract summary: Link prediction in graphs is studied by modeling the dyadic interactions between two nodes.
More complex, super-dyadic interactions can be modeled using a hypergraph, which is a generalization of a graph where a hyperedge can connect more than two nodes.
We utilize the tensor-based representation of hypergraphs and propose a novel interpretation of the tensor eigenvectors.
- Score: 21.673771194165276
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Link prediction in graphs is studied by modeling the dyadic interactions
between two nodes. However, relationships can be more complex than simple dyadic
interactions and may require modeling super-dyadic associations
among nodes. Such interactions can be modeled using a hypergraph, which is a
generalization of a graph where a hyperedge can connect more than two nodes.
In this work, we consider the problem of hyperedge prediction in a
$k$-uniform hypergraph. We utilize the tensor-based representation of
hypergraphs and propose a novel interpretation of the tensor eigenvectors. This
is further used to propose a hyperedge prediction algorithm. The proposed
algorithm utilizes the \textit{Fiedler} eigenvector computed using tensor
eigenvalue decomposition of hypergraph Laplacian. The \textit{Fiedler}
eigenvector is used to evaluate the construction cost of new hyperedges, which
is further utilized to determine the most probable hyperedges to be
constructed. The functioning and efficacy of the proposed method are
illustrated using some example hypergraphs and a few real datasets. The code
for the proposed method is available at
https://github.com/d-maurya/hypred_tensorEVD
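The abstract only sketches the pipeline, so below is a minimal, hedged illustration (not the authors' released code) of how a Fiedler-vector-based construction cost could be used to rank candidate hyperedges. It assumes a Cooper-Dutle-style Laplacian tensor, whose multilinear form decomposes per hyperedge as sum_{i in e} x_i^k - k * prod_{i in e} x_i; that per-hyperedge term is used as a stand-in for the paper's construction cost, and the Fiedler-like vector is taken from the clique-expansion graph Laplacian rather than from a tensor eigensolver. All function names are illustrative.

```python
# Minimal sketch (not the authors' code): rank candidate hyperedges of a
# k-uniform hypergraph by a Fiedler-vector-based construction cost.
import itertools
import numpy as np


def fiedler_vector_clique_expansion(n, hyperedges):
    """Second-smallest eigenvector of the clique-expansion graph Laplacian,
    used here as a cheap stand-in for the tensor Fiedler eigenvector."""
    W = np.zeros((n, n))
    for e in hyperedges:
        for i, j in itertools.combinations(e, 2):
            W[i, j] += 1.0
            W[j, i] += 1.0
    L = np.diag(W.sum(axis=1)) - W
    _, vecs = np.linalg.eigh(L)          # eigenvalues returned in ascending order
    return vecs[:, 1]


def construction_cost(x, e):
    """Contribution of hyperedge e to the Laplacian tensor form
    L x^k = sum_e ( sum_{i in e} x_i^k  -  k * prod_{i in e} x_i )."""
    xe = x[list(e)]
    k = len(e)
    return float(np.sum(xe ** k) - k * np.prod(xe))


def rank_candidates(x, candidates):
    """Sort candidate hyperedges by ascending cost; lower cost is read here
    as 'cheaper to construct', i.e. more probable (an assumption)."""
    return sorted(candidates, key=lambda e: construction_cost(x, e))


if __name__ == "__main__":
    hyperedges = [(0, 1, 2), (1, 2, 3), (3, 4, 5)]      # toy 3-uniform hypergraph
    x = fiedler_vector_clique_expansion(6, hyperedges)
    for e in rank_candidates(x, [(0, 1, 3), (2, 4, 5), (0, 4, 5), (1, 3, 4)]):
        print(e, round(construction_cost(x, e), 4))
```

A faithful implementation would compute the Fiedler eigenpair of the Laplacian tensor itself (for example with a higher-order power method) rather than falling back to the clique expansion used in this sketch.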
Related papers
- Enhancing Hyperedge Prediction with Context-Aware Self-Supervised Learning [64.46188414653204]
We propose a novel hyperedge prediction framework (CASH).
CASH employs (1) context-aware node aggregation to capture complex relations among nodes in each hyperedge (addressing C1) and (2) self-supervised contrastive learning in the context of hyperedge prediction to enhance hypergraph representations (addressing C2).
Experiments on six real-world hypergraphs reveal that CASH consistently outperforms all competing methods in terms of the accuracy in hyperedge prediction.
arXiv Detail & Related papers (2023-09-11T20:06:00Z) - Hypergraph Structure Inference From Data Under Smoothness Prior [46.568839316694515]
We propose a method to infer the probability for each potential hyperedge without labelled data as supervision.
We use this prior to derive the relation between the hypergraph structure and the node features via probabilistic modelling.
Experiments on both synthetic and real-world data demonstrate that our method can learn meaningful hypergraph structures from data more efficiently than existing hypergraph structure inference methods.
arXiv Detail & Related papers (2023-08-27T18:28:58Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Classification of Edge-dependent Labels of Nodes in Hypergraphs [17.454063924648896]
We introduce a classification of edge-dependent node labels as a new problem.
This problem can be used as a benchmark task for hypergraph neural networks.
We propose WHATsNet, a novel hypergraph neural network that represents the same node differently depending on the hyperedges it participates in.
arXiv Detail & Related papers (2023-06-05T16:50:34Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Core-periphery Models for Hypergraphs [0.0]
We introduce a random hypergraph model for core-periphery structure.
We develop a novel statistical inference algorithm that scales to large hypergraphs with practically linear runtime.
Our inference algorithm is capable of learning embeddings that correspond to the reputation (rank) of a node within the hypergraph.
arXiv Detail & Related papers (2022-06-01T22:11:44Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle hypergraphs with edge-dependent (EDVW) and edge-independent (EIVW) vertex weights.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z) - Love tHy Neighbour: Remeasuring Local Structural Node Similarity in Hypergraph-Derived Networks [2.246222223318928]
We propose a multitude of hypergraph-oriented similarity scores between node-pairs.
We provide theoretical formulations to extend graph-topology based scores to hypergraphs.
arXiv Detail & Related papers (2021-10-30T14:12:58Z) - Hypergraph Partitioning using Tensor Eigenvalue Decomposition [19.01626581411011]
We propose a novel approach for the partitioning of k-uniform hypergraphs.
Most of the existing methods work by reducing the hypergraph to a graph followed by applying standard graph partitioning algorithms.
We overcome this issue by utilizing the tensor-based representation of hypergraphs.
arXiv Detail & Related papers (2020-11-16T01:55:43Z)
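As a companion to the sketch after the abstract (again hedged, and not necessarily the partitioning paper's exact rule), the standard way to turn a Fiedler-like vector into a bipartition is to split the nodes by the sign of their components:

```python
import numpy as np


def bipartition_by_sign(x):
    """Split node indices into two parts by the sign of a Fiedler-like vector.
    A common spectral heuristic; assumed here, not taken from the paper."""
    x = np.asarray(x)
    return np.flatnonzero(x >= 0).tolist(), np.flatnonzero(x < 0).tolist()


# Example (reusing the toy hypergraph from the earlier sketch):
# part_a, part_b = bipartition_by_sign(fiedler_vector_clique_expansion(6, hyperedges))
```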
This list is automatically generated from the titles and abstracts of the papers on this site.