Hypergraph Dissimilarity Measures
- URL: http://arxiv.org/abs/2106.08206v1
- Date: Tue, 15 Jun 2021 15:10:24 GMT
- Title: Hypergraph Dissimilarity Measures
- Authors: Amit Surana, Can Chen and Indika Rajapakse
- Abstract summary: We present measures that assess hypergraph dissimilarity at a specific scale or provide a more holistic multi-scale comparison.
We test these measures on synthetic hypergraphs and apply them to biological datasets.
- Score: 8.890638003061605
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose two novel approaches for hypergraph comparison. The
first approach transforms the hypergraph into a graph representation for use of
standard graph dissimilarity measures. The second approach exploits the
mathematics of tensors to intrinsically capture multi-way relations. For each
approach, we present measures that assess hypergraph dissimilarity at a
specific scale or provide a more holistic multi-scale comparison. We test these
measures on synthetic hypergraphs and apply them to biological datasets.
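The first approach can be sketched in a few lines. The snippet below is a minimal illustration, assuming a weighted clique expansion and a Frobenius-norm comparison as the graph dissimilarity measure; these are simple, common choices, not necessarily the exact measures used in the paper. It also shows why a graph-level comparison can be lossy, which motivates the tensor-based approach.

```python
from itertools import combinations

def clique_expand(hyperedges, n):
    """Weighted clique expansion: each hyperedge {v1, ..., vk}
    contributes a unit-weight clique on its vertices."""
    A = [[0.0] * n for _ in range(n)]
    for e in hyperedges:
        for u, v in combinations(sorted(e), 2):
            A[u][v] += 1.0
            A[v][u] += 1.0
    return A

def dissimilarity(H1, H2, n):
    """Frobenius-norm distance between the clique expansions."""
    A1, A2 = clique_expand(H1, n), clique_expand(H2, n)
    return sum((A1[i][j] - A2[i][j]) ** 2
               for i in range(n) for j in range(n)) ** 0.5

# A single 3-way hyperedge and three pairwise edges forming a triangle
# have identical clique expansions, so this graph-level measure cannot
# distinguish them -- a known limitation of reduction-based comparison.
H_tri = [{0, 1, 2}]
H_pair = [{0, 1}, {1, 2}, {0, 2}]
print(dissimilarity(H_tri, H_pair, 3))  # 0.0
```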
Related papers
- A classification model based on a population of hypergraphs [0.0]
This paper introduces a novel hypergraph classification algorithm.
Hypergraphs capture multi-way interactions of any order.
The algorithm is evaluated on two datasets.
arXiv Detail & Related papers (2024-05-23T21:21:59Z) - Hypergraph Structure Inference From Data Under Smoothness Prior [46.568839316694515]
We propose a method to infer the probability for each potential hyperedge without labelled data as supervision.
We use this prior to derive the relation between the hypergraph structure and the node features via probabilistic modelling.
Experiments on both synthetic and real-world data demonstrate that our method can learn meaningful hypergraph structures from data more efficiently than existing hypergraph structure inference methods.
arXiv Detail & Related papers (2023-08-27T18:28:58Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative [126.0985540285981]
We apply contrastive learning, as developed for images and graphs, to improve the generalizability of hypergraph neural networks (we refer to this approach as HyperGCL).
We fabricate two schemes to augment hyperedges with higher-order relations encoded, and adopt three augmentation strategies from graph-structured data.
We propose a hypergraph generative model to generate augmented views, and then an end-to-end differentiable pipeline to jointly learn hypergraph augmentations and model parameters.
arXiv Detail & Related papers (2022-10-07T20:12:20Z) - Collaborative likelihood-ratio estimation over graphs [55.98760097296213]
We develop this idea in a concrete non-parametric method that we call Graph-based Relative Unconstrained Least-squares Importance Fitting (GRULSIF)
We derive convergence rates for our collaborative approach that highlights the role played by variables such as the number of available observations per node, the size of the graph, and how accurately the graph structure encodes the similarity between tasks.
arXiv Detail & Related papers (2022-05-28T15:37:03Z) - Message Passing Neural Networks for Hypergraphs [6.999112784624749]
We present the first graph neural network based on message passing capable of processing hypergraph-structured data.
We show that the proposed model defines a design space for neural network models for hypergraphs, thus generalizing existing models for hypergraphs.
arXiv Detail & Related papers (2022-03-31T12:38:22Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle both EDVW (edge-dependent vertex weight) and EIVW (edge-independent vertex weight) hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - Love tHy Neighbour: Remeasuring Local Structural Node Similarity in Hypergraph-Derived Networks [2.246222223318928]
We propose a multitude of hypergraph-oriented similarity scores between node-pairs.
We provide theoretical formulations to extend graph-topology based scores to hypergraphs.
arXiv Detail & Related papers (2021-10-30T14:12:58Z) - Hypergraph Partitioning using Tensor Eigenvalue Decomposition [19.01626581411011]
We propose a novel approach for the partitioning of k-uniform hypergraphs.
Most existing methods work by reducing the hypergraph to a graph and then applying standard graph partitioning algorithms, which discards higher-order structure.
We overcome this issue by utilizing the tensor-based representation of hypergraphs.
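As a hedged illustration of the tensor-based representation (one common convention for k-uniform hypergraphs, not necessarily the exact one used in the paper): an order-k symmetric adjacency tensor assigns 1/(k-1)! to every permutation of each hyperedge's vertex tuple, so that summing a slice recovers the vertex degree.

```python
from itertools import permutations
from math import factorial

def adjacency_tensor(hyperedges, k):
    """Sparse symmetric adjacency tensor of a k-uniform hypergraph.
    Each hyperedge contributes weight 1/(k-1)! at every permutation
    of its vertex tuple (a common normalization convention)."""
    w = 1.0 / factorial(k - 1)
    return {p: w for e in hyperedges for p in permutations(e)}

def tensor_degree(T, v):
    """Summing the slice T[v, :, ..., :] recovers the degree of v."""
    return sum(val for idx, val in T.items() if idx[0] == v)

# Two 3-uniform hyperedges; vertex 0 lies in both, vertex 3 in one.
T = adjacency_tensor([(0, 1, 2), (0, 1, 3)], 3)
```

With this normalization, `tensor_degree(T, 0)` returns 2.0 and `tensor_degree(T, 3)` returns 1.0, matching the combinatorial degrees.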
arXiv Detail & Related papers (2020-11-16T01:55:43Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In the conventional formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
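The line-graph reduction described above can be sketched directly (a minimal illustration, not the paper's implementation):

```python
def line_graph(edges):
    """Build the line graph: one node per edge of G; two line-graph
    nodes are adjacent iff the original edges share an endpoint."""
    nodes = [frozenset(e) for e in edges]
    adj = {e: set() for e in nodes}
    for i, e in enumerate(nodes):
        for f in nodes[i + 1:]:
            if e & f:  # original edges share a vertex
                adj[e].add(f)
                adj[f].add(e)
    return adj

# The line graph of a triangle is again a triangle: every edge of the
# original graph touches the other two.
L = line_graph([(0, 1), (1, 2), (0, 2)])
```

Under this view, predicting a candidate link (u, v) amounts to classifying the node `frozenset({u, v})` in the line graph rather than classifying an entire subgraph.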
arXiv Detail & Related papers (2020-10-20T05:54:31Z) - HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs.
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
arXiv Detail & Related papers (2020-10-09T13:28:06Z)
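A minimal sketch of the two-level idea with scalar features (mean aggregation at both levels is an assumption chosen for illustration; HyperSAGE supports more general aggregation functions):

```python
def two_level_pass(features, hyperedges):
    """One round of two-level hypergraph message passing:
    level 1: each hyperedge aggregates (here, averages) its members;
    level 2: each node averages its incident hyperedges' messages."""
    edge_msg = [sum(features[v] for v in e) / len(e) for e in hyperedges]
    updated = {}
    for v in features:
        incident = [edge_msg[i] for i, e in enumerate(hyperedges) if v in e]
        updated[v] = sum(incident) / len(incident) if incident else features[v]
    return updated

# Four nodes, two hyperedges; node 2 belongs to both hyperedges.
x = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
out = two_level_pass(x, [{0, 1, 2}, {2, 3}])
```

Here the hyperedge messages are 2.0 and 3.5, so node 2 (incident to both) is updated to 2.75 while node 3 receives 3.5.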
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.