Classification of Edge-dependent Labels of Nodes in Hypergraphs
- URL: http://arxiv.org/abs/2306.03032v1
- Date: Mon, 5 Jun 2023 16:50:34 GMT
- Title: Classification of Edge-dependent Labels of Nodes in Hypergraphs
- Authors: Minyoung Choe, Sunwoo Kim, Jaemin Yoo, Kijung Shin
- Abstract summary: We introduce the classification of edge-dependent node labels as a new problem.
This problem can be used as a benchmark task for hypergraph neural networks.
We propose WHATsNet, a novel hypergraph neural network that represents the same node differently depending on the hyperedges it participates in.
- Score: 17.454063924648896
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: A hypergraph is a data structure composed of nodes and hyperedges, where each
hyperedge is an any-sized subset of nodes. Due to the flexibility in hyperedge
size, hypergraphs represent group interactions (e.g., co-authorship by more
than two authors) more naturally and accurately than ordinary graphs.
Interestingly, many real-world systems modeled as hypergraphs contain
edge-dependent node labels, i.e., node labels that vary depending on
hyperedges. For example, on co-authorship datasets, the same author (i.e., a
node) can be the primary author in a paper (i.e., a hyperedge) but the
corresponding author in another paper (i.e., another hyperedge).
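The co-authorship example above can be made concrete with a toy data structure (a hypothetical sketch for illustration, not code from the paper): because labels are edge-dependent, they must be keyed by the (hyperedge, node) pair rather than by the node alone.

```python
# Toy co-authorship hypergraph: hyperedges are papers, nodes are authors.
hyperedges = {
    "paper1": ["alice", "bob", "carol"],
    "paper2": ["alice", "dave"],
}

# Edge-dependent node labels: keyed by (hyperedge, node), not by node alone,
# so the same author can play a different role in each paper.
labels = {
    ("paper1", "alice"): "primary",
    ("paper1", "bob"): "secondary",
    ("paper1", "carol"): "corresponding",
    ("paper2", "alice"): "corresponding",  # same node, different label
    ("paper2", "dave"): "primary",
}

def label_of(edge, node):
    """Look up a node's label within a specific hyperedge."""
    return labels[(edge, node)]
```

Here `alice` is the primary author in `paper1` but the corresponding author in `paper2`, which is exactly the property that a node-level label cannot capture.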
In this work, we introduce the classification of edge-dependent node labels as
a new problem. This problem can serve as a benchmark task for hypergraph
neural networks, which have recently attracted great attention; moreover, the
usefulness of edge-dependent node labels has been verified in various
applications. To tackle this problem, we propose WHATsNet, a novel hypergraph
neural network that represents the same node differently depending on the
hyperedges it participates in by reflecting its varying importance in the
hyperedges. To this end, WHATsNet models the relations between nodes within
each hyperedge, using their relative centrality as positional encodings. In our
experiments, we demonstrate that WHATsNet significantly and consistently
outperforms ten competitors on six real-world hypergraphs, and we also show
successful applications of WHATsNet to (a) ranking aggregation, (b) node
clustering, and (c) product return prediction.
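The core idea of using relative centrality as a within-hyperedge positional encoding can be sketched as follows. This is a minimal illustration of the idea under simplifying assumptions (degree centrality as the centrality measure, a rank as the encoding), not the actual WHATsNet architecture:

```python
from collections import defaultdict

# Toy hypergraph: hyperedge -> member nodes.
hyperedges = {
    "e1": ["a", "b", "c"],
    "e2": ["a", "c"],
    "e3": ["a", "d"],
}

# Degree centrality: number of hyperedges each node participates in.
degree = defaultdict(int)
for members in hyperedges.values():
    for v in members:
        degree[v] += 1

def positional_encodings(edge):
    """Rank the members of one hyperedge by degree centrality
    (0 = most central). The rank acts as a relative positional
    encoding, so the same node can receive a different encoding
    in different hyperedges, enabling edge-dependent representations."""
    members = hyperedges[edge]
    ranked = sorted(members, key=lambda v: -degree[v])
    return {v: ranked.index(v) for v in members}
```

In this toy example, node `a` has the highest degree and ranks first in every hyperedge it joins, while `c` ranks second in `e2` but only after `a` in `e1`; a model consuming these encodings can thus represent a node differently per hyperedge.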
Related papers
- Hyperedge Modeling in Hypergraph Neural Networks by using Densest Overlapping Subgraphs [0.0]
One of the most important problems in graph clustering is to find densest overlapping subgraphs (DOS)
In this paper, we propose a solution to the DOS problem via the DOSAGE algorithm, a novel approach to enhance the process of generating the densest overlapping subgraphs.
Experiments on standard benchmarks show that the DOSAGE algorithm significantly outperforms the HGNNs and six other methods on the node classification task.
arXiv Detail & Related papers (2024-09-16T14:56:10Z) - Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT)
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z) - BOURNE: Bootstrapped Self-supervised Learning Framework for Unified
Graph Anomaly Detection [50.26074811655596]
We propose a novel unified graph anomaly detection framework based on bootstrapped self-supervised learning (named BOURNE)
By swapping the context embeddings between nodes and edges, we enable the mutual detection of node and edge anomalies.
BOURNE can eliminate the need for negative sampling, thereby enhancing its efficiency in handling large graphs.
arXiv Detail & Related papers (2023-07-28T00:44:57Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - A Hypergraph Neural Network Framework for Learning Hyperedge-Dependent
Node Embeddings [39.9678554461845]
We introduce a hypergraph representation learning framework called Hypergraph Neural Networks (HNN)
HNN jointly learns hyperedge embeddings along with a set of hyperedge-dependent embeddings for each node in the hypergraph.
We find that HNN achieves an overall mean gain of 7.72% and 11.37% across all baseline models and graphs for hyperedge prediction and hypergraph node classification.
arXiv Detail & Related papers (2022-12-28T19:45:38Z) - Semi-Supervised Hierarchical Graph Classification [54.25165160435073]
We study the node classification problem in the hierarchical graph where a 'node' is a graph instance.
We propose the Hierarchical Graph Mutual Information (HGMI) and present a way to compute HGMI with theoretical guarantee.
We demonstrate the effectiveness of this hierarchical graph modeling and the proposed SEAL-CI method on text and social network data.
arXiv Detail & Related papers (2022-06-11T04:05:29Z) - Hypergraph Convolutional Networks via Equivalency between Hypergraphs
and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs.
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z) - HEAT: Hyperedge Attention Networks [34.65832569321654]
HEAT is a neural model capable of representing typed and qualified hypergraphs.
It can be viewed as a generalization of both message passing neural networks and Transformers.
We evaluate it on knowledge base completion and on bug detection and repair using a novel hypergraph representation of programs.
arXiv Detail & Related papers (2022-01-28T13:42:01Z) - Love tHy Neighbour: Remeasuring Local Structural Node Similarity in
Hypergraph-Derived Networks [2.246222223318928]
We propose a multitude of hypergraph-oriented similarity scores between node-pairs.
We provide theoretical formulations to extend graph-topology based scores to hypergraphs.
arXiv Detail & Related papers (2021-10-30T14:12:58Z) - Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
arXiv Detail & Related papers (2021-09-22T12:24:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality or accuracy of the information provided and is not responsible for any consequences of its use.