HyperSMOTE: A Hypergraph-based Oversampling Approach for Imbalanced Node Classifications
- URL: http://arxiv.org/abs/2409.05402v1
- Date: Mon, 9 Sep 2024 08:01:28 GMT
- Title: HyperSMOTE: A Hypergraph-based Oversampling Approach for Imbalanced Node Classifications
- Authors: Ziming Zhao, Tiehua Zhang, Zijian Yi, Zhishu Shen
- Abstract summary: We propose HyperSMOTE as a solution to alleviate the class imbalance issue in hypergraph learning.
We synthesize new nodes based on samples from minority classes and their neighbors.
To integrate the new nodes into the hypergraph, we train a decoder based on the original hypergraph incidence matrix.
- Score: 2.172034690069702
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hypergraphs are increasingly utilized in both unimodal and multimodal data scenarios due to their superior ability to model and extract higher-order relationships among nodes compared to traditional graphs. However, current hypergraph models encounter challenges with imbalanced data, as this imbalance can bias the model towards the more prevalent classes. While existing techniques such as GraphSMOTE have improved classification accuracy for minority samples in graph data, they still fall short when addressing the unique structure of hypergraphs. Inspired by the SMOTE concept, we propose HyperSMOTE as a solution to alleviate the class imbalance issue in hypergraph learning. This method involves a two-step process: initially synthesizing minority class nodes, followed by integrating these nodes into the original hypergraph. We synthesize new nodes based on samples from minority classes and their neighbors. At the same time, to integrate the new nodes into the hypergraph, we train a decoder based on the original hypergraph incidence matrix to adaptively associate the augmented nodes with hyperedges. We conduct extensive evaluations on multiple single-modality datasets, such as Cora, Cora-CA and Citeseer, as well as the multimodal conversation dataset MELD, to verify the effectiveness of HyperSMOTE, showing average accuracy gains of 3.38% and 2.97%, respectively.
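As a rough illustration of the two-step process described in the abstract, the sketch below interpolates a minority-class node with a hyperedge neighbor (SMOTE-style) and uses a small decoder trained against the incidence matrix to attach synthetic nodes to hyperedges. This is a minimal sketch assuming a PyTorch setup; the helper names `synthesize_minority_nodes` and `IncidenceDecoder`, the thresholds, and the training loop are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of HyperSMOTE's two steps (not the authors' code).
# Assumes node embeddings Z: (N, d), labels y: (N,), and a 0/1 float
# incidence matrix H: (N, M) with H[i, e] = 1 if node i is in hyperedge e.
import torch
import torch.nn as nn

def synthesize_minority_nodes(Z, y, minority_class, H, n_new):
    """SMOTE-style interpolation between a minority-class node and a neighbor
    that shares at least one hyperedge with it (hypothetical helper)."""
    idx = torch.where(y == minority_class)[0]
    new_nodes = []
    for _ in range(n_new):
        i = idx[torch.randint(len(idx), (1,))].item()
        neigh = torch.where(H @ H[i] > 0)[0]        # nodes co-occurring with i in a hyperedge
        neigh = neigh[neigh != i]
        j = neigh[torch.randint(len(neigh), (1,))].item() if len(neigh) > 0 else i
        lam = torch.rand(1)                          # random interpolation weight
        new_nodes.append(Z[i] + lam * (Z[j] - Z[i]))
    return torch.stack(new_nodes)

class IncidenceDecoder(nn.Module):
    """Maps a node embedding to per-hyperedge membership probabilities;
    trained with binary cross-entropy against rows of the original H."""
    def __init__(self, d, n_hyperedges):
        super().__init__()
        self.lin = nn.Linear(d, n_hyperedges)

    def forward(self, z):
        return torch.sigmoid(self.lin(z))

# Usage sketch:
# Z_new = synthesize_minority_nodes(Z, y, minority_class=2, H=H, n_new=32)
# decoder = IncidenceDecoder(Z.size(1), H.size(1))   # fit with BCE on (Z, H)
# H_new_rows = (decoder(Z_new) > 0.5).float()        # attach synthetic nodes to hyperedges
```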
Related papers
- SPHINX: Structural Prediction using Hypergraph Inference Network [19.853413818941608]
We introduce Structural Prediction using Hypergraph Inference Network (SPHINX), a model that learns to infer a latent hypergraph structure in an unsupervised way.
We show that recent advances in k-subset sampling provide a suitable tool for producing discrete hypergraph structures (a generic sketch of such sampling follows this entry).
The resulting model can generate the higher-order structure necessary for any modern hypergraph neural network.
arXiv Detail & Related papers (2024-10-04T07:49:57Z)
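As referenced in the SPHINX entry above, k-subset sampling is one way to produce discrete hyperedge memberships while keeping training differentiable. The sketch below uses a Gumbel top-k perturbation with a straight-through estimator; this is a generic illustration of the idea under that assumption, not the sampler used in SPHINX.

```python
# Generic Gumbel top-k sketch for sampling a k-node hyperedge from per-node
# scores, with a straight-through estimator so gradients reach the scores.
import torch

def gumbel_topk_hyperedge(logits, k, tau=1.0):
    """logits: (N,) unnormalized per-node scores. Returns an (N,) membership
    vector that is exactly k-hot in the forward pass, while gradients flow
    through a (crude) softmax relaxation."""
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-9) + 1e-9)
    perturbed = (logits + gumbel) / tau
    soft = torch.softmax(perturbed, dim=-1)               # relaxed membership
    hard = torch.zeros_like(soft)
    hard.scatter_(-1, torch.topk(perturbed, k).indices, 1.0)
    return hard + (soft - soft.detach())                  # straight-through trick

# Usage: membership = gumbel_topk_hyperedge(node_scores, k=4);
# stacking such columns yields a candidate incidence matrix.
```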
- Generation is better than Modification: Combating High Class Homophily Variance in Graph Anomaly Detection [51.11833609431406]
In graph anomaly detection, homophily distribution differences between different classes are significantly greater than those in homophilic and heterophilic graphs.
We introduce a new metric called Class Homophily Variance, which quantitatively describes this phenomenon (one plausible formulation is sketched after this entry).
To mitigate its impact, we propose a novel GNN model named Homophily Edge Generation Graph Neural Network (HedGe).
arXiv Detail & Related papers (2024-03-15T14:26:53Z)
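The HedGe entry above introduces Class Homophily Variance but does not spell out its definition here. Below is one plausible formulation, assumed purely for illustration: compute an edge-homophily ratio per class and take the variance across classes. The function name and exact ratio are assumptions, not the paper's definition.

```python
# Hypothetical "class homophily variance" style metric: per-class edge
# homophily, then variance across classes. Not the paper's exact definition.
import numpy as np

def class_homophily_variance(edges, labels):
    """edges: iterable of (u, v) node-index pairs; labels: array of class labels."""
    labels = np.asarray(labels)
    per_class = []
    for c in np.unique(labels):
        same, total = 0, 0
        for u, v in edges:
            if labels[u] == c or labels[v] == c:       # edge touches class c
                total += 1
                same += int(labels[u] == labels[v] == c)
        per_class.append(same / total if total else 0.0)
    return float(np.var(per_class))

# Example: edges = [(0, 1), (1, 2), (2, 3)], labels = [0, 0, 1, 0]
# gives per-class homophily [1/3, 0.0], whose variance is the returned score.
```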
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions (a generic form of such an energy is given after this entry).
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
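The energy-function entry above does not reproduce a formula. A generic parameterized, hypergraph-regularized energy of the kind such methods minimize, assumed here only for illustration (the paper's exact family may differ), is:

```latex
% Generic hypergraph-regularized energy (illustrative form). f_\theta is a
% parameterized base predictor; g_\phi is a penalty coupling the embeddings
% of nodes that share a hyperedge.
E(Y;\theta,\phi) = \sum_{v \in V} \bigl\| y_v - f_\theta(x_v) \bigr\|^2
  + \lambda \sum_{e \in E} g_\phi\bigl(\{\, y_v : v \in e \,\}\bigr),
\qquad Y^{*} = \operatorname*{arg\,min}_{Y} E(Y;\theta,\phi)
```

Treating the minimizer Y* as the node embeddings, and learning the parameters by backpropagating through the inner minimization, gives the bilevel structure the entry refers to.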
- Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z)
- Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operator.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z)
- Hypergraph Convolutional Networks via Equivalency between Hypergraphs and Undirected Graphs [59.71134113268709]
We present General Hypergraph Spectral Convolution (GHSC), a general learning framework that can handle EDVW and EIVW hypergraphs (the classical hypergraph Laplacian underlying such spectral methods is recalled after this entry).
Experiments from various domains, including social network analysis, visual object classification, and protein learning, demonstrate that the proposed framework achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-31T10:46:47Z)
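Spectral hypergraph convolutions of the kind the GHSC entry describes are commonly built on the classical normalized hypergraph Laplacian of Zhou et al.; it is recalled below for context only and is not claimed to be GHSC's exact operator.

```latex
% Classical normalized hypergraph Laplacian (Zhou et al., 2006). H is the
% |V| x |E| incidence matrix, W the diagonal hyperedge-weight matrix, and
% D_v, D_e the vertex- and hyperedge-degree matrices.
\Delta = I - D_v^{-1/2}\, H\, W\, D_e^{-1}\, H^{\top}\, D_v^{-1/2}
```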
- Adaptive Neural Message Passing for Inductive Learning on Hypergraphs [21.606287447052757]
We present HyperMSG, a novel hypergraph learning framework.
It adapts to the data and task by learning an attention weight associated with each node's degree centrality.
It is robust and outperforms state-of-the-art hypergraph learning methods on a wide range of tasks and datasets.
arXiv Detail & Related papers (2021-09-22T12:24:02Z)
- Residual Enhanced Multi-Hypergraph Neural Network [26.42547421121713]
HyperGraph Neural Network (HGNN) is the de-facto method for hypergraph representation learning.
We propose the Residual enhanced Multi-Hypergraph Neural Network, which can fuse multi-modal information from each hypergraph effectively.
arXiv Detail & Related papers (2021-05-02T14:53:32Z)
- Generative hypergraph clustering: from blockmodels to modularity [26.99290024958576]
We propose an expressive generative model of clustered hypergraphs with heterogeneous node degrees and edge sizes.
We show that hypergraph Louvain is highly scalable, demonstrated for example by an experiment on a synthetic hypergraph of one million nodes.
We use our model to analyze different patterns of higher-order structure in school contact networks, U.S. congressional bill cosponsorship, U.S. congressional committees, product categories in co-purchasing behavior, and hotel locations.
arXiv Detail & Related papers (2021-01-24T00:25:22Z)
- HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs (a generic sketch of two-level aggregation follows this entry).
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
arXiv Detail & Related papers (2020-10-09T13:28:06Z)
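The HyperSAGE entry above mentions a two-level message passing strategy. The sketch below shows a generic two-level scheme, intra-hyperedge aggregation followed by inter-hyperedge aggregation, using simple means at both levels; the class name, the mean aggregators, and the update rule are assumptions for illustration, not the paper's exact formulation.

```python
# Generic two-level hypergraph message passing (illustrative, not the exact
# HyperSAGE rule): aggregate node features within each hyperedge, then
# aggregate hyperedge messages at each node and update its embedding.
import torch
import torch.nn as nn

class TwoLevelHypergraphLayer(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.lin = nn.Linear(2 * d_in, d_out)

    def forward(self, X, H):
        """X: (N, d_in) node features; H: (N, M) 0/1 float incidence matrix."""
        De = H.sum(dim=0).clamp(min=1)                 # hyperedge sizes
        Dv = H.sum(dim=1).clamp(min=1)                 # node degrees
        edge_msg = (H.T @ X) / De.unsqueeze(1)         # level 1: intra-edge mean
        node_msg = (H @ edge_msg) / Dv.unsqueeze(1)    # level 2: inter-edge mean
        return torch.relu(self.lin(torch.cat([X, node_msg], dim=1)))

# Usage: layer = TwoLevelHypergraphLayer(d_in=16, d_out=32); out = layer(X, H)
```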
This list is automatically generated from the titles and abstracts of the papers in this site.