SHINE: SubHypergraph Inductive Neural nEtwork
- URL: http://arxiv.org/abs/2210.07309v1
- Date: Thu, 13 Oct 2022 19:26:09 GMT
- Title: SHINE: SubHypergraph Inductive Neural nEtwork
- Authors: Yuan Luo
- Abstract summary: Hypergraph neural networks can model multi-way connections among nodes of the graphs.
There is an unmet need in learning powerful representations of subgraphs in real-world applications.
We propose SubHypergraph Inductive Neural nEtwork (SHINE) for accurate inductive subgraph prediction.
- Score: 5.9952530228468754
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hypergraph neural networks can model multi-way connections among nodes of the
graphs, which are common in real-world applications such as genetic medicine.
In particular, genetic pathways or gene sets encode molecular functions driven
by multiple genes, naturally represented as hyperedges. Thus, hypergraph-guided
embedding can capture functional relations in learned representations. Existing
hypergraph neural network models often focus on node-level or graph-level
inference. There is an unmet need in learning powerful representations of
subgraphs of hypergraphs in real-world applications. For example, a cancer
patient can be viewed as a subgraph of genes harboring mutations in the
patient, while all the genes are connected by hyperedges that correspond to
pathways representing specific molecular functions. For accurate inductive
subgraph prediction, we propose SubHypergraph Inductive Neural nEtwork (SHINE).
SHINE uses informative genetic pathways that encode molecular functions as
hyperedges to connect genes as nodes. SHINE jointly optimizes the objectives of
end-to-end subgraph classification and hypergraph nodes' similarity
regularization. SHINE simultaneously learns representations for both genes and
pathways using strongly dual attention message passing. The learned
representations are aggregated via a subgraph attention layer and used to train
a multilayer perceptron for inductive subgraph inference. We evaluated SHINE
against a wide array of state-of-the-art (hyper)graph neural networks, XGBoost,
NMF and polygenic risk score models, using large scale NGS and curated
datasets. SHINE outperformed all comparison models significantly, and yielded
interpretable disease models with functional insights.
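The abstract outlines SHINE's pipeline: genes as nodes, pathways as hyperedges, dual attention message passing between the two, attention pooling over a patient's mutated-gene subgraph, and an MLP classifier. The following is a minimal numpy sketch of that pipeline, not the authors' implementation: the toy incidence matrix, embedding sizes, dot-product attention scores, the single message-passing round, and the query vector `q` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Toy hypergraph: 6 genes (nodes), 3 pathways (hyperedges).
# H[i, j] = 1 if gene i belongs to pathway j.
H = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 0],
    [0, 1, 1],
    [0, 0, 1],
    [1, 0, 1],
], dtype=float)

d = 8                                 # embedding dimension (assumed)
X = rng.normal(size=(6, d))           # gene embeddings
E = rng.normal(size=(3, d))           # pathway embeddings

def dual_attention_step(X, E, H):
    """One round of node<->hyperedge message passing with attention.
    Scores are dot products, masked to the incidence structure."""
    scores = X @ E.T                          # (genes, pathways)
    mask = np.where(H > 0, 0.0, -1e9)
    # Pathways aggregate their member genes.
    a_e = softmax(scores.T + mask.T, axis=1)  # (pathways, genes)
    E_new = a_e @ X
    # Genes aggregate the pathways they belong to.
    a_v = softmax(scores + mask, axis=1)      # (genes, pathways)
    X_new = a_v @ E_new
    return X_new, E_new

X, E = dual_attention_step(X, E, H)

# Subgraph attention: pool the genes mutated in one "patient".
subgraph = [1, 3, 5]                  # indices of mutated genes (toy example)
q = rng.normal(size=d)                # query vector (assumed learnable)
att = softmax(X[subgraph] @ q)
z = att @ X[subgraph]                 # subgraph representation

# MLP head (one hidden layer) for inductive subgraph classification.
W1, W2 = rng.normal(size=(d, 16)), rng.normal(size=(16, 2))
logits = np.maximum(z @ W1, 0.0) @ W2
probs = softmax(logits)               # class probabilities, shape (2,)
```

Because the subgraph representation is built by pooling node embeddings rather than by indexing a fixed set of training subgraphs, an unseen patient (a new gene subset) can be scored without retraining, which is what makes the setting inductive.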
Related papers
- Hypergraph-enhanced Dual Semi-supervised Graph Classification [14.339207883093204]
We propose a Hypergraph-Enhanced DuAL framework named HEAL for semi-supervised graph classification.
To better explore the higher-order relationships among nodes, we design a hypergraph structure learning module to adaptively capture complex node dependencies.
Based on the learned hypergraph, we introduce a line graph to capture the interaction between hyperedges.
arXiv Detail & Related papers (2024-05-08T02:44:13Z) - Tensorized Hypergraph Neural Networks [69.65385474777031]
We propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN).
THNN is a faithful hypergraph modeling framework based on high-order outer-product feature message passing.
Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
arXiv Detail & Related papers (2023-06-05T03:26:06Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Equivariant Hypergraph Diffusion Neural Operators [81.32770440890303]
Hypergraph neural networks (HNNs) using neural networks to encode hypergraphs provide a promising way to model higher-order relations in data.
This work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operators.
We evaluate ED-HNN for node classification on nine real-world hypergraph datasets.
arXiv Detail & Related papers (2022-07-14T06:17:00Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z) - Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - Graph-in-Graph (GiG): Learning interpretable latent graphs in non-Euclidean domain for biological and healthcare applications [52.65389473899139]
Graphs are a powerful tool for representing and analyzing unstructured, non-Euclidean data ubiquitous in the healthcare domain.
Recent works have shown that considering relationships between input data samples have a positive regularizing effect for the downstream task.
We propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications.
arXiv Detail & Related papers (2022-04-01T10:01:37Z) - Preventing Over-Smoothing for Hypergraph Neural Networks [0.0]
We show that the performance of hypergraph neural networks does not improve as the number of layers increases.
We develop a new deep hypergraph convolutional network called Deep-HGCN, which can maintain node representation in deep layers.
arXiv Detail & Related papers (2022-03-31T16:33:31Z) - VEGN: Variant Effect Prediction with Graph Neural Networks [19.59965282985234]
We propose VEGN, which models variant effect prediction using a graph neural network (GNN) that operates on a heterogeneous graph with genes and variants.
The graph is created by assigning variants to genes and connecting genes with a gene-gene interaction network.
VEGN improves on the performance of existing state-of-the-art models.
arXiv Detail & Related papers (2021-06-25T13:51:46Z) - Learnable Hypergraph Laplacian for Hypergraph Learning [34.28748027233654]
HyperGraph Convolutional Neural Networks (HGCNNs) have demonstrated their potential in modeling high-order relations preserved in graph structured data.
We propose the first learning-based method tailored for constructing an adaptive hypergraph structure, termed HypERgrAph Laplacian aDaptor (HERALD).
HERALD adaptively optimizes the adjacency relationships between hypernodes and hyperedges in an end-to-end manner, so that a task-aware hypergraph is learned.
arXiv Detail & Related papers (2021-06-12T02:07:07Z)
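The HERALD summary above describes making the hypergraph structure itself a learnable, end-to-end-optimized object. A minimal numpy sketch of the core idea follows; the sigmoid parameterization of a soft incidence matrix is a simplifying assumption standing in for HERALD's actual adaptor, and the Laplacian is the standard normalized hypergraph Laplacian of Zhou et al.

```python
import numpy as np

rng = np.random.default_rng(1)

n_nodes, n_edges = 6, 3
# Learnable real-valued incidence logits (assumed parameterization;
# HERALD's actual adaptor is more elaborate).
theta = rng.normal(size=(n_nodes, n_edges))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

H = sigmoid(theta)                     # soft incidence matrix in (0, 1)
w = np.ones(n_edges)                   # hyperedge weights
Dv = (H * w).sum(axis=1)               # node degrees
De = H.sum(axis=0)                     # hyperedge degrees

# Normalized hypergraph Laplacian (Zhou et al. form):
# L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
Dv_is = np.diag(Dv ** -0.5)
De_inv = np.diag(1.0 / De)
L = np.eye(n_nodes) - Dv_is @ H @ np.diag(w) @ De_inv @ H.T @ Dv_is

# Because H is a differentiable function of theta, gradients from any
# downstream task loss can flow back and update the hypergraph
# structure end-to-end -- the "task-aware hypergraph" in the summary.
```

The same Laplacian, with a fixed binary `H`, underlies most of the hypergraph convolutional models listed above; the learnable variant simply replaces that fixed incidence with a parameterized one.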
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.