SkipGNN: Predicting Molecular Interactions with Skip-Graph Networks
- URL: http://arxiv.org/abs/2004.14949v2
- Date: Wed, 9 Dec 2020 18:31:39 GMT
- Title: SkipGNN: Predicting Molecular Interactions with Skip-Graph Networks
- Authors: Kexin Huang, Cao Xiao, Lucas Glass, Marinka Zitnik, Jimeng Sun
- Abstract summary: We present SkipGNN, a graph neural network approach for the prediction of molecular interactions.
SkipGNN predicts molecular interactions by not only aggregating information from direct interactions but also from second-order interactions.
We show that SkipGNN achieves superior and robust performance, outperforming existing methods by up to 28.8% of area under the precision-recall curve (PR-AUC).
- Score: 70.64925872964416
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Molecular interaction networks are powerful resources for discovery. They
are increasingly used with machine learning methods to predict biologically
meaningful interactions. While deep learning on graphs has dramatically
advanced the prediction prowess, current graph neural network (GNN) methods are
optimized for prediction on the basis of direct similarity between interacting
nodes. In biological networks, however, similarity between nodes that do not
directly interact has proved incredibly useful in the last decade across a
variety of interaction networks. Here, we present SkipGNN, a graph neural
network approach for the prediction of molecular interactions. SkipGNN predicts
molecular interactions by not only aggregating information from direct
interactions but also from second-order interactions, which we call skip
similarity. In contrast to existing GNNs, SkipGNN receives neural messages from
two-hop neighbors as well as immediate neighbors in the interaction network and
non-linearly transforms the messages to obtain useful information for
prediction. To inject skip similarity into a GNN, we construct a modified
version of the original network, called the skip graph. We then develop an
iterative fusion scheme that optimizes a GNN using both the skip graph and the
original graph. Experiments on four interaction networks, including drug-drug,
drug-target, protein-protein, and gene-disease interactions, show that SkipGNN
achieves superior and robust performance, outperforming existing methods by up
to 28.8% of area under the precision-recall curve (PR-AUC). Furthermore, we
show that unlike popular GNNs, SkipGNN learns biologically meaningful
embeddings and performs especially well on noisy, incomplete interaction
networks.
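The skip graph described in the abstract connects nodes that share a neighbor in the original network. A minimal sketch of that construction is below; it is an illustrative reconstruction from the abstract, not the authors' reference code, and details such as whether direct edges are also retained in the skip graph are assumptions here:

```python
import numpy as np

def build_skip_graph(A: np.ndarray) -> np.ndarray:
    """Construct the skip graph: connect two nodes iff they are joined
    by a path of length two in the original graph (skip similarity)."""
    A = (A > 0).astype(int)
    skip = ((A @ A) > 0).astype(int)  # nonzero entry => at least one 2-hop path
    np.fill_diagonal(skip, 0)         # drop self-loops from i -> j -> i paths
    return skip

# Toy path graph 0-1-2-3: nodes 0 and 2 (and 1 and 3) share a neighbor.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
S = build_skip_graph(A)  # edges (0,2) and (1,3), i.e. the two-hop pairs
```

In the iterative fusion scheme, a GNN layer would then aggregate messages over both `A` (direct neighbors) and `S` (skip neighbors) and combine the two representations.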
Related papers
- PROXI: Challenging the GNNs for Link Prediction [3.8233569758620063]
We introduce PROXI, which leverages proximity information of node pairs in both graph and attribute spaces.
Standard machine learning (ML) models perform competitively, even outperforming cutting-edge GNN models.
We show that augmenting traditional GNNs with PROXI significantly boosts their link prediction performance.
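As a rough illustration of the kind of graph-space proximity features a model like PROXI can feed to a standard ML classifier (the paper's actual feature set, which also covers attribute space, is richer), a minimal sketch:

```python
import numpy as np

def pair_proximity_features(A: np.ndarray, i: int, j: int) -> np.ndarray:
    """Simple proximity features for a node pair (i, j):
    common-neighbor count and Jaccard coefficient of neighborhoods."""
    Ni = set(np.nonzero(A[i])[0])
    Nj = set(np.nonzero(A[j])[0])
    common = len(Ni & Nj)
    union = len(Ni | Nj)
    jaccard = common / union if union else 0.0
    return np.array([common, jaccard])

# Toy graph: node 0 and node 3 share exactly one neighbor (node 1).
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
])
feats = pair_proximity_features(A, 0, 3)  # [common = 1, jaccard = 0.5]
```

Such per-pair feature vectors can then be passed to any off-the-shelf classifier (logistic regression, gradient boosting) to score candidate links.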
arXiv Detail & Related papers (2024-10-02T17:57:38Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- From Node Interaction to Hop Interaction: New Effective and Scalable Graph Learning Paradigm [25.959580336262004]
We propose a novel hop interaction paradigm to address limitations simultaneously.
The core idea is to convert the interaction target among nodes to pre-processed multi-hop features inside each node.
We conduct extensive experiments on 12 benchmark datasets in a wide range of domains, scales, and smoothness of graphs.
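The idea of converting node interaction into pre-processed multi-hop features inside each node can be sketched in the style of decoupled GNNs; the row-normalisation choice below is an assumption, not the paper's exact operator:

```python
import numpy as np

def multi_hop_features(A: np.ndarray, X: np.ndarray, K: int) -> np.ndarray:
    """Pre-compute K hops of neighborhood-averaged features and concatenate
    them, so downstream per-node models need no further graph access."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    A_hat = A / deg                      # row-normalised adjacency
    hops, H = [X], X
    for _ in range(K):
        H = A_hat @ H                    # one more hop of propagation
        hops.append(H)
    return np.concatenate(hops, axis=1)  # shape: (n, (K + 1) * d)

# Toy path graph 0-1-2 with 2-dimensional node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
Z = multi_hop_features(A, X, K=2)  # columns: [X, A_hat @ X, A_hat^2 @ X]
```

Because the propagation is done once in pre-processing, training reduces to fitting a plain per-node model on `Z`, which is what makes the paradigm scalable.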
arXiv Detail & Related papers (2022-11-21T11:29:48Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Structure-aware Interactive Graph Neural Networks for the Prediction of Protein-Ligand Binding Affinity [52.67037774136973]
Drug discovery often relies on the successful prediction of protein-ligand binding affinity.
Recent advances have shown great promise in applying graph neural networks (GNNs) for better affinity prediction by learning the representations of protein-ligand complexes.
We propose a structure-aware interactive graph neural network (SIGN) which consists of two components: polar-inspired graph attention layers (PGAL) and pairwise interactive pooling (PiPool).
arXiv Detail & Related papers (2021-07-21T03:34:09Z)
- Visualizing Graph Neural Networks with CorGIE: Corresponding a Graph to Its Embedding [16.80197065484465]
We propose an approach to corresponding an input graph to its node embedding (aka latent space).
We develop an interactive multi-view interface called CorGIE to instantiate the abstraction.
We present how to use CorGIE in two usage scenarios, and conduct a case study with two GNN experts.
arXiv Detail & Related papers (2021-06-24T08:59:53Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.