Multi-View Graph Neural Networks for Molecular Property Prediction
- URL: http://arxiv.org/abs/2005.13607v3
- Date: Fri, 12 Jun 2020 06:09:52 GMT
- Title: Multi-View Graph Neural Networks for Molecular Property Prediction
- Authors: Hehuan Ma, Yatao Bian, Yu Rong, Wenbing Huang, Tingyang Xu, Weiyang
Xie, Geyan Ye, Junzhou Huang
- Abstract summary: We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and a disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
- Score: 67.54644592806876
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The crux of molecular property prediction is to generate meaningful
representations of the molecules. One promising route is to exploit the
molecular graph structure through Graph Neural Networks (GNNs). It is well
known that both atoms and bonds significantly affect the chemical properties of
a molecule, so an expressive model should be able to exploit both node (atom)
and edge (bond) information simultaneously. Guided by this observation, we
present the Multi-View Graph Neural Network (MV-GNN), a multi-view message
passing architecture that enables more accurate predictions of molecular
properties. In MV-GNN, we introduce a shared self-attentive readout component
and a disagreement loss to stabilize the training process. This readout
component also makes the whole architecture interpretable. We further boost the
expressive power of MV-GNN by proposing a cross-dependent message passing
scheme that enhances information communication between the two views, which
results in the MV-GNN^cross variant. Lastly, we theoretically justify the
expressiveness of the two proposed models in terms of distinguishing
non-isomorphic graphs. Extensive experiments demonstrate that the MV-GNN models
achieve markedly superior performance over state-of-the-art models on a variety
of challenging benchmarks. Meanwhile, visualizations of node importance are
consistent with prior chemical knowledge, confirming the interpretability of
the MV-GNN models.
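To make the architecture described in the abstract concrete, below is a minimal, self-contained PyTorch sketch of the ideas it names: two message passing views over the same molecular graph (one atom-centered, one bond-centered), a self-attentive readout shared by both views, and a disagreement loss that keeps the two views' predictions consistent. This is not the authors' implementation; the layer sizes, the toy featurization (a random adjacency and a per-node bond-feature summary), and the loss weight `lam` are illustrative assumptions, and the cross-dependent message passing of MV-GNN^cross is not shown.

```python
# Hedged sketch of a two-view GNN with a shared self-attentive readout and a
# disagreement loss, loosely following the abstract above. All sizes and the
# toy molecule featurization are assumptions, not the paper's actual setup.
import torch
import torch.nn as nn


class SelfAttentiveReadout(nn.Module):
    """Pools node embeddings of one molecule into a graph embedding with learned attention."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
                                   nn.Linear(hidden_dim, 1))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, hidden_dim) node states of a single molecule
        alpha = torch.softmax(self.score(h), dim=0)   # attention weight per atom
        return (alpha * h).sum(dim=0)                 # (hidden_dim,)


class MessagePassingView(nn.Module):
    """One view: sum-aggregate neighbor messages, then update node states with a GRU cell."""

    def __init__(self, in_dim: int, hidden_dim: int, steps: int = 3):
        super().__init__()
        self.embed = nn.Linear(in_dim, hidden_dim)
        self.update = nn.GRUCell(hidden_dim, hidden_dim)
        self.steps = steps

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_dim) features, adj: (num_nodes, num_nodes) 0/1 adjacency
        h = torch.relu(self.embed(x))
        for _ in range(self.steps):
            m = adj @ h                               # aggregate messages from neighbors
            h = self.update(m, h)
        return h


class TwoViewGNN(nn.Module):
    def __init__(self, node_dim: int, edge_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.atom_view = MessagePassingView(node_dim, hidden_dim)
        self.bond_view = MessagePassingView(node_dim + edge_dim, hidden_dim)
        self.readout = SelfAttentiveReadout(hidden_dim)  # shared between the two views
        self.head = nn.Linear(hidden_dim, 1)             # shared prediction head

    def forward(self, x, adj, edge_summary):
        h_atom = self.atom_view(x, adj)
        # Bond view sketch: append a per-node summary of incident bond features.
        h_bond = self.bond_view(torch.cat([x, edge_summary], dim=-1), adj)
        y_atom = self.head(self.readout(h_atom))
        y_bond = self.head(self.readout(h_bond))
        return y_atom, y_bond


def two_view_loss(y_atom, y_bond, target, lam: float = 0.1):
    """Supervised loss on both views plus a disagreement penalty between them."""
    mse = nn.functional.mse_loss
    supervised = mse(y_atom, target) + mse(y_bond, target)
    disagreement = mse(y_atom, y_bond)                # encourages consistent views
    return supervised + lam * disagreement


if __name__ == "__main__":
    num_atoms, node_dim, edge_dim = 5, 16, 8
    x = torch.randn(num_atoms, node_dim)
    adj = (torch.rand(num_atoms, num_atoms) > 0.6).float()
    adj = ((adj + adj.t()) > 0).float()               # symmetric toy adjacency
    edge_summary = torch.randn(num_atoms, edge_dim)   # stand-in for bond features per atom
    model = TwoViewGNN(node_dim, edge_dim)
    y_atom, y_bond = model(x, adj, edge_summary)
    loss = two_view_loss(y_atom, y_bond, torch.tensor([0.5]))
    print(loss.item())
```

In this sketch the two views interact only through the shared readout, the shared head, and the disagreement term; per the abstract, the MV-GNN^cross variant goes further by letting the views exchange information during message passing itself.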
Related papers
- Molecular Graph Representation Learning via Structural Similarity Information [11.38130169319915]
We introduce the Structural Similarity Motif GNN (MSSM-GNN), a novel molecular graph representation learning method.
In particular, we propose a specially designed graph that leverages graph kernel algorithms to represent the similarity between molecules quantitatively.
We employ GNNs to learn feature representations from molecular graphs, aiming to enhance the accuracy of property prediction by incorporating additional molecular representation information.
arXiv Detail & Related papers (2024-09-13T06:59:10Z) - Contrastive Dual-Interaction Graph Neural Network for Molecular Property Prediction [0.0]
We introduce DIG-Mol, a novel self-supervised graph neural network framework for molecular property prediction.
DIG-Mol integrates a momentum distillation network with two interconnected networks to efficiently improve molecular characterization.
We have established DIG-Mol's state-of-the-art performance through extensive experimental evaluation in a variety of molecular property prediction tasks.
arXiv Detail & Related papers (2024-05-04T10:09:27Z) - Atomic and Subgraph-aware Bilateral Aggregation for Molecular
Representation Learning [57.670845619155195]
We introduce a new model for molecular representation learning called the Atomic and Subgraph-aware Bilateral Aggregation (ASBA).
ASBA addresses the limitations of previous atom-wise and subgraph-wise models by incorporating both types of information.
Our method offers a more comprehensive way to learn representations for molecular property prediction and has broad potential in drug and material discovery applications.
arXiv Detail & Related papers (2023-05-22T00:56:00Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Graph neural networks for the prediction of molecular structure-property
relationships [59.11160990637615]
Graph neural networks (GNNs) are a machine learning method that operates directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
arXiv Detail & Related papers (2022-07-25T11:30:44Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z) - Discovering the Representation Bottleneck of Graph Neural Networks from
Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - Image-Like Graph Representations for Improved Molecular Property
Prediction [7.119677737397071]
We propose a new intrinsic molecular representation that bypasses the need for GNNs entirely, dubbed CubeMol.
Our fixed-dimensional representation, when paired with a transformer model, exceeds the performance of state-of-the-art GNN models and provides a path for scalability.
arXiv Detail & Related papers (2021-11-20T22:39:11Z) - Learning Attributed Graph Representations with Communicative Message
Passing Transformer [3.812358821429274]
We propose a Communicative Message Passing Transformer (CoMPT) neural network to improve the molecular graph representation.
Unlike the previous transformer-style GNNs that treat molecules as fully connected graphs, we introduce a message diffusion mechanism to leverage the graph connectivity inductive bias.
arXiv Detail & Related papers (2021-07-19T11:58:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.