Molecular Mechanics-Driven Graph Neural Network with Multiplex Graph for
Molecular Structures
- URL: http://arxiv.org/abs/2011.07457v1
- Date: Sun, 15 Nov 2020 05:55:15 GMT
- Title: Molecular Mechanics-Driven Graph Neural Network with Multiplex Graph for
Molecular Structures
- Authors: Shuo Zhang, Yang Liu, Lei Xie
- Abstract summary: A growing number of Graph Neural Networks (GNNs) have been proposed to predict physicochemical properties from molecular structures.
In this work, we aim to design a GNN that is both powerful and efficient for molecular structures.
We build the Multiplex Molecular Graph Neural Network (MXMNet).
- Score: 20.276492931562036
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The prediction of physicochemical properties from molecular structures is a
crucial task for artificial intelligence aided molecular design. A growing
number of Graph Neural Networks (GNNs) have been proposed to address this
challenge. These models improve their expressive power by incorporating
auxiliary information in molecules, while inevitably increasing their
computational complexity. In this work, we aim to design a GNN that is both
powerful and efficient for molecular structures. To achieve this goal, we
propose a molecular mechanics-driven approach by first representing each
molecule as a two-layer multiplex graph, where one layer contains only local
connections that mainly capture the covalent interactions and the other layer
contains global connections that can simulate non-covalent interactions. Then,
for each layer, a corresponding message passing module is proposed to balance
the trade-off between expressive power and computational complexity. Based on
these two modules, we build the Multiplex Molecular Graph Neural Network
(MXMNet). When validated on the QM9 dataset for small molecules and the
PDBBind dataset for large protein-ligand complexes, MXMNet achieves superior
results to existing state-of-the-art models under restricted resources.
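To make the multiplex construction concrete, below is a minimal sketch of how a two-layer graph could be built from 3D atomic coordinates, assuming simple distance cutoffs stand in for covalent (local) and non-covalent (global) connections; the cutoff values and the helper function are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch of a two-layer multiplex molecular graph (not the authors'
# exact construction). Edges are built from 3D coordinates with two distance
# cutoffs: a small one approximating covalent (local) connections and a larger
# one capturing longer-range, non-covalent (global) connections. The cutoff
# values are illustrative assumptions, not taken from the paper.
import numpy as np

def multiplex_edges(coords, local_cutoff=2.0, global_cutoff=5.0):
    """Return (local_edges, global_edges) as lists of (i, j) index pairs."""
    coords = np.asarray(coords, dtype=float)
    n = len(coords)
    # Pairwise Euclidean distances between atoms.
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    local_edges, global_edges = [], []
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dists[i, j] <= local_cutoff:
                local_edges.append((i, j))   # short-range, ~covalent layer
            if dists[i, j] <= global_cutoff:
                global_edges.append((i, j))  # long-range layer (superset here)
    return local_edges, global_edges

# Toy example: a water-like geometry (coordinates in Angstroms, made up).
coords = [[0.00, 0.00, 0.00],   # O
          [0.96, 0.00, 0.00],   # H
          [-0.24, 0.93, 0.00]]  # H
local, global_ = multiplex_edges(coords)
print(len(local), "local edges,", len(global_), "global edges")
```

In MXMNet each of these two edge sets would feed its own message passing module; the sketch only covers graph construction.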
Related papers
- GraphXForm: Graph transformer for computer-aided molecular design with application to extraction [73.1842164721868]
We present GraphXForm, a decoder-only graph transformer architecture, which is pretrained on existing compounds and then fine-tuned.
We evaluate it on two solvent design tasks for liquid-liquid extraction, showing that it outperforms four state-of-the-art molecular design techniques.
arXiv Detail & Related papers (2024-11-03T19:45:15Z)
- Molecular Graph Representation Learning via Structural Similarity Information [11.38130169319915]
We introduce the Structural Similarity Motif GNN (MSSM-GNN), a novel molecular graph representation learning method.
In particular, we propose a specially designed graph that leverages graph kernel algorithms to represent the similarity between molecules quantitatively.
We employ GNNs to learn feature representations from molecular graphs, aiming to enhance the accuracy of property prediction by incorporating additional molecular representation information.
arXiv Detail & Related papers (2024-09-13T06:59:10Z)
- Bi-level Contrastive Learning for Knowledge-Enhanced Molecule Representations [55.42602325017405]
We propose a novel method called GODE, which takes into account the two-level structure of individual molecules.
By pre-training two graph neural networks (GNNs) on different graph structures, combined with contrastive learning, GODE fuses molecular structures with their corresponding knowledge graph substructures.
When fine-tuned across 11 chemical property tasks, our model outperforms existing benchmarks, achieving an average ROC-AUC improvement of 13.8% on classification tasks and an average RMSE/MAE improvement of 35.1% on regression tasks.
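As an illustration of the kind of contrastive objective described above, the sketch below aligns a molecule-level embedding with the embedding of its corresponding knowledge-graph substructure via an InfoNCE-style loss; the function name, temperature, and batch layout are assumptions for illustration, not GODE's exact formulation.

```python
# Minimal sketch of a contrastive objective that pulls a molecule-level
# embedding toward the embedding of its matching knowledge-graph substructure
# (InfoNCE-style). Illustrative only; not GODE's exact loss.
import torch
import torch.nn.functional as F

def contrastive_loss(mol_emb, kg_emb, temperature=0.1):
    """mol_emb, kg_emb: [batch, dim] tensors; row i of each is the same molecule."""
    mol = F.normalize(mol_emb, dim=-1)
    kg = F.normalize(kg_emb, dim=-1)
    logits = mol @ kg.t() / temperature   # pairwise similarities between the two views
    targets = torch.arange(mol.size(0))   # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

loss = contrastive_loss(torch.randn(8, 64), torch.randn(8, 64))
```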
arXiv Detail & Related papers (2023-06-02T15:49:45Z)
- Implicit Geometry and Interaction Embeddings Improve Few-Shot Molecular Property Prediction [53.06671763877109]
We develop molecular embeddings that encode complex molecular characteristics to improve the performance of few-shot molecular property prediction.
Our approach leverages large amounts of synthetic data, namely the results of molecular docking calculations.
On multiple molecular property prediction benchmarks, training from the embedding space substantially improves Multi-Task, MAML, and Prototypical Network few-shot learning performance.
arXiv Detail & Related papers (2023-02-04T01:32:40Z)
- MolCLR: Molecular Contrastive Learning of Representations via Graph Neural Networks [11.994553575596228]
MolCLR is a self-supervised learning framework for large unlabeled molecule datasets.
We propose three novel molecule graph augmentations: atom masking, bond deletion, and subgraph removal.
Our method achieves state-of-the-art performance on many challenging datasets.
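A minimal sketch of the three named augmentations on a toy graph (node feature list plus edge list) follows; the ratios, mask token, and helper functions are illustrative assumptions rather than MolCLR's implementation.

```python
# Minimal sketch of atom masking, bond deletion, and subgraph removal on a toy
# graph. Illustration of the idea only; not MolCLR's implementation.
import random

MASK_TOKEN = -1  # hypothetical placeholder feature for masked atoms

def atom_masking(nodes, ratio=0.25):
    nodes = list(nodes)
    for i in random.sample(range(len(nodes)), max(1, int(ratio * len(nodes)))):
        nodes[i] = MASK_TOKEN
    return nodes

def bond_deletion(edges, ratio=0.25):
    keep = max(1, int((1 - ratio) * len(edges)))
    return random.sample(edges, keep)

def subgraph_removal(nodes, edges, start=0, size=2):
    # Grow a small connected subgraph from `start`, then drop it from the graph
    # (node indices in the surviving edges are left unremapped for brevity).
    removed = {start}
    while len(removed) < size:
        frontier = [j for i, j in edges if i in removed and j not in removed]
        if not frontier:
            break
        removed.add(frontier[0])
    kept_nodes = [n for idx, n in enumerate(nodes) if idx not in removed]
    kept_edges = [(i, j) for i, j in edges if i not in removed and j not in removed]
    return kept_nodes, kept_edges

nodes = [6, 6, 8, 1]                 # toy atomic numbers: C, C, O, H
edges = [(0, 1), (1, 2), (0, 3)]
print(atom_masking(nodes), bond_deletion(edges), subgraph_removal(nodes, edges))
```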
arXiv Detail & Related papers (2021-02-19T17:35:18Z)
- Heterogeneous Molecular Graph Neural Networks for Predicting Molecule Properties [12.897488702184306]
We introduce a novel graph representation of molecules, the heterogeneous molecular graph (HMG).
HMGNN incorporates global molecule representations and an attention mechanism into the prediction process.
Our model achieves state-of-the-art performance in 9 out of 12 tasks on the QM9 dataset.
arXiv Detail & Related papers (2020-09-26T23:29:41Z)
- ASGN: An Active Semi-supervised Graph Neural Network for Molecular Property Prediction [61.33144688400446]
We propose a novel framework called Active Semi-supervised Graph Neural Network (ASGN) by incorporating both labeled and unlabeled molecules.
In the teacher model, we propose a novel semi-supervised learning method to learn a general representation that jointly exploits information from molecular structure and molecular distribution.
Finally, we propose a novel active learning strategy based on molecular diversity to select informative data throughout the framework's learning process.
arXiv Detail & Related papers (2020-07-07T04:22:39Z)
- Self-Supervised Graph Transformer on Large-Scale Molecular Data [73.3448373618865]
We propose a novel framework, GROVER, for molecular representation learning.
GROVER can learn rich structural and semantic information of molecules from enormous unlabelled molecular data.
We pre-train GROVER with 100 million parameters on 10 million unlabelled molecules -- the biggest GNN and the largest training dataset in molecular representation learning.
arXiv Detail & Related papers (2020-06-18T08:37:04Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and a disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
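A minimal sketch of what such a disagreement loss can look like appears below, assuming a regression setting where two view-specific predictors are trained jointly; the mean-squared formulation and the weighting factor are illustrative assumptions, not necessarily MV-GNN's exact objective.

```python
# Minimal sketch of a "disagreement loss": two view-specific predictors are
# penalized when their outputs diverge, regularizing multi-view training.
# Illustrative formulation only; not necessarily the loss used by MV-GNN.
import torch
import torch.nn.functional as F

def disagreement_loss(pred_view_a, pred_view_b):
    """Penalize divergence between the predictions of the two views."""
    return torch.mean((pred_view_a - pred_view_b) ** 2)

def total_loss(y, pred_a, pred_b, lam=0.1):
    # Supervised terms for each view plus the agreement regularizer;
    # lam is an assumed weighting hyperparameter, not a value from the paper.
    return F.mse_loss(pred_a, y) + F.mse_loss(pred_b, y) + lam * disagreement_loss(pred_a, pred_b)
```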
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.