Neural Atoms: Propagating Long-range Interaction in Molecular Graphs through Efficient Communication Channel
- URL: http://arxiv.org/abs/2311.01276v3
- Date: Sun, 31 Mar 2024 14:28:51 GMT
- Title: Neural Atoms: Propagating Long-range Interaction in Molecular Graphs through Efficient Communication Channel
- Authors: Xuan Li, Zhanke Zhou, Jiangchao Yao, Yu Rong, Lu Zhang, Bo Han
- Abstract summary: We propose a method to abstract the collective information of atomic groups into a few $\textit{Neural Atoms}$ by implicitly projecting the atoms of a molecule.
Specifically, we explicitly exchange information among neural atoms and project it back to the atoms' representations as an enhancement.
With this mechanism, neural atoms establish communication channels among distant nodes, effectively reducing the interaction scope of arbitrary node pairs to a single hop.
- Score: 48.6168145845412
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have been widely adopted for drug discovery with molecular graphs. Nevertheless, current GNNs mainly excel in leveraging short-range interactions (SRI) but struggle to capture long-range interactions (LRI), both of which are crucial for determining molecular properties. To tackle this issue, we propose a method to abstract the collective information of atomic groups into a few $\textit{Neural Atoms}$ by implicitly projecting the atoms of a molecule. Specifically, we explicitly exchange information among neural atoms and project it back to the atoms' representations as an enhancement. With this mechanism, neural atoms establish communication channels among distant nodes, effectively reducing the interaction scope of arbitrary node pairs to a single hop. To provide an inspection of our method from a physical perspective, we reveal its connection to the traditional LRI calculation method, Ewald Summation. The Neural Atom can enhance GNNs to capture LRI by approximating the potential LRI of the molecule. We conduct extensive experiments on four long-range graph benchmarks, covering graph-level and link-level tasks on molecular graphs. We achieve up to a 27.32% and 38.27% improvement in the 2D and 3D scenarios, respectively. Empirically, our method can be equipped with an arbitrary GNN to help capture LRI. Code and datasets are publicly available at https://github.com/tmlr-group/NeuralAtom.
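The abstracted mechanism (softly assign atoms to a few neural atoms, let the neural atoms exchange information, then project back as a residual) can be sketched as follows. This is a minimal pure-Python illustration, not the paper's implementation; the scoring matrix `W` stands in for learned parameters, and the exchange step is simplified to a mean-mixing update.

```python
import math

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def softmax_rows(M):
    """Row-wise softmax, used here as a soft assignment of atoms."""
    out = []
    for row in M:
        m = max(row)
        e = [math.exp(x - m) for x in row]
        s = sum(e)
        out.append([x / s for x in e])
    return out

def neural_atom_layer(H, W):
    """One neural-atom exchange step (simplified sketch).

    H: n x d atom embeddings; W: d x K scoring matrix (hypothetical
    learned parameters). Atoms are softly assigned to K neural atoms,
    the neural atoms exchange information (here: mixing with their
    mean), and the result is projected back onto the atoms as a
    residual enhancement -- so any two atoms interact within one hop.
    """
    A = softmax_rows(matmul(H, W))           # n x K soft assignment
    At = [list(col) for col in zip(*A)]      # K x n transpose
    Z = matmul(At, H)                        # K x d neural-atom states
    K, d = len(Z), len(Z[0])
    mean = [sum(Z[k][j] for k in range(K)) / K for j in range(d)]
    Z = [[(z + m) / 2 for z, m in zip(row, mean)] for row in Z]  # exchange
    back = matmul(A, Z)                      # project back to atoms
    return [[h + b for h, b in zip(hr, br)] for hr, br in zip(H, back)]
```

Because each row of the assignment matrix is a softmax over all K neural atoms, the back-projection delivers information from every group to every atom in a single step, which is the "single hop" communication channel the abstract describes.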
Related papers
- Molecular Hypergraph Neural Networks [1.4559839293730863]
Graph neural networks (GNNs) have demonstrated promising performance across various chemistry-related tasks.
We introduce molecular hypergraphs and propose Molecular Hypergraph Neural Networks (MHNN) to predict the optoelectronic properties of organic semiconductors.
MHNN outperforms all baseline models on most tasks of OPV, OCELOTv1 and PCQM4Mv2 datasets.
arXiv Detail & Related papers (2023-12-20T15:56:40Z)
- SAF: Smart Aggregation Framework for Revealing Atoms Importance Rank and Improving Prediction Rates in Drug Discovery [0.0]
A successful approach for representing molecules is to treat them as a graph and utilize graph neural networks.
We propose a novel aggregating approach where each atom is weighted non-linearly using the Boltzmann distribution.
We show that using this weighted aggregation improves the ability of the gold standard message-passing neural network to predict antibiotic activity.
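The Boltzmann-weighted aggregation described above can be sketched in a few lines. This is an illustrative version only: the per-atom `energies` here are given directly, whereas in the paper they would come from learned atom scores.

```python
import math

def boltzmann_readout(embeddings, energies, temperature=1.0):
    """Aggregate atom embeddings with Boltzmann weights (a sketch).

    Each atom i gets weight exp(-E_i / T) normalized over the
    molecule, so low-energy (high-importance) atoms dominate the
    readout non-linearly, unlike a plain mean or sum.
    """
    weights = [math.exp(-e / temperature) for e in energies]
    total = sum(weights)
    weights = [w / total for w in weights]
    dim = len(embeddings[0])
    return [sum(w * h[j] for w, h in zip(weights, embeddings))
            for j in range(dim)]
```

The temperature controls how sharply the aggregation concentrates on the most important atoms: as `temperature` shrinks, the readout approaches a max-like selection; as it grows, it approaches a uniform mean.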
arXiv Detail & Related papers (2023-09-12T22:04:24Z)
- HiGNN: Hierarchical Informative Graph Neural Networks for Molecular Property Prediction Equipped with Feature-Wise Attention [5.735627221409312]
We propose a hierarchical informative graph neural network framework (termed HiGNN) for predicting molecular properties.
Experiments demonstrate that HiGNN achieves state-of-the-art predictive performance on many challenging drug discovery-associated benchmark datasets.
arXiv Detail & Related papers (2022-08-30T05:16:15Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
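The two graph constructions mentioned above are simple to state concretely. Below is a minimal sketch of KNN-graph construction from 3D coordinates; a fully-connected graph is the degenerate case `k = n - 1`. The function name and defaults are illustrative, not from the paper.

```python
def knn_graph(points, k):
    """Build directed K-nearest-neighbor edges from coordinates.

    points: list of (x, y, z) tuples; each node gets an edge to its
    k closest other nodes by Euclidean distance. Setting k = n - 1
    yields the fully-connected (FC) graph instead.
    """
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    edges = set()
    for i, p in enumerate(points):
        # Sort the other nodes by squared distance to node i.
        others = sorted(
            (j for j in range(len(points)) if j != i),
            key=lambda j: dist2(p, points[j]),
        )
        for j in others[:k]:
            edges.add((i, j))
    return edges
```

The choice matters for the interaction range: a KNN graph bounds each node's direct neighborhood, so long-range effects need many message-passing hops, while an FC graph makes every pair one hop apart at quadratic edge cost.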
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Distance-aware Molecule Graph Attention Network for Drug-Target Binding Affinity Prediction [54.93890176891602]
We propose a diStance-aware Molecule graph Attention Network (S-MAN) tailored to drug-target binding affinity prediction.
As a dedicated solution, we first propose a position encoding mechanism to integrate the topological structure and spatial position information into the constructed pocket-ligand graph.
We also propose a novel edge-node hierarchical attentive aggregation structure which has edge-level aggregation and node-level aggregation.
arXiv Detail & Related papers (2020-12-17T17:44:01Z)
- Directed Graph Attention Neural Network Utilizing 3D Coordinates for Molecular Property Prediction [11.726245297344418]
Kernel methods and graph neural networks have been widely studied as two mainstream approaches to property prediction.
In this work, we shed light on the Directed Graph Attention Neural Network (DGANN), which only takes chemical bonds as edges.
Our model has matched or outperformed most baseline graph neural networks on the QM9 dataset.
arXiv Detail & Related papers (2020-12-01T11:06:40Z)
- Multi-hop Attention Graph Neural Network [70.21119504298078]
Multi-hop Attention Graph Neural Network (MAGNA) is a principled way to incorporate multi-hop context information into every layer of attention computation.
We show that MAGNA captures large-scale structural information in every layer, and has a low-pass effect that eliminates noisy high-frequency information from graph data.
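A truncated attention-diffusion step in the spirit of MAGNA can be sketched as follows. The geometric coefficients and hop count here are illustrative hyperparameters, not the paper's values.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def attention_diffusion(A, alpha=0.15, hops=5):
    """Truncated attention diffusion (MAGNA-style sketch):

        A_hat = sum_{i=0}^{hops} alpha * (1 - alpha)^i * A^i

    Higher powers of the attention matrix A reach farther nodes, so a
    single layer mixes multi-hop context; the decaying geometric
    coefficients act as a low-pass filter that damps noisy
    high-frequency components of the graph signal.
    """
    n = len(A)
    power = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    coef = alpha
    A_hat = [[coef * power[i][j] for j in range(n)] for i in range(n)]
    for _ in range(hops):
        power = matmul(power, A)   # A^i
        coef *= 1.0 - alpha        # alpha * (1 - alpha)^i
        for i in range(n):
            for j in range(n):
                A_hat[i][j] += coef * power[i][j]
    return A_hat
```

For a row-stochastic attention matrix, each row of `A_hat` sums to `1 - (1 - alpha)^(hops + 1)`, approaching a proper attention distribution as the truncation deepens.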
arXiv Detail & Related papers (2020-09-29T22:41:19Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
- SkipGNN: Predicting Molecular Interactions with Skip-Graph Networks [70.64925872964416]
We present SkipGNN, a graph neural network approach for the prediction of molecular interactions.
SkipGNN predicts molecular interactions by not only aggregating information from direct interactions but also from second-order interactions.
We show that SkipGNN achieves superior and robust performance, outperforming existing methods by up to 28.8% in area under the precision-recall curve.
arXiv Detail & Related papers (2020-04-30T16:55:58Z)
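The second-order aggregation idea behind SkipGNN can be sketched concretely: build a "skip graph" whose edges connect nodes that share a common neighbor, then mix information from both graphs. The mixing weights and scalar features below are illustrative; the paper trains separate neural aggregators rather than fixed weights.

```python
def skip_neighbors(adj):
    """Build skip-graph neighborhoods: u and v are skip-neighbors if
    they share a common neighbor (a second-order interaction), even
    when no direct edge between them exists.

    adj: dict mapping node -> set of direct neighbors.
    """
    skip = {u: set() for u in adj}
    for u in adj:
        for mid in adj[u]:
            for v in adj.get(mid, ()):
                if v != u:
                    skip[u].add(v)
    return skip

def aggregate(node, feats, adj, skip, w_direct=0.7, w_skip=0.3):
    """Mix mean features from direct and skip neighborhoods.

    The fixed weights are hypothetical; they stand in for the two
    learned aggregators a SkipGNN-style model would use.
    """
    def mean(nodes):
        nodes = list(nodes)
        return sum(feats[v] for v in nodes) / len(nodes) if nodes else 0.0

    return w_direct * mean(adj[node]) + w_skip * mean(skip[node])
```

On a path a-b-c, the skip graph links a and c directly, so a's update sees c's features in one step, which is how second-order interactions enter the prediction.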
This list is automatically generated from the titles and abstracts of the papers in this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.