Weisfeiler-Lehman Embedding for Molecular Graph Neural Networks
- URL: http://arxiv.org/abs/2006.06909v2
- Date: Tue, 18 Aug 2020 03:12:46 GMT
- Title: Weisfeiler-Lehman Embedding for Molecular Graph Neural Networks
- Authors: Katsuhiko Ishiguro and Kenta Oono and Kohei Hayashi
- Abstract summary: A graph neural network (GNN) is a good choice for predicting the chemical properties of molecules.
In this paper, we expand an atom representation using Weisfeiler-Lehman embedding.
We show WL embedding can replace the first two layers of ReLU GNN with a smaller weight norm.
- Score: 15.662820454886203
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A graph neural network (GNN) is a good choice for predicting the chemical
properties of molecules. Compared with other deep networks, however, the
current performance of a GNN is limited owing to the "curse of depth." Inspired
by long-established feature engineering in the field of chemistry, we expanded
an atom representation using Weisfeiler-Lehman (WL) embedding, which is
designed to capture local atomic patterns dominating the chemical properties of
a molecule. In terms of representability, we show WL embedding can replace the
first two layers of ReLU GNN -- a normal embedding and a hidden GNN layer --
with a smaller weight norm. We then demonstrate that WL embedding consistently
improves the empirical performance over multiple GNN architectures and several
molecular graph datasets.
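The core ingredient, WL relabeling, on which the expanded atom representation is built, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration of one WL pass over a small molecular graph (the function and variable names are assumptions, not identifiers from the paper): each atom is relabeled by the combination of its own label and the multiset of its neighbors' labels, so the new label summarizes a local atomic pattern.

```python
# Minimal sketch (not the paper's code) of Weisfeiler-Lehman relabeling on a
# molecular graph. Names such as wl_relabel, atom_labels, and adjacency are
# illustrative assumptions.

def wl_relabel(atom_labels, adjacency, iterations=1):
    """Run `iterations` WL passes: each atom's new label is a compressed id of
    (its own label, the sorted multiset of its neighbors' labels)."""
    labels = list(atom_labels)
    for _ in range(iterations):
        # Build an order-invariant signature for every atom.
        signatures = [
            (labels[i], tuple(sorted(labels[j] for j in neighbors)))
            for i, neighbors in enumerate(adjacency)
        ]
        # Compress distinct signatures to small integer ids (the new labels).
        table = {}
        labels = [table.setdefault(sig, len(table)) for sig in signatures]
    return labels

# Toy example: the heavy atoms of ethanol, C-C-O, labeled by element symbol.
atom_labels = ["C", "C", "O"]
adjacency = [[1], [0, 2], [1]]  # the middle carbon bonds to both other atoms
print(wl_relabel(atom_labels, adjacency))  # -> [0, 1, 2]: each atom gets a distinct local-pattern label
```

Each WL label could then index a one-hot or learned embedding table, so an atom's input vector already encodes its local neighborhood pattern; this is the sense in which a WL-expanded atom representation can stand in for an initial embedding followed by a hidden GNN layer.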
Related papers
- Spiking Graph Neural Network on Riemannian Manifolds [51.15400848660023]
Graph neural networks (GNNs) have become the dominant solution for learning on graphs.
Existing spiking GNNs consider graphs in Euclidean space, ignoring the structural geometry.
We present a Manifold-valued Spiking GNN (MSG).
MSG achieves superior performance to previous spiking GNNs and better energy efficiency than conventional GNNs.
arXiv Detail & Related papers (2024-10-23T15:09:02Z)
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs).
S$^2$GNNs combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
- ChiENN: Embracing Molecular Chirality with Graph Neural Networks [10.19088492223333]
We propose a theoretically justified message-passing scheme, which makes GNNs sensitive to the order of node neighbors.
We apply that concept in the context of molecular chirality to construct a Chiral Edge Neural Network (ChiENN) layer that can be appended to any GNN model.
Our experiments show that adding ChiENN layers to a GNN outperforms current state-of-the-art methods in chiral-sensitive molecular property prediction tasks.
arXiv Detail & Related papers (2023-07-05T10:50:40Z)
- Graph Neural Networks are Inherently Good Generalizers: Insights by Bridging GNNs and MLPs [71.93227401463199]
This paper pinpoints the major source of GNNs' performance gain to their intrinsic capability by introducing an intermediate model class dubbed P(ropagational)MLP.
We observe that PMLPs consistently perform on par with (or even exceed) their GNN counterparts, while being much more efficient in training.
arXiv Detail & Related papers (2022-12-18T08:17:32Z)
- HiGNN: Hierarchical Informative Graph Neural Networks for Molecular Property Prediction Equipped with Feature-Wise Attention [5.735627221409312]
We propose a well-designed hierarchical informative graph neural network framework (termed HiGNN) for molecular property prediction.
Experiments demonstrate that HiGNN achieves state-of-the-art predictive performance on many challenging drug discovery-associated benchmark datasets.
arXiv Detail & Related papers (2022-08-30T05:16:15Z)
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that works directly on the molecular graph.
GNNs make it possible to learn properties in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
arXiv Detail & Related papers (2022-07-25T11:30:44Z)
- Expressiveness and Approximation Properties of Graph Neural Networks [6.1323142294300625]
We provide an elegant way to obtain bounds on the separation power of GNNs in terms of the Weisfeiler-Leman (WL) tests.
We use tensor language to define Higher-Order Message-Passing Neural Networks (or k-MPNNs), a natural extension of MPNNs.
Our approach provides a toolbox with which GNN architecture designers can analyze the separation power of their GNNs.
arXiv Detail & Related papers (2022-04-10T11:33:04Z)
- Image-Like Graph Representations for Improved Molecular Property Prediction [7.119677737397071]
We propose a new intrinsic molecular representation that bypasses the need for GNNs entirely, dubbed CubeMol.
Our fixed-dimensional representation, when paired with a transformer model, exceeds the performance of state-of-the-art GNN models and provides a path for scalability.
arXiv Detail & Related papers (2021-11-20T22:39:11Z)
- Graph Neural Network Architecture Search for Molecular Property Prediction [1.0965065178451106]
We develop an NAS approach to automate the design and development of graph neural networks (GNNs) for molecular property prediction.
Specifically, we focus on automated development of message-passing neural networks (MPNNs) to predict the molecular properties of small molecules in quantum mechanics and physical chemistry data sets.
arXiv Detail & Related papers (2020-08-27T15:30:57Z)
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module that boosts GNNs' ability to preserve graph structures.
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and a disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.