AI-driven Hypergraph Network of Organic Chemistry: Network Statistics
and Applications in Reaction Classification
- URL: http://arxiv.org/abs/2208.01647v2
- Date: Mon, 27 Mar 2023 15:43:43 GMT
- Title: AI-driven Hypergraph Network of Organic Chemistry: Network Statistics
and Applications in Reaction Classification
- Authors: Vipul Mann and Venkat Venkatasubramanian
- Abstract summary: We use a standard reactions dataset to construct a hypernetwork and report its statistics.
We also compute each statistic for an equivalent directed graph representation of reactions to draw parallels and highlight differences.
We conclude that the hypernetwork representation is flexible, preserves reaction context, and uncovers hidden insights.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The rapid discovery of new reactions and molecules in recent years has been
facilitated by advances in high-throughput screening, access to a much more
complex chemical design space, and the development of accurate molecular
modeling frameworks. A holistic study of the growing chemistry literature is
therefore required, one that focuses on understanding recent trends and
extrapolating them into possible future trajectories. To this end,
several network theory-based studies have been reported that use a directed
graph representation of chemical reactions. Here, we perform a study based on
representing chemical reactions as hypergraphs where the hyperedges represent
chemical reactions and nodes represent the participating molecules. We use a
standard reactions dataset to construct a hypernetwork and report its
statistics such as degree distributions, average path length, assortativity or
degree correlations, PageRank centrality, and graph-based clusters (or
communities). We also compute each statistic for an equivalent directed graph
representation of reactions to draw parallels and highlight differences between
the two. To demonstrate the applicability of the hypergraph reaction
representation to AI tasks, we generate dense hypergraph embeddings and use
them for the reaction classification problem. We conclude that the hypernetwork
representation is flexible, preserves reaction context, and uncovers hidden
insights that are otherwise not apparent in a traditional directed graph
representation of chemical reactions.
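
To make the construction concrete, below is a minimal sketch (not the paper's code) of the representation described in the abstract: hyperedges are reactions and nodes are the participating molecules, with the equivalent directed-graph projection built alongside for comparison. The toy reaction strings and all names are illustrative assumptions; the statistics use only networkx and the Python standard library.

```python
# Minimal sketch (not the authors' implementation): chemical reactions as hyperedges
# over molecule nodes, plus the equivalent directed-graph projection for comparison.
from collections import Counter

import networkx as nx

# Toy reactions as (reactants, products); the SMILES-like strings are placeholders.
reactions = [
    ({"CC=O", "O"}, {"CC(O)O"}),
    ({"CC(O)O", "[O]"}, {"CC(=O)O", "O"}),
    ({"CC(=O)O", "CO"}, {"COC(C)=O", "O"}),
]

# Hypergraph: one hyperedge per reaction, containing every participating molecule.
hyperedges = [reactants | products for reactants, products in reactions]

# Node degree in the hypergraph = number of reactions a molecule takes part in.
hyper_degree = Counter(mol for edge in hyperedges for mol in edge)
print("hypergraph node degrees:", dict(hyper_degree))

# Equivalent directed graph: an arc from each reactant to each product of a reaction.
# This projection loses reaction context (which molecules co-occurred in one reaction).
dg = nx.DiGraph()
for reactants, products in reactions:
    for r in reactants:
        for p in products:
            dg.add_edge(r, p)

print("directed-graph PageRank:", nx.pagerank(dg))
print("weakly connected components:", nx.number_weakly_connected_components(dg))
```

On real data, each hyperedge would come from a reaction record in the standard reactions dataset, and the same node set would feed both representations, so statistics such as degree distributions or PageRank can be compared directly; the hypergraph keeps all co-participants of a reaction in a single edge, which the directed projection discards.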
Related papers
- Learning Chemical Reaction Representation with Reactant-Product Alignment [50.28123475356234]
RAlign is a novel chemical reaction representation learning model for various organic reaction-related tasks.
By integrating atomic correspondence between reactants and products, our model discerns the molecular transformations that occur during the reaction.
We introduce a reaction-center-aware attention mechanism that enables the model to concentrate on key functional groups.
arXiv Detail & Related papers (2024-11-26T17:41:44Z)
- Graph-in-Graph (GiG): Learning interpretable latent graphs in non-Euclidean domain for biological and healthcare applications [52.65389473899139]
Graphs are a powerful tool for representing and analyzing unstructured, non-Euclidean data ubiquitous in the healthcare domain.
Recent works have shown that considering relationships between input data samples has a positive regularizing effect on the downstream task.
We propose Graph-in-Graph (GiG), a neural network architecture for protein classification and brain imaging applications.
arXiv Detail & Related papers (2022-04-01T10:01:37Z)
- Rxn Hypergraph: a Hypergraph Attention Model for Chemical Reaction Representation [70.97737157902947]
There is currently no universal and widely adopted method for robustly representing chemical reactions.
Here we exploit graph-based representations of molecular structures to develop and test a hypergraph attention neural network approach.
We evaluate this hypergraph representation in three experiments using three independent data sets of chemical reactions.
arXiv Detail & Related papers (2022-01-02T12:33:10Z)
- Learning Attributed Graph Representations with Communicative Message Passing Transformer [3.812358821429274]
We propose a Communicative Message Passing Transformer (CoMPT) neural network to improve the molecular graph representation.
Unlike the previous transformer-style GNNs that treat molecules as fully connected graphs, we introduce a message diffusion mechanism to leverage the graph connectivity inductive bias.
arXiv Detail & Related papers (2021-07-19T11:58:32Z)
- Graph Neural Networks for the Prediction of Substrate-Specific Organic Reaction Conditions [79.45090959869124]
We present a systematic investigation using graph neural networks (GNNs) to model organic chemical reactions.
We evaluate seven different GNN architectures for classification tasks pertaining to the identification of experimental reagents and conditions.
arXiv Detail & Related papers (2020-07-08T17:21:00Z)
- ASGN: An Active Semi-supervised Graph Neural Network for Molecular Property Prediction [61.33144688400446]
We propose a novel framework called Active Semi-supervised Graph Neural Network (ASGN) by incorporating both labeled and unlabeled molecules.
In the teacher model, we propose a novel semi-supervised learning method to learn a general representation that jointly exploits information from molecular structure and molecular distribution.
Finally, we propose a novel active learning strategy based on molecular diversity to select informative data throughout the framework's learning process.
arXiv Detail & Related papers (2020-07-07T04:22:39Z)
- Molecule Edit Graph Attention Network: Modeling Chemical Reactions as Sequences of Graph Edits [0.0]
We present Molecule Edit Graph Attention Network (MEGAN), an end-to-end encoder-decoder neural model.
MEGAN is inspired by models that express a chemical reaction as a sequence of graph edits, akin to the arrow pushing formalism.
We extend this model to retrosynthesis prediction (predicting substrates given the product of a chemical reaction) and scale it up to large datasets.
arXiv Detail & Related papers (2020-06-27T18:50:24Z)
- Retrosynthesis Prediction with Conditional Graph Logic Network [118.70437805407728]
Computer-aided retrosynthesis is finding renewed interest from both the chemistry and computer science communities.
We propose a new approach to this task using the Conditional Graph Logic Network, a conditional graphical model built upon graph neural networks.
arXiv Detail & Related papers (2020-01-06T05:36:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.