Non-autoregressive electron flow generation for reaction prediction
- URL: http://arxiv.org/abs/2012.12124v2
- Date: Fri, 5 Feb 2021 12:03:15 GMT
- Title: Non-autoregressive electron flow generation for reaction prediction
- Authors: Hangrui Bi, Hengyi Wang, Chence Shi, Jian Tang
- Abstract summary: We devise a novel decoder that avoids such sequential generation and predicts the reaction in a non-autoregressive manner.
Inspired by physical-chemistry insights, we represent edge edits in a molecule graph as electron flows, which can then be predicted in parallel.
Our model achieves an order of magnitude lower inference latency while attaining state-of-the-art top-1 accuracy and comparable performance on top-K sampling.
- Score: 15.98143959075733
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Reaction prediction is a fundamental problem in computational chemistry.
Existing approaches typically generate a chemical reaction by sampling tokens
or graph edits sequentially, conditioning on previously generated outputs.
These autoregressive generation methods impose an arbitrary ordering on the
outputs and prevent parallel decoding during inference. We devise a novel
decoder that avoids such sequential generation and predicts the reaction in a
non-autoregressive manner. Inspired by physical-chemistry insights, we
represent edge edits in a molecule graph as electron flows, which can then be
predicted in parallel. To capture the uncertainty of reactions, we introduce
latent variables to generate multi-modal outputs. Following previous works, we
evaluate our model on the USPTO-MIT dataset. Our model achieves an order of
magnitude lower inference latency while attaining state-of-the-art top-1
accuracy and comparable performance on top-K sampling.
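The latency gap described in the abstract comes from decoding structure: an autoregressive decoder must pick edge edits one at a time, conditioning each step on earlier choices, while a non-autoregressive decoder scores all candidate edits in a single pass and selects them in parallel. The toy sketch below illustrates only this contrast; it is not the paper's architecture, and all names, scores, and the thresholding rule are hypothetical.

```python
# Hedged sketch: autoregressive vs. non-autoregressive selection of bond
# (edge) edits in a molecule graph. Scores stand in for a trained model's
# per-edit probabilities; the real model predicts electron flows.

def autoregressive_decode(edit_scores, steps):
    """Pick edits one at a time, each step conditioned on remaining choices.

    Runtime grows with the number of edits: one sequential step per edit.
    """
    chosen = []
    remaining = dict(edit_scores)
    for _ in range(steps):
        if not remaining:
            break
        best = max(remaining, key=remaining.get)  # sequential bottleneck
        chosen.append(best)
        del remaining[best]
    return chosen

def non_autoregressive_decode(edit_scores, threshold=0.5):
    """Score every candidate edge edit independently; select all at once."""
    return [edit for edit, score in edit_scores.items() if score > threshold]

# Hypothetical scores for candidate bond edits, keyed by atom-index pairs.
scores = {(0, 1): 0.9, (1, 2): 0.2, (2, 3): 0.7, (3, 4): 0.1}

sequential = autoregressive_decode(scores, steps=2)   # [(0, 1), (2, 3)]
parallel = non_autoregressive_decode(scores)          # same set, one pass
```

Both decoders return the same edits here; the point is that the parallel version involves no data-dependent loop over previously generated outputs, which is what enables batched one-shot inference.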
Related papers
- Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering [55.15192437680943]
Generative models lack rigorous statistical guarantees for their outputs.
We propose a sequential conformal prediction method producing prediction sets that satisfy a rigorous statistical guarantee.
This guarantee states that with high probability, the prediction sets contain at least one admissible (or valid) example.
arXiv Detail & Related papers (2024-10-02T15:26:52Z) - Accelerating the inference of string generation-based chemical reaction models for industrial applications [25.069344340760715]
We present a method to accelerate inference in autoregressive SMILES generators through speculative decoding.
We achieve over 3X faster inference in reaction prediction and single-step retrosynthesis, with no loss in accuracy.
arXiv Detail & Related papers (2024-07-12T20:55:59Z) - Beyond the Typical: Modeling Rare Plausible Patterns in Chemical Reactions by Leveraging Sequential Mixture-of-Experts [42.9784548283531]
Generative models like Transformer and VAE have typically been employed to predict the reaction product.
We propose organizing the mapping space between reactants and electron redistribution patterns in a divide-and-conquer manner.
arXiv Detail & Related papers (2023-10-07T03:18:26Z) - Doubly Stochastic Graph-based Non-autoregressive Reaction Prediction [59.41636061300571]
We propose a new framework that combines two doubly stochastic self-attention mappings to obtain electron redistribution predictions.
We show that our approach consistently improves the predictive performance of non-autoregressive models.
arXiv Detail & Related papers (2023-06-05T14:15:39Z) - MARS: A Motif-based Autoregressive Model for Retrosynthesis Prediction [54.75583184356392]
We propose a novel end-to-end graph generation model for retrosynthesis prediction.
It sequentially identifies the reaction center, generates the synthons, and adds motifs to the synthons to generate reactants.
Experiments on a benchmark dataset show that the proposed model significantly outperforms previous state-of-the-art algorithms.
arXiv Detail & Related papers (2022-09-27T06:29:35Z) - Non-Autoregressive Electron Redistribution Modeling for Reaction Prediction [26.007965383304864]
We devise a non-autoregressive learning paradigm that predicts the reaction in one shot.
We formulate a reaction as an arbitrary electron flow and predict it with a novel multi-pointer decoding network.
Experiments on the USPTO-MIT dataset show that our approach has established a new state-of-the-art top-1 accuracy.
arXiv Detail & Related papers (2021-06-08T16:39:08Z) - RetroXpert: Decompose Retrosynthesis Prediction like a Chemist [60.463900712314754]
We devise a novel template-free algorithm for automatic retrosynthetic expansion.
Our method disassembles retrosynthesis into two steps.
While outperforming the state-of-the-art baselines, our model also provides chemically reasonable interpretation.
arXiv Detail & Related papers (2020-11-04T04:35:34Z) - An EM Approach to Non-autoregressive Conditional Sequence Generation [49.11858479436565]
Autoregressive (AR) models have been the dominant approach to conditional sequence generation.
Non-autoregressive (NAR) models have recently been proposed to reduce latency by generating all output tokens in parallel.
This paper proposes a new approach that jointly optimizes both AR and NAR models in a unified Expectation-Maximization framework.
arXiv Detail & Related papers (2020-06-29T20:58:57Z) - Retrosynthesis Prediction with Conditional Graph Logic Network [118.70437805407728]
Computer-aided retrosynthesis is finding renewed interest from both chemistry and computer science communities.
We propose a new approach to this task using the Conditional Graph Logic Network, a conditional graphical model built upon graph neural networks.
arXiv Detail & Related papers (2020-01-06T05:36:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.