A Transformer Model for Predicting Chemical Reaction Products from Generic Templates
- URL: http://arxiv.org/abs/2503.05810v2
- Date: Tue, 11 Mar 2025 08:22:15 GMT
- Title: A Transformer Model for Predicting Chemical Reaction Products from Generic Templates
- Authors: Derin Ozer, Sylvain Lamprier, Thomas Cauchy, Nicolas Gutowski, Benoit Da Mota
- Abstract summary: This work proposes the Broad Reaction Set (BRS), a dataset featuring 20 generic reaction templates. ProPreT5, a T5 model tailored to chemistry, is introduced to strike a balance between rigid templates and template-free methods. ProPreT5 demonstrates its capability to generate accurate, valid, and realistic reaction products.
- Score: 9.242763440219626
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The accurate prediction of chemical reaction outcomes is a major challenge in computational chemistry. Current models rely heavily on either highly specific reaction templates or template-free methods, both of which present limitations. To address these limitations, this work proposes the Broad Reaction Set (BRS), a dataset featuring 20 generic reaction templates that allow for the efficient exploration of the chemical space. Additionally, ProPreT5 is introduced, a T5 model tailored to chemistry that achieves a balance between rigid templates and template-free methods. ProPreT5 demonstrates its capability to generate accurate, valid, and realistic reaction products, making it a promising solution that goes beyond the current state-of-the-art on the complex reaction product prediction task.
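As a point of reference for how the setup described in the abstract can be framed in code, here is a minimal sketch that treats product prediction as SMILES-to-SMILES generation with a T5-style model from Hugging Face `transformers`. The checkpoint name, the `template_<id>:` prefix, and the input formatting are illustrative assumptions, not ProPreT5's actual interface or released weights.

```python
# Hedged sketch only: product prediction as a seq2seq task over SMILES strings.
# "t5-small" is a placeholder checkpoint; how BRS templates are encoded for
# ProPreT5 is an assumption made purely for illustration.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "t5-small"  # placeholder; a chemistry-adapted checkpoint would be used in practice
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def predict_products(reactants_smiles: str, template_id: int, num_candidates: int = 5):
    """Generate candidate product SMILES for the given reactants and generic template."""
    prompt = f"template_{template_id}: {reactants_smiles}"  # hypothetical template encoding
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        num_beams=num_candidates,
        num_return_sequences=num_candidates,
        max_new_tokens=128,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# Example call; an untrained base checkpoint will not return chemically meaningful output.
print(predict_products("CC(=O)O.OCC", template_id=3))
```

Returning several beam-search candidates mirrors the top-k exact-match evaluation commonly used for reaction product prediction.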
Related papers
- Learning Chemical Reaction Representation with Reactant-Product Alignment [50.28123475356234]
RAlign is a novel chemical reaction representation learning model for various organic reaction-related tasks. By integrating atomic correspondence between reactants and products, our model discerns the molecular transformations that occur during the reaction. We introduce a reaction-center-aware attention mechanism that enables the model to concentrate on key functional groups.
arXiv Detail & Related papers (2024-11-26T17:41:44Z) - BatGPT-Chem: A Foundation Large Model For Retrosynthesis Prediction [65.93303145891628]
BatGPT-Chem is a large language model with 15 billion parameters, tailored for enhanced retrosynthesis prediction.
Our model captures a broad spectrum of chemical knowledge, enabling precise prediction of reaction conditions.
This development empowers chemists to adeptly address novel compounds, potentially expediting the innovation cycle in drug manufacturing and materials science.
arXiv Detail & Related papers (2024-08-19T05:17:40Z) - UAlign: Pushing the Limit of Template-free Retrosynthesis Prediction with Unsupervised SMILES Alignment [51.49238426241974]
This paper introduces UAlign, a template-free graph-to-sequence pipeline for retrosynthesis prediction.
By combining graph neural networks and Transformers, our method can more effectively leverage the inherent graph structure of molecules.
arXiv Detail & Related papers (2024-03-25T03:23:03Z) - Assessing the Extrapolation Capability of Template-Free Retrosynthesis Models [0.7770029179741429]
We empirically assess the extrapolation capability of state-of-the-art template-free models by meticulously assembling an extensive set of out-of-distribution (OOD) reactions.
Our findings demonstrate that while template-free models exhibit potential in predicting synthesis with novel rules, their top-10 exact-match accuracy in OOD reactions is strikingly modest.
arXiv Detail & Related papers (2024-02-29T00:48:17Z) - ReactionT5: a large-scale pre-trained model towards application of limited reaction data [4.206175795966693]
Transformer-based deep neural networks have revolutionized molecule-related prediction tasks by treating molecules as symbolic sequences.
We propose ReactionT5, a novel model that leverages pretraining on the Open Reaction Database (ORD), a publicly available large-scale resource.
We further fine-tune this model for yield prediction and product prediction tasks, demonstrating its impressive performance even with limited fine-tuning data.
arXiv Detail & Related papers (2023-11-12T02:25:00Z) - Molecule-Edit Templates for Efficient and Accurate Retrosynthesis Prediction [0.16070833439280313]
We introduce METRO, a machine-learning model that predicts reactions using minimal templates.
We achieve state-of-the-art results on standard benchmarks.
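Whether templates are minimal, generic (as in BRS), or composed, applying one is mechanically the same operation. The hedged RDKit sketch below shows what that looks like for a textbook amide-coupling template; the SMARTS is chosen for clarity and is not claimed to come from any of the datasets mentioned here.

```python
# Hedged illustration (not code from any of the listed papers): applying a
# reaction template with RDKit's SMARTS-based reaction engine.
from rdkit import Chem
from rdkit.Chem import AllChem

# Forward template: carboxylic acid + amine -> amide. The unmapped hydroxyl
# oxygen on the reactant side does not appear in the product, which is how the
# template encodes the loss of water.
template = AllChem.ReactionFromSmarts(
    "[C:1](=[O:2])[OH].[N:3]>>[C:1](=[O:2])[N:3]"
)

acid = Chem.MolFromSmiles("CC(=O)O")   # acetic acid
amine = Chem.MolFromSmiles("CN")       # methylamine

for products in template.RunReactants((acid, amine)):
    for mol in products:
        Chem.SanitizeMol(mol)          # fix valences after template application
        print(Chem.MolToSmiles(mol))   # CNC(C)=O (N-methylacetamide)
```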
arXiv Detail & Related papers (2023-10-11T09:00:02Z) - RetroComposer: Discovering Novel Reactions by Composing Templates for Retrosynthesis Prediction [63.14937611038264]
We propose an innovative retrosynthesis prediction framework that can compose novel templates beyond training templates.
Experimental results show that our method can produce novel templates for 328 test reactions in the USPTO-50K dataset.
arXiv Detail & Related papers (2021-12-20T05:57:07Z) - Non-Autoregressive Electron Redistribution Modeling for Reaction Prediction [26.007965383304864]
We devise a non-autoregressive learning paradigm that predicts reactions in one shot.
We formulate a reaction as an arbitrary electron flow and predict it with a novel multi-pointer decoding network.
Experiments on the USPTO-MIT dataset show that our approach has established a new state-of-the-art top-1 accuracy.
arXiv Detail & Related papers (2021-06-08T16:39:08Z) - RetCL: A Selection-based Approach for Retrosynthesis via Contrastive Learning [107.64562550844146]
Retrosynthesis is an emerging application area for deep learning.
We propose a new approach that reformulates retrosynthesis as the selection of reactants from a candidate set of commercially available molecules.
For learning the score functions, we also propose a novel contrastive training scheme with hard negative mining.
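As a rough illustration of what a contrastive selection objective with hard negative mining can look like, here is a hedged PyTorch sketch; the encoder, the candidate pool, and the mining strategy are assumptions rather than RetCL's actual implementation.

```python
# Illustrative sketch only: score a product embedding against one positive
# reactant and K mined hard negatives with an InfoNCE-style loss.
import torch
import torch.nn.functional as F

def contrastive_loss(product_emb, reactant_embs, temperature=0.1):
    """product_emb: (B, D). reactant_embs: (B, 1 + K, D), positive at index 0."""
    product_emb = F.normalize(product_emb, dim=-1)
    reactant_embs = F.normalize(reactant_embs, dim=-1)
    # Cosine similarity of each product against its own candidate set -> (B, 1 + K)
    logits = torch.einsum("bd,bkd->bk", product_emb, reactant_embs) / temperature
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)

def mine_hard_negatives(product_emb, pool_emb, k=16):
    """Indices of the k pool molecules most similar to each product.
    The pool is assumed not to contain the true reactants."""
    sims = F.normalize(product_emb, dim=-1) @ F.normalize(pool_emb, dim=-1).T  # (B, N)
    return sims.topk(k, dim=-1).indices

# Toy usage with random embeddings, just to show the expected shapes.
B, K, D, N = 8, 16, 128, 1000
products = torch.randn(B, D)
pool = torch.randn(N, D)
neg_idx = mine_hard_negatives(products, pool, k=K)          # (B, K)
positives = torch.randn(B, 1, D)                            # embeddings of the true reactants
candidates = torch.cat([positives, pool[neg_idx]], dim=1)   # (B, 1 + K, D)
print(contrastive_loss(products, candidates).item())
```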
arXiv Detail & Related papers (2021-05-03T12:47:57Z) - Learning Graph Models for Retrosynthesis Prediction [90.15523831087269]
Retrosynthesis prediction is a fundamental problem in organic synthesis.
This paper introduces a graph-based approach that capitalizes on the idea that the graph topology of precursor molecules is largely unaltered during a chemical reaction.
Our model achieves a top-1 accuracy of 53.7%, outperforming previous template-free and semi-template-based methods.
arXiv Detail & Related papers (2020-06-12T09:40:42Z) - Retrosynthesis Prediction with Conditional Graph Logic Network [118.70437805407728]
Computer-aided retrosynthesis is finding renewed interest from both the chemistry and computer science communities.
We propose a new approach to this task using the Conditional Graph Logic Network, a conditional graphical model built upon graph neural networks.
arXiv Detail & Related papers (2020-01-06T05:36:57Z)