Learning Potential Energy Surfaces of Hydrogen Atom Transfer Reactions in Peptides
- URL: http://arxiv.org/abs/2508.00578v1
- Date: Fri, 01 Aug 2025 12:21:49 GMT
- Title: Learning Potential Energy Surfaces of Hydrogen Atom Transfer Reactions in Peptides
- Authors: Marlen Neubert, Patrick Reiser, Frauke Gräter, Pascal Friederich
- Abstract summary: Hydrogen atom transfer (HAT) reactions are essential in many biological processes, such as radical migration in damaged proteins. Machine-learned potentials offer an alternative, able to learn potential energy surfaces with near-quantum accuracy. Here, we systematically generate HAT configurations in peptides to build large datasets using semiempirical methods and DFT.
- Score: 0.562479170374811
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hydrogen atom transfer (HAT) reactions are essential in many biological processes, such as radical migration in damaged proteins, but their mechanistic pathways remain incompletely understood. Simulating HAT is challenging due to the need for quantum chemical accuracy at biologically relevant scales; thus, neither classical force fields nor DFT-based molecular dynamics are applicable. Machine-learned potentials offer an alternative, able to learn potential energy surfaces (PESs) with near-quantum accuracy. However, training these models to generalize across diverse HAT configurations, especially at radical positions in proteins, requires tailored data generation and careful model selection. Here, we systematically generate HAT configurations in peptides to build large datasets using semiempirical methods and DFT. We benchmark three graph neural network architectures (SchNet, Allegro, and MACE) on their ability to learn HAT PESs and indirectly predict reaction barriers from energy predictions. MACE consistently outperforms the others in energy, force, and barrier prediction, achieving a mean absolute error of 1.13 kcal/mol on out-of-distribution DFT barrier predictions. This accuracy enables integration of ML potentials into large-scale collagen simulations to compute reaction rates from predicted barriers, advancing mechanistic understanding of HAT and radical migration in peptides. We analyze scaling laws, model transferability, and cost-performance trade-offs, and outline strategies for improvement by combining ML potentials with transition state search algorithms and active learning. Our approach is generalizable to other biomolecular systems, enabling quantum-accurate simulations of chemical reactivity in complex environments.
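To make the rate-from-barrier step of the abstract concrete, the sketch below (not the authors' code) converts ML-predicted HAT barriers into transition-state-theory rate estimates via the Eyring equation and reports the mean absolute error against DFT reference barriers. The barrier values, the 300 K temperature, and the use of the bare electronic barrier in place of an activation free energy are illustrative assumptions.

```python
import numpy as np

# Physical constants (SI unless noted).
KB = 1.380649e-23        # Boltzmann constant, J/K
H = 6.62607015e-34       # Planck constant, J*s
R_KCAL = 1.987204259e-3  # gas constant, kcal/(mol*K)

def eyring_rate(barrier_kcal_mol: float, temperature_k: float = 300.0) -> float:
    """Transition-state-theory (Eyring) rate in 1/s from a barrier in kcal/mol.

    The ML-predicted electronic barrier is used in place of the activation
    free energy, i.e. entropic and tunneling corrections are neglected.
    """
    prefactor = KB * temperature_k / H  # ~6.2e12 1/s at 300 K
    return prefactor * np.exp(-barrier_kcal_mol / (R_KCAL * temperature_k))

def mean_absolute_error(predicted, reference) -> float:
    """MAE between ML-predicted and reference (e.g. DFT) barriers, kcal/mol."""
    predicted, reference = np.asarray(predicted), np.asarray(reference)
    return float(np.mean(np.abs(predicted - reference)))

# Illustrative numbers only (not data from the paper).
ml_barriers = [18.2, 21.5, 15.9]    # kcal/mol
dft_barriers = [17.4, 22.8, 16.6]   # kcal/mol

print(f"barrier MAE: {mean_absolute_error(ml_barriers, dft_barriers):.2f} kcal/mol")
for barrier in ml_barriers:
    print(f"barrier {barrier:5.1f} kcal/mol -> k = {eyring_rate(barrier):.3e} 1/s")
```

At 300 K the Eyring prefactor k_BT/h is roughly 6.2e12 1/s, and every additional ~1.4 kcal/mol of barrier lowers the rate by about an order of magnitude, which is why a barrier MAE near 1 kcal/mol is significant for rate predictions.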
Related papers
- Coupled reaction and diffusion governing interface evolution in solid-state batteries [4.707991478885645]
We conduct large-scale explicit reactive simulations with quantum accuracy for a symmetric battery cell. We explain experimental observations of the SEI formations and elucidate the Li creep mechanisms critical to dendrite initiation. Our approach is to create a digital twin from first principles, without adjustable parameters fitted to experiment.
arXiv Detail & Related papers (2025-06-12T17:49:05Z)
- Ab-initio simulation of excited-state potential energy surfaces with transferable deep quantum Monte Carlo [4.437335677401287]
We introduce a novel method for the geometrically transferable optimization of neural network wave functions. Our method enables the efficient prediction of ground- and excited-state PESs and their intersections at the highest accuracy. We validate our approach on three challenging excited-state PESs, including ethylene, the carbon dimer, and the methylenimmonium cation.
arXiv Detail & Related papers (2025-03-25T17:12:29Z)
- Predicting ionic conductivity in solids from the machine-learned potential energy landscape [68.25662704255433]
We propose an approach for the quick and reliable screening of ionic conductors through the analysis of a universal interatomic potential. Eight out of the ten highest-ranked materials are confirmed to be superionic at room temperature in first-principles calculations. Our method achieves a speed-up factor of approximately 50 compared to molecular dynamics driven by a machine-learning potential, and is at least 3,000 times faster than first-principles molecular dynamics.
arXiv Detail & Related papers (2024-11-11T09:01:36Z)
- Molecule Design by Latent Prompt Transformer [76.2112075557233]
This work explores the challenging problem of molecule design by framing it as a conditional generative modeling task.
We propose a novel generative model comprising three components: (1) a latent vector with a learnable prior distribution; (2) a molecule generation model based on a causal Transformer, which uses the latent vector as a prompt; and (3) a property prediction model that predicts a molecule's target properties and/or constraint values using the latent prompt.
arXiv Detail & Related papers (2024-02-27T03:33:23Z)
- Denoise Pretraining on Nonequilibrium Molecules for Accurate and Transferable Neural Potentials [8.048439531116367]
We propose denoise pretraining on nonequilibrium molecular conformations to achieve more accurate and transferable GNN potential predictions.
Our models pretrained on small molecules demonstrate remarkable transferability, improving performance when fine-tuned on diverse molecular systems.
arXiv Detail & Related papers (2023-03-03T21:15:22Z)
- Conditional Generative Models for Simulation of EMG During Naturalistic Movements [45.698312905115955]
We present a conditional generative neural network trained adversarially to generate motor unit activation potential waveforms.
We demonstrate the ability of such a model to predictively interpolate between a much smaller number of numerical-model outputs with high accuracy.
arXiv Detail & Related papers (2022-11-03T14:49:02Z)
- Transition1x -- a Dataset for Building Generalizable Reactive Machine Learning Potentials [7.171984408392421]
We present the dataset Transition1x containing 9.6 million Density Functional Theory (DFT) calculations.
We show that ML models cannot learn features in transition-state regions solely by training on hitherto popular benchmark datasets.
arXiv Detail & Related papers (2022-07-25T07:30:14Z)
- NeuralNEB -- Neural Networks can find Reaction Paths Fast [7.7365628406567675]
Quantum mechanical methods like Density Functional Theory (DFT) are used with great success alongside efficient search algorithms for studying kinetics of reactive systems.
Machine Learning (ML) models have turned out to be excellent emulators of small molecule DFT calculations and could possibly replace DFT in such tasks.
In this paper we train state-of-the-art equivariant Graph Neural Network (GNN)-based models on around 10,000 elementary reactions from the Transition1x dataset. (A minimal sketch of driving such a path search with an ML calculator appears after this list.)
arXiv Detail & Related papers (2022-07-20T15:29:45Z)
- Exploring accurate potential energy surfaces via integrating variational quantum eigensolver with machine learning [8.19234058079321]
We show in this work that variational quantum algorithms can be integrated with machine learning (ML) techniques.
We encode the molecular geometry into a deep neural network (DNN) that represents the parameters of the variational quantum eigensolver (VQE).
arXiv Detail & Related papers (2022-06-08T01:43:56Z)
- Accurate Machine Learned Quantum-Mechanical Force Fields for Biomolecular Simulations [51.68332623405432]
Molecular dynamics (MD) simulations allow atomistic insights into chemical and biological processes.
Recently, machine-learned force fields (MLFFs) have emerged as an alternative means of running MD simulations.
This work proposes a general approach to constructing accurate MLFFs for large-scale molecular simulations.
arXiv Detail & Related papers (2022-05-17T13:08:28Z)
- Improving Molecular Representation Learning with Metric Learning-enhanced Optimal Transport [49.237577649802034]
We develop a novel optimal-transport-based algorithm, termed MROT, to enhance the generalization capability of molecular representation models for molecular regression problems.
MROT significantly outperforms state-of-the-art models, showing promising potential in accelerating the discovery of new substances.
arXiv Detail & Related papers (2022-02-13T04:56:18Z)
- BIGDML: Towards Exact Machine Learning Force Fields for Materials [55.944221055171276]
Machine-learning force fields (MLFF) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.
Here, we introduce the Bravais-Inspired Gradient-Domain Machine Learning approach and demonstrate its ability to construct reliable force fields using a training set with just 10-200 atoms.
arXiv Detail & Related papers (2021-06-08T10:14:57Z)
- Benchmarking adaptive variational quantum eigensolvers [63.277656713454284]
We benchmark the accuracy of VQE and ADAPT-VQE to calculate the electronic ground states and potential energy curves.
We find both methods provide good estimates of the energy and ground state.
Gradient-based optimization is more economical and delivers superior performance compared with analogous simulations carried out with gradient-free optimizers.
arXiv Detail & Related papers (2020-11-02T19:52:04Z)
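The NeuralNEB entry above describes replacing DFT with GNN potentials inside reaction-path searches. The following is a minimal sketch of that workflow using ASE's nudged-elastic-band implementation; the endpoint file names are hypothetical, and the EMT calculator is only a runnable stand-in for the ASE calculator of a trained ML potential.

```python
from ase.calculators.emt import EMT  # stand-in; swap in your trained ML potential's ASE calculator
from ase.io import read
from ase.neb import NEB, NEBTools
from ase.optimize import BFGS

# Hypothetical endpoint geometries of one elementary reaction.
initial = read("reactant.xyz")
final = read("product.xyz")

# Build a band of images between the endpoints and interpolate it.
n_images = 7
images = [initial] + [initial.copy() for _ in range(n_images - 2)] + [final]
for image in images:
    image.calc = EMT()  # placeholder for an ML calculator (e.g. a trained GNN potential)

neb = NEB(images, climb=True)  # climbing-image NEB for a better saddle-point estimate
neb.interpolate()
BFGS(neb).run(fmax=0.05)       # relax the band using the cheap surrogate forces

barrier, reaction_energy = NEBTools(images).get_barrier()
print(f"Estimated forward barrier: {barrier:.3f} eV (dE = {reaction_energy:.3f} eV)")
```

The idea, as in NeuralNEB, is that the surrogate-driven path search is cheap enough to run at scale, with DFT reserved for spot-checking or refining the resulting saddle points.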
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.