Efficient Transition State Searches by Freezing String Method with Graph Neural Network Potentials
- URL: http://arxiv.org/abs/2501.06159v1
- Date: Fri, 10 Jan 2025 18:32:05 GMT
- Title: Efficient Transition State Searches by Freezing String Method with Graph Neural Network Potentials
- Authors: Jonah Marks, Joseph Gomes
- Abstract summary: We develop and fine-tune a graph neural network potential energy function suitable for describing organic chemical reactions. We successfully refine guess structures and locate a transition state in each test system considered and reduce the average number of ab-initio calculations by 47%.
- Score: 0.34530027457862006
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transition states are a critical bottleneck in chemical transformations. Significant efforts have been made to develop algorithms that efficiently locate transition states on potential energy surfaces. However, the computational cost of ab-initio potential energy surface evaluation limits the size of chemical systems that can be routinely studied. In this work, we develop and fine-tune a graph neural network potential energy function suitable for describing organic chemical reactions and use it to rapidly identify transition state guess structures. We successfully refine guess structures and locate a transition state in each test system considered and reduce the average number of ab-initio calculations by 47% through use of the graph neural network potential energy function. Our results show that modern machine learning models have reached levels of reliability whereby they can be used to accelerate routine computational chemistry tasks.
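As a rough illustration of the workflow the abstract describes, the sketch below runs a freezing-string-style search on a toy 2D surface, with a simple analytic function standing in for the fine-tuned GNN potential; the function names, node counts, and relaxation scheme are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a freezing-string-style transition-state guess search
# driven by a cheap surrogate potential. The toy 2D double-well below stands
# in for a fine-tuned GNN potential energy function.
import numpy as np

def gnn_energy(x):
    # Toy surface with a saddle point near the origin (surrogate stand-in).
    return (x[0]**2 - 1.0)**2 + 2.0 * x[1]**2

def numerical_gradient(f, x, h=1e-5):
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def freezing_string(reactant, product, n_nodes=11, relax_steps=20, lr=0.01):
    # Grow the string inward from both ends; each new frontier node is
    # interpolated, then relaxed perpendicular to the path for a few steps
    # (a crude stand-in for the perpendicular optimization in the real FSM).
    left, right = [np.array(reactant, float)], [np.array(product, float)]
    while len(left) + len(right) < n_nodes:
        for side, other in ((left, right), (right, left)):
            frac = 1.0 / (n_nodes - len(left) - len(right) + 1)
            node = side[-1] + frac * (other[-1] - side[-1])
            tangent = other[-1] - side[-1]
            tangent /= np.linalg.norm(tangent)
            for _ in range(relax_steps):
                g = numerical_gradient(gnn_energy, node)
                g_perp = g - np.dot(g, tangent) * tangent
                node = node - lr * g_perp
            side.append(node)
    path = left + right[::-1]
    # The highest-energy node is the transition-state guess that would be
    # handed to ab-initio refinement (e.g. an eigenvector-following search).
    return max(path, key=gnn_energy)

ts_guess = freezing_string(reactant=[-1.0, 0.0], product=[1.0, 0.0])
print("TS guess:", ts_guess, "energy:", gnn_energy(ts_guess))
```

On this toy surface the string converges on the saddle near the origin; the point of the surrogate is that all of these inner-loop gradient evaluations are cheap, so ab-initio calls are spent only on the final refinement.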
Related papers
- Accurate Ab-initio Neural-network Solutions to Large-Scale Electronic Structure Problems [52.19558333652367]
We present finite-range embeddings (FiRE) for accurate large-scale ab-initio electronic structure calculations.
FiRE reduces the complexity of neural-network variational Monte Carlo (NN-VMC) by $\sim n_{\text{el}}$, the number of electrons.
We validate our method's accuracy on various challenging systems, including biochemical compounds and organometallic compounds.
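As background on how a finite-range restriction cuts cost, here is a minimal sketch in which per-electron features are aggregated only over neighbors inside a cutoff; the feature map and cutoff envelope are hypothetical stand-ins, not the FiRE architecture itself.

```python
# Illustrative sketch: each electron's features come only from neighbors
# within a cutoff, so per-electron work stops growing with the total count.
import numpy as np

def finite_range_features(positions, cutoff=3.0):
    # positions: (n_el, 3) electron coordinates
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    mask = (dist < cutoff) & (dist > 0.0)            # finite-range neighborhood
    # A smooth envelope that vanishes at the cutoff keeps the ansatz continuous.
    envelope = np.where(mask, (1.0 - dist / cutoff) ** 2, 0.0)
    return (envelope[..., None] * diff).sum(axis=1)  # (n_el, 3)

rng = np.random.default_rng(0)
feats = finite_range_features(rng.normal(size=(16, 3)))
print(feats.shape)  # (16, 3)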
arXiv Detail & Related papers (2025-04-08T14:28:54Z) - Thermodynamic Bound on Energy and Negentropy Costs of Inference in Deep Neural Networks [0.0]
The fundamental thermodynamic bound is derived for the energy cost of inference in Deep Neural Networks (DNNs).
We show that the linear operations in DNNs can, in principle, be performed reversibly, whereas the non-linear activation functions impose an unavoidable energy cost.
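For context, the floor such a bound rests on is the standard Landauer limit (textbook background, not the paper's specific derivation):

```latex
% Background: the Landauer limit (not the paper's specific bound).
% Erasing \Delta I bits of information at temperature T dissipates at least
E_{\min} = k_B T \ln 2 \cdot \Delta I
% A linear layer is invertible in principle (nothing is erased), while a
% many-to-one non-linearity such as a threshold maps distinct inputs to the
% same output, erases information, and must pay this cost per bit erased.
```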
arXiv Detail & Related papers (2025-03-13T02:35:07Z) - Neural Network Emulator for Atmospheric Chemical ODE [6.84242299603086]
We propose a Neural Network Emulator for fast chemical concentration modeling.
To extract the hidden correlations between initial states and future time evolution, we propose ChemNNE.
Our approach achieves state-of-the-art performance in modeling accuracy and computational speed.
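A minimal sketch of the emulator pattern: a network maps the current concentrations plus a time step to the next state, replacing a stiff ODE solve. The residual MLP below is a generic stand-in; `ConcentrationEmulator` and its sizes are illustrative assumptions, not the ChemNNE architecture.

```python
# Generic neural emulator for chemical concentration time stepping.
import torch
import torch.nn as nn

class ConcentrationEmulator(nn.Module):
    def __init__(self, n_species: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_species + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, n_species),
        )

    def forward(self, conc, dt):
        # Predict the increment rather than the raw state: a residual
        # update is usually easier to learn for smooth time evolution.
        x = torch.cat([conc, dt], dim=-1)
        return conc + self.net(x)

model = ConcentrationEmulator(n_species=8)
c0 = torch.rand(4, 8)              # batch of initial concentration states
dt = torch.full((4, 1), 0.1)
c1 = model(c0, dt)                 # one emulated time step
print(c1.shape)                    # torch.Size([4, 8])
```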
arXiv Detail & Related papers (2024-08-03T17:43:10Z) - Neural Pfaffians: Solving Many Many-Electron Schrödinger Equations [58.130170155147205]
Neural wave functions have achieved unprecedented accuracy in approximating the ground state of many-electron systems, though at a high computational cost.
Recent works proposed amortizing the cost by learning generalized wave functions across different structures and compounds instead of solving each problem independently.
This work tackles the problem by defining overparametrized, fully learnable neural wave functions suitable for generalization across molecules.
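Illustrative sketch of a Pfaffian-based antisymmetric ansatz: an antisymmetric pairing matrix is built from electron coordinates, and its Pfaffian gives a wave-function amplitude that changes sign under electron exchange. The pairing function below is a toy choice; the learnable, molecule-generalizing parts of the actual Neural Pfaffian model are not reproduced here.

```python
import numpy as np

def pfaffian(A):
    # Recursive first-row expansion; adequate for small even-sized matrices.
    n = A.shape[0]
    if n == 0:
        return 1.0
    total = 0.0
    for j in range(1, n):
        rest = [k for k in range(1, n) if k != j]
        total += (-1) ** (j + 1) * A[0, j] * pfaffian(A[np.ix_(rest, rest)])
    return total

def amplitude(positions):
    # Toy antisymmetric pairing: phi(r_i, r_j) = -phi(r_j, r_i) guarantees an
    # antisymmetric matrix, hence a fermionic (sign-flipping) amplitude.
    r = positions
    pair = np.exp(-np.linalg.norm(r[:, None] - r[None, :], axis=-1))
    direction = r[:, None, 0] - r[None, :, 0]     # antisymmetric factor
    return pfaffian(pair * direction)

rng = np.random.default_rng(1)
pos = rng.normal(size=(4, 3))                     # 4 electrons
swapped = pos.copy(); swapped[[0, 1]] = swapped[[1, 0]]
print(amplitude(pos), amplitude(swapped))         # equal magnitude, opposite sign
```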
arXiv Detail & Related papers (2024-05-23T16:30:51Z) - Accurate Computation of Quantum Excited States with Neural Networks [4.99320937849508]
We present a variational Monte Carlo algorithm for estimating the lowest excited states of a quantum system.
Our method is the first deep learning approach to achieve accurate vertical excitation energies on benzene-scale molecules.
We expect this technique will be of great interest for applications to atomic, nuclear and condensed matter physics.
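The summary does not spell out the algorithm, so as generic context only, here is a toy penalty-based variational scheme for excited states on a small matrix Hamiltonian; it is not this paper's method.

```python
# Generic illustration: minimize the Rayleigh quotient of a trial state
# while penalizing overlap with the states found so far.
import numpy as np

def rayleigh(H, v):
    return v @ H @ v / (v @ v)

def variational_state(H, lower_states, steps=2000, lr=0.05, penalty=10.0):
    rng = np.random.default_rng(0)
    v = rng.normal(size=H.shape[0])
    for _ in range(steps):
        v = v / np.linalg.norm(v)
        grad = 2 * (H @ v - rayleigh(H, v) * v)   # Rayleigh-quotient gradient
        for u in lower_states:                    # push away from lower states
            grad += 2 * penalty * (u @ v) * u
        v = v - lr * grad
    return v / np.linalg.norm(v)

H = np.array([[0.0, 0.2, 0.0],
              [0.2, 1.0, 0.3],
              [0.0, 0.3, 2.0]])
g = variational_state(H, [])
e = variational_state(H, [g])                     # first excited state
print(rayleigh(H, g), rayleigh(H, e))             # approx. the two lowest eigenvalues
```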
arXiv Detail & Related papers (2023-08-31T16:27:08Z) - Globally Optimal Training of Neural Networks with Threshold Activation Functions [63.03759813952481]
We study weight decay regularized training problems of deep neural networks with threshold activations.
We derive a simplified convex optimization formulation when the dataset can be shattered at a certain layer of the network.
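As background on why threshold networks resist gradient-based training (one motivation for a convex reformulation), a toy forward pass: the Heaviside activation is piecewise constant in the first-layer weights, so its gradient vanishes almost everywhere. This snippet is illustrative, not the paper's convex program.

```python
import numpy as np

def threshold_net(X, W1, w2):
    # One hidden layer of threshold (Heaviside) units, linear output head.
    H = (X @ W1 > 0).astype(float)
    return H @ w2

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
W1 = rng.normal(size=(3, 4))
w2 = rng.normal(size=4)
print(threshold_net(X, W1, w2))   # piecewise constant as a function of W1
```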
arXiv Detail & Related papers (2023-03-06T18:59:13Z) - Energy Transformer [64.22957136952725]
Our work combines aspects of three promising paradigms in machine learning, namely, attention mechanism, energy-based models, and associative memory.
We propose a novel architecture, called the Energy Transformer (or ET for short), that uses a sequence of attention layers that are purposely designed to minimize a specifically engineered energy function.
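A minimal sketch of the underlying idea, with a modern-Hopfield-style energy standing in for the paper's engineered energy function: each update step descends the energy gradient and reduces to an attention-like readout. The memory matrix, beta, and step size are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def energy(x, M, beta=2.0):
    # Modern-Hopfield-style energy: -1/beta * logsumexp(beta * M x) + ||x||^2 / 2.
    scores = beta * (M @ x)
    lse = (np.log(np.exp(scores - scores.max()).sum()) + scores.max()) / beta
    return -lse + 0.5 * (x @ x)

def update(x, M, beta=2.0, lr=0.5):
    # One descent step on the energy; the gradient is an attention readout.
    attn = softmax(beta * (M @ x))
    grad = x - M.T @ attn
    return x - lr * grad

M = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])   # stored patterns (rows)
x = np.array([0.8, 0.1])                               # noisy query token
for _ in range(20):
    x = update(x, M)
print(x, energy(x, M))   # x relaxes toward an attention-weighted pattern
```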
arXiv Detail & Related papers (2023-02-14T18:51:22Z) - Graph Neural Networks with Trainable Adjacency Matrices for Fault Diagnosis on Multivariate Sensor Data [69.25738064847175]
It is necessary to consider the behavior of the signal from each sensor separately, taking into account their correlations and hidden relationships with one another.
The graph nodes can be represented as data from the different sensors, and the edges can display the influence of these data on each other.
It was proposed to construct the graph during the training of the graph neural network, which makes it possible to train models on data where the dependencies between the sensors are not known in advance.
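One common way to realize this is sketched below, under assumptions: a sigmoid-gated dense adjacency parameter learned jointly with the rest of the model, which may differ from the paper's parametrization.

```python
import torch
import torch.nn as nn

class TrainableAdjacencyLayer(nn.Module):
    def __init__(self, n_sensors: int, in_dim: int, out_dim: int):
        super().__init__()
        self.adj_logits = nn.Parameter(torch.zeros(n_sensors, n_sensors))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        # x: (batch, n_sensors, in_dim)
        adj = torch.sigmoid(self.adj_logits)         # learned edge weights in (0, 1)
        adj = adj / adj.sum(dim=-1, keepdim=True)    # row-normalize for stable mixing
        return torch.relu(self.lin(adj @ x))         # aggregate neighbors, then transform

layer = TrainableAdjacencyLayer(n_sensors=10, in_dim=16, out_dim=32)
out = layer(torch.randn(4, 10, 16))
print(out.shape)  # torch.Size([4, 10, 32])
```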
arXiv Detail & Related papers (2022-10-20T11:03:21Z) - Learned Force Fields Are Ready For Ground State Catalyst Discovery [60.41853574951094]
We present evidence that learned density functional theory ("DFT") force fields are ready for ground state catalyst discovery.
The key finding is that relaxation using forces from a learned potential yields structures with similar or lower energy than those relaxed using the RPBE functional in over 50% of evaluated systems.
We show that a force field trained on a locally harmonic energy surface with the same minima as a target DFT energy is also able to find lower or similar energy structures in over 50% of cases.
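A minimal sketch of the relaxation workflow, with a toy Lennard-Jones potential standing in for the learned force field; real pipelines would use an optimizer such as BFGS or FIRE (e.g. via ASE) and re-verify the final candidates with DFT.

```python
import numpy as np

def lj_energy_forces(pos, eps=1.0, sigma=1.0):
    # Pairwise Lennard-Jones energy and analytic forces (surrogate stand-in).
    n = len(pos)
    energy, forces = 0.0, np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            r = np.linalg.norm(d)
            sr6 = (sigma / r) ** 6
            energy += 4 * eps * (sr6**2 - sr6)
            fmag = 24 * eps * (2 * sr6**2 - sr6) / r**2
            forces[i] += fmag * d
            forces[j] -= fmag * d
    return energy, forces

def relax(pos, steps=500, lr=0.01, fmax=1e-3):
    # Plain gradient descent on the surrogate surface until forces are small.
    for _ in range(steps):
        e, f = lj_energy_forces(pos)
        if np.abs(f).max() < fmax:
            break
        pos = pos + lr * f
    return pos, e

rng = np.random.default_rng(0)
pos0 = rng.normal(scale=0.05, size=(4, 3)) + np.arange(4)[:, None] * np.array([1.2, 0.0, 0.0])
relaxed, e = relax(pos0)
print("relaxed energy:", e)
```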
arXiv Detail & Related papers (2022-09-26T07:16:43Z) - Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
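A minimal sketch of the descriptor-plus-network pattern, with simple radial Gaussian features standing in for the actual Gaussian-moment descriptors: each atom gets a fixed-size, rotation-invariant descriptor that feeds a shared network, and the total energy is the sum of atomic contributions.

```python
import numpy as np

def radial_descriptor(pos, centers=np.linspace(0.5, 4.0, 8), width=0.5):
    # pos: (n_atoms, 3) -> (n_atoms, n_centers) rotation-invariant features
    dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)                  # exclude self-pairs
    return np.exp(-((dist[..., None] - centers) ** 2) / width**2).sum(axis=1)

def atomic_mlp(desc, W1, b1, w2):
    # Shared network maps each atom's descriptor to an atomic energy.
    h = np.tanh(desc @ W1 + b1)
    return h @ w2

rng = np.random.default_rng(0)
W1, b1, w2 = rng.normal(size=(8, 16)), np.zeros(16), rng.normal(size=16)
pos = rng.normal(scale=1.5, size=(5, 3))
total_energy = atomic_mlp(radial_descriptor(pos), W1, b1, w2).sum()
print(total_energy)   # sum of per-atom contributions
```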
arXiv Detail & Related papers (2021-09-15T16:46:46Z) - Does the brain function as a quantum phase computer using phase ternary computation? [0.0]
We provide evidence that the fundamental basis of nervous communication is derived from a pressure pulse/soliton capable of computation.
We demonstrate that the contemporary theory of nerve conduction, based on cable theory, is inadequate to account for the short computational time necessary.
Deconstruction of the brain's neural network suggests that it is a member of a group of quantum phase computers, of which the Turing machine is the simplest.
arXiv Detail & Related papers (2020-12-04T08:00:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences arising from its use.