Learning the exchange-correlation functional from nature with fully
differentiable density functional theory
- URL: http://arxiv.org/abs/2102.04229v1
- Date: Mon, 8 Feb 2021 14:25:10 GMT
- Title: Learning the exchange-correlation functional from nature with fully
differentiable density functional theory
- Authors: Muhammad F. Kasim, Sam M. Vinko
- Abstract summary: We train a neural network to replace the exchange-correlation functional within a fully-differentiable three-dimensional Kohn-Sham density functional theory framework.
Our trained exchange-correlation network provided improved prediction of atomization and ionization energies across a collection of 110 molecules.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Improving the predictive capability of molecular properties in ab
initio simulations is essential for advanced material discovery. Despite
recent progress making use of machine learning, utilizing deep neural networks
to improve quantum chemistry modelling remains severely limited by the scarcity
and heterogeneity of appropriate experimental data. Here we show how training a
neural network to replace the exchange-correlation functional within a
fully-differentiable three-dimensional Kohn-Sham density functional theory
(DFT) framework can greatly improve simulation accuracy. Using only eight
experimental data points on diatomic molecules, our trained
exchange-correlation network provided improved prediction of atomization and
ionization energies across a collection of 110 molecules when compared with
both commonly used DFT functionals and more expensive coupled cluster
simulations.
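The paper's central construction, a trainable network standing in for the XC functional inside a differentiable KS-DFT calculation, can be sketched in miniature. The snippet below is an illustrative toy, not the authors' implementation: a hypothetical one-hidden-layer network multiplies the LDA exchange energy density by a learned enhancement factor, and the XC energy is accumulated over a real-space density grid.

```python
import math
import random

# LDA exchange energy density per electron: eps_x(rho) = -(3/4) * (3*rho/pi)^(1/3)
def lda_exchange(rho):
    return -0.75 * (3.0 * rho / math.pi) ** (1.0 / 3.0)

class NeuralXC:
    """Toy neural enhancement factor F(rho) on top of LDA exchange.

    Hypothetical single-hidden-layer architecture; the network in the
    paper takes richer density features and is trained end to end
    through the SCF loop.
    """
    def __init__(self, hidden=4, seed=0):
        rng = random.Random(seed)
        self.w1 = [rng.uniform(-0.1, 0.1) for _ in range(hidden)]
        self.b1 = [0.0] * hidden
        self.w2 = [rng.uniform(-0.1, 0.1) for _ in range(hidden)]
        self.b2 = 0.0

    def enhancement(self, rho):
        # F(rho) = 1 + small learned correction, so the untrained
        # network starts close to plain LDA.
        h = [math.tanh(w * rho + b) for w, b in zip(self.w1, self.b1)]
        return 1.0 + self.b2 + sum(w * hj for w, hj in zip(self.w2, h))

    def energy_density(self, rho):
        return self.enhancement(rho) * lda_exchange(rho)

def xc_energy(model, rho_grid, dV):
    # E_xc = sum_i rho_i * eps_xc(rho_i) * dV over a real-space grid.
    return sum(r * model.energy_density(r) * dV for r in rho_grid)
```

With an untrained network the enhancement factor stays near one, so the energy reduces approximately to the LDA value; training would adjust the weights against the experimental targets by backpropagating through the differentiable SCF cycle.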
Related papers
- Multi-task learning for molecular electronic structure approaching coupled-cluster accuracy [9.81014501502049]
We develop a unified machine learning method for electronic structures of organic molecules using the gold-standard CCSD(T) calculations as training data.
Tested on hydrocarbon molecules, our model outperforms DFT with the widely-used hybrid and double hybrid functionals in computational costs and prediction accuracy of various quantum chemical properties.
arXiv Detail & Related papers (2024-05-09T19:51:27Z) - Quantum-Enhanced Neural Exchange-Correlation Functionals [0.193482901474023]
Kohn-Sham Density Functional Theory (KS-DFT) provides the exact ground state energy and electron density of a molecule, contingent on the as-yet unknown universal exchange-correlation (XC) functional.
Recent research has demonstrated that neural networks can efficiently learn to represent approximations to that functional, offering accurate generalizations to molecules not present during the training process.
With the latest advancements in quantum-enhanced machine learning (ML), evidence is growing that Quantum Neural Network (QNN) models may offer advantages in ML applications.
arXiv Detail & Related papers (2024-04-22T15:07:57Z) - A Multi-Grained Symmetric Differential Equation Model for Learning
Protein-Ligand Binding Dynamics [74.93549765488103]
In drug discovery, molecular dynamics simulation provides a powerful tool for predicting binding affinities, estimating transport properties, and exploring pocket sites.
We propose NeuralMD, the first machine learning surrogate that can facilitate numerical MD and provide accurate simulations in protein-ligand binding.
We show the efficiency and effectiveness of NeuralMD, with a 2000× speedup over standard numerical MD simulation and outperforming all other ML approaches by up to 80% under the stability metric.
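The "standard numerical MD" that surrogates like NeuralMD aim to accelerate typically advances positions with a symplectic integrator such as velocity Verlet. A minimal sketch for a 1D harmonic oscillator (the real setting is a many-atom protein-ligand system with learned forces):

```python
# Minimal velocity-Verlet integration for a 1D harmonic oscillator.
# Illustrative stand-in for the full MD loop, not NeuralMD itself.

def force(x, k=1.0):
    return -k * x  # harmonic restoring force

def velocity_verlet(x, v, dt, n_steps, m=1.0):
    a = force(x) / m
    for _ in range(n_steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / m              # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity update (averaged accel.)
        a = a_new
    return x, v
```

Because the scheme is symplectic, the total energy stays bounded over long trajectories, which is the stability property the benchmark above measures.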
arXiv Detail & Related papers (2024-01-26T09:35:17Z) - Machine learning enabled experimental design and parameter estimation
for ultrafast spin dynamics [54.172707311728885]
We introduce a methodology that combines machine learning with Bayesian optimal experimental design (BOED)
Our method employs a neural network model for large-scale spin dynamics simulations for precise distribution and utility calculations in BOED.
Our numerical benchmarks demonstrate the superior performance of our method in guiding XPFS experiments, predicting model parameters, and yielding more informative measurements within limited experimental time.
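The BOED criterion behind this recipe is the expected information gain (EIG) of a candidate design. A hedged toy sketch with an invented one-parameter Gaussian model (in the paper, a neural-network surrogate of the spin dynamics plays the role of the forward model):

```python
import math
import random

# Toy Bayesian optimal experimental design: score a design d by the
# expected information gain about an unknown parameter theta.
# Model, prior, and noise level are invented for illustration.

THETAS = [-1.0, 1.0]   # discrete uniform prior over theta
SIGMA = 0.5            # measurement noise

def likelihood(y, theta, d):
    mu = theta * d     # hypothetical forward model: y ~ N(theta * d, SIGMA)
    z = (y - mu) / SIGMA
    return math.exp(-0.5 * z * z) / (SIGMA * math.sqrt(2.0 * math.pi))

def expected_information_gain(d, n_samples=4000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        theta = rng.choice(THETAS)
        y = theta * d + rng.gauss(0.0, SIGMA)
        p_cond = likelihood(y, theta, d)
        p_marg = sum(likelihood(y, t, d) for t in THETAS) / len(THETAS)
        total += math.log(p_cond / p_marg)  # log p(y|theta) - log p(y)
    return total / n_samples
```

A design that separates the candidate parameters' predictions is more informative: here `expected_information_gain(1.0)` approaches log 2 (one bit), while `expected_information_gain(0.1)` stays near zero.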
arXiv Detail & Related papers (2023-06-03T06:19:20Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
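The recover-parameters-by-differentiation recipe reduces to gradient descent through a differentiable forward model. A toy sketch with an invented one-parameter model and a hand-derived gradient (the framework above uses automatic differentiation through a neural surrogate instead):

```python
# Recover an unknown parameter from data by gradient descent through a
# differentiable forward model. The model y = a * x^2 is a hypothetical
# stand-in; names and values are illustrative only.

def forward(a, xs):
    return [a * x * x for x in xs]

def loss_and_grad(a, xs, ys):
    pred = forward(a, xs)
    residuals = [p - y for p, y in zip(pred, ys)]
    loss = sum(r * r for r in residuals) / len(xs)
    # Analytic gradient: dL/da = (2/N) * sum_i r_i * x_i^2
    grad = 2.0 * sum(r * x * x for r, x in zip(residuals, xs)) / len(xs)
    return loss, grad

def fit(xs, ys, a0=0.0, lr=0.05, steps=200):
    a = a0
    for _ in range(steps):
        _, g = loss_and_grad(a, xs, ys)
        a -= lr * g
    return a
```

Because the model is built and differentiated once, the same fitting loop can then be rerun in real time as new measurement data arrives.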
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Predictive Scale-Bridging Simulations through Active Learning [43.48102250786867]
We use an active learning approach to optimize the use of local fine-scale simulations for informing coarse-scale hydrodynamics.
Our approach addresses three challenges: forecasting continuum coarse-scale trajectory, dynamically updating coarse-scale from fine-scale calculations, and quantifying uncertainty in neural network models.
arXiv Detail & Related papers (2022-09-20T15:58:50Z) - Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z) - A deep learning driven pseudospectral PCE based FFT homogenization
algorithm for complex microstructures [68.8204255655161]
It is shown that the proposed method is able to predict the central moments of interest while being orders of magnitude faster to evaluate than traditional approaches.
arXiv Detail & Related papers (2021-10-26T07:02:14Z) - SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate
Interatomic Potentials [0.17590081165362778]
NequIP is an SE(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations.
The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency.
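The symmetry requirement on such potentials is that the predicted energy be unchanged under rotations and translations of the atoms. The toy below shows only this weaker invariant end result, using a fixed pair potential over distances; NequIP goes further by carrying equivariant (tensor-valued) internal features, which this sketch does not attempt:

```python
import math

# An energy built only from interatomic distances is automatically
# invariant under rotations and translations. The Lennard-Jones pair
# term here is a stand-in for a learned radial function.

def pair_energy(r, eps=1.0, sigma=1.0):
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

def total_energy(coords):
    e = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(coords[i], coords[j])  # pairwise distance
            e += pair_energy(r)
    return e

def rotate_z(coords, theta):
    # Rigid rotation about the z axis, for checking invariance.
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in coords]
```

Rotating a configuration leaves `total_energy` unchanged to machine precision, which is the invariance an equivariant architecture guarantees by construction rather than by feature choice.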
arXiv Detail & Related papers (2021-01-08T18:49:10Z) - Multi-task learning for electronic structure to predict and explore
molecular potential energy surfaces [39.228041052681526]
We refine the OrbNet model to accurately predict energy, forces, and other response properties for molecules.
The model is end-to-end differentiable due to the derivation of analytic gradients for all electronic structure terms.
It is shown to be transferable across chemical space due to the use of domain-specific features.
arXiv Detail & Related papers (2020-11-05T06:48:46Z) - Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
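The contrast the abstract draws can be sketched concretely. The first function below is the MP neuron as described; the second is a loosely FT-flavoured variant that also carries a memory state between evaluations. The memory update rule is an illustrative simplification, not the exact FT formulation:

```python
import math

# The McCulloch-Pitts (MP) neuron: an activation applied to the
# real-valued weighted sum of inputs.
def mp_neuron(xs, ws, b):
    return math.tanh(sum(w * x for w, x in zip(ws, xs)) + b)

# A hypothetical FT-like neuron for contrast: the weighted input is
# combined with a recurrent memory term, and the neuron emits both a
# signal and an updated memory. The 0.5 averaging rule is invented
# for illustration.
def ft_like_neuron(xs, ws, v, b, m_prev):
    z = sum(w * x for w, x in zip(ws, xs)) + v * m_prev + b
    s = math.tanh(z)        # transmitted signal
    m = 0.5 * (m_prev + s)  # updated memory state (hypothetical rule)
    return s, m
```

The extra state is what gives the FT model its flexible synaptic plasticity: the same inputs can produce different outputs depending on the neuron's history.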
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.