A Transferable Recommender Approach for Selecting the Best Density
Functional Approximations in Chemical Discovery
- URL: http://arxiv.org/abs/2207.10747v1
- Date: Thu, 21 Jul 2022 20:45:57 GMT
- Title: A Transferable Recommender Approach for Selecting the Best Density
Functional Approximations in Chemical Discovery
- Authors: Chenru Duan, Aditya Nandy, Ralf Meyer, Naveen Arunachalam, and Heather
J. Kulik
- Abstract summary: No single density functional approximation with universal accuracy has been identified, leading to uncertainty in the quality of data generated from DFT.
We build a DFA recommender that selects the DFA with the lowest expected error with respect to gold standard but cost-prohibitive coupled cluster theory.
Our recommender predicts top-performing DFAs and yields excellent accuracy (ca. 2 kcal/mol) for chemical discovery, outperforming both individual transfer learning models and the single best functional in a set of 48 DFAs.
- Score: 0.4063872661554894
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Approximate density functional theory (DFT) has become indispensable owing to
its cost-accuracy trade-off in comparison to more computationally demanding but
accurate correlated wavefunction theory. To date, however, no single density
functional approximation (DFA) with universal accuracy has been identified,
leading to uncertainty in the quality of data generated from DFT. With electron
density fitting and transfer learning, we build a DFA recommender that selects
the DFA with the lowest expected error with respect to gold standard but
cost-prohibitive coupled cluster theory in a system-specific manner. We
demonstrate this recommender approach on vertical spin-splitting energy
evaluation for challenging transition metal complexes. Our recommender predicts
top-performing DFAs and yields excellent accuracy (ca. 2 kcal/mol) for chemical
discovery, outperforming both individual transfer learning models and the
single best functional in a set of 48 DFAs. We demonstrate the transferability
of the DFA recommender to experimentally synthesized compounds with distinct
chemistry.
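The selection step behind such a recommender can be pictured with a short sketch. The following is a minimal, hypothetical illustration only (the `recommend_dfa` function, the `error_models` dictionary, and the feature vector are assumptions, not the authors' released code): one transfer-learning regressor per DFA predicts that DFA's error in the vertical spin-splitting energy relative to the coupled cluster reference, and the recommender simply returns the functional with the smallest predicted absolute error.

```python
import numpy as np

def recommend_dfa(density_features, error_models, dfa_names):
    """Return the DFA with the lowest predicted |error| vs. coupled cluster.

    density_features : 1D feature vector derived from a fitted electron density
    error_models     : dict mapping DFA name -> fitted regressor with .predict()
    dfa_names        : candidate DFA names (e.g. the 48 functionals considered)
    """
    x = np.asarray(density_features, dtype=float).reshape(1, -1)  # single sample
    predicted_abs_error = {
        dfa: abs(float(error_models[dfa].predict(x)[0])) for dfa in dfa_names
    }
    # Recommend the functional expected to deviate least from the CC reference.
    best = min(predicted_abs_error, key=predicted_abs_error.get)
    return best, predicted_abs_error[best]
```

Under this picture, the heavy lifting lies in the per-DFA error models trained on electron-density-derived features; the recommendation itself reduces to an argmin over their predictions.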
Related papers
- Machine learning Hubbard parameters with equivariant neural networks [0.0]
We present a machine learning model based on equivariant neural networks.
We target here the prediction of Hubbard parameters computed self-consistently with iterative linear-response calculations.
Our model achieves mean absolute relative errors of 3% and 5% for Hubbard $U$ and $V$ parameters, respectively.
arXiv Detail & Related papers (2024-06-04T16:21:24Z)
- Flow matching achieves almost minimax optimal convergence [50.38891696297888]
Flow matching (FM) has gained significant attention as a simulation-free generative model.
This paper discusses the convergence properties of FM for large sample size under the $p$-Wasserstein distance.
We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models.
arXiv Detail & Related papers (2024-05-31T14:54:51Z)
- Grad DFT: a software library for machine learning enhanced density functional theory [0.0]
Density functional theory (DFT) stands as a cornerstone in computational quantum chemistry and materials science.
Recent work has begun to explore how machine learning can expand the capabilities of DFT.
We present Grad DFT: a fully differentiable JAX-based DFT library, enabling quick prototyping and experimentation with machine learning-enhanced exchange-correlation energy functionals.
arXiv Detail & Related papers (2023-09-23T00:25:06Z)
- On the Theories Behind Hard Negative Sampling for Recommendation [51.64626293229085]
We offer two insightful guidelines for effective usage of Hard Negative Sampling (HNS).
We prove that employing HNS on the Bayesian Personalized Ranking (BPR) learner is equivalent to optimizing One-way Partial AUC (OPAUC).
These analyses establish the theoretical foundation of HNS in optimizing Top-K recommendation performance for the first time.
arXiv Detail & Related papers (2023-02-07T13:57:03Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
- Flexible Amortized Variational Inference in qBOLD MRI [56.4324135502282]
Oxygen extraction fraction (OEF) and deoxygenated blood volume (DBV) are more ambiguously determined from the data than other qBOLD parameters.
Existing inference methods tend to yield very noisy and underestimated OEF maps, while overestimating DBV.
This work describes a novel probabilistic machine learning approach that can infer plausible distributions of OEF and DBV.
arXiv Detail & Related papers (2022-03-11T10:47:16Z)
- Pseudo-Spherical Contrastive Divergence [119.28384561517292]
We propose pseudo-spherical contrastive divergence (PS-CD) to generalize maximum likelihood learning of energy-based models.
PS-CD avoids the intractable partition function and provides a generalized family of learning objectives.
arXiv Detail & Related papers (2021-11-01T09:17:15Z)
- Comparison of Optical Response from DFT Random Phase Approximation and Low-Energy Effective Model: Strained Phosphorene [0.0]
We compare and contrast the dispersive permittivity tensor, using both a low-energy effective model and density functional theory (DFT).
Our results suggest that the random-phase approximation employed in widely used DFT packages should be revisited and improved to be able to predict these fundamental electronic characteristics of a given material with confidence.
arXiv Detail & Related papers (2021-09-01T18:00:06Z)
- Machine learning to tame divergent density functional approximations: a new path to consensus materials design principles [4.700621178941319]
We introduce an approach to rapidly obtain property predictions from 23 representative DFAs spanning multiple families and "rungs".
We train independent ML models for each DFA and observe convergent trends in feature importance.
By requiring consensus of the ANN-predicted DFA properties, we improve correspondence of these computational lead compounds with literature-mined, experimental compounds (see the consensus-filter sketch after this list).
arXiv Detail & Related papers (2021-06-24T15:43:57Z)
- Learning the exchange-correlation functional from nature with fully differentiable density functional theory [0.0]
We train a neural network to replace the exchange-correlation functional within a fully-differentiable three-dimensional Kohn-Sham density functional theory framework.
Our trained exchange-correlation network provided improved prediction of atomization and ionization energies across a collection of 110 molecules.
arXiv Detail & Related papers (2021-02-08T14:25:10Z)
- Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees [49.91477656517431]
Quantization-based solvers have been widely adopted in Federated Learning (FL).
However, no existing quantization-based method enjoys all of the desired properties simultaneously.
We propose an intuitively-simple yet theoretically-sound method based on SIGNSGD to bridge the gap.
arXiv Detail & Related papers (2020-02-25T15:12:15Z)
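For the consensus-based workflow summarized in the "Machine learning to tame divergent density functional approximations" entry above, the filtering idea can be sketched as follows. This is a minimal, hypothetical illustration (the `consensus_filter` function, the tolerance value, and the data layout are assumptions, not the paper's code): independent ANN models, one per DFA, predict a property for every candidate compound, and only compounds on which the per-DFA predictions agree within a tolerance are kept as leads.

```python
import numpy as np

def consensus_filter(predictions_by_dfa, tolerance=5.0):
    """Keep compounds whose per-DFA ANN predictions agree within `tolerance`.

    predictions_by_dfa : dict mapping DFA name -> array of predicted property
                         values (one value per candidate compound)
    tolerance          : maximum allowed DFA-to-DFA spread (assumed kcal/mol)
    """
    stacked = np.stack([np.asarray(p, dtype=float) for p in predictions_by_dfa.values()])
    spread = stacked.max(axis=0) - stacked.min(axis=0)  # per-compound disagreement
    return np.flatnonzero(spread <= tolerance)  # indices of consensus leads
```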
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.