Generalizability of density functionals learned from differentiable
programming on weakly correlated spin-polarized systems
- URL: http://arxiv.org/abs/2110.14846v1
- Date: Thu, 28 Oct 2021 02:03:04 GMT
- Title: Generalizability of density functionals learned from differentiable
programming on weakly correlated spin-polarized systems
- Authors: Bhupalee Kalita, Ryan Pederson, Li Li, Kieron Burke
- Abstract summary: Kohn-Sham regularizer (KSR) is a machine learning approach that optimizes a physics-informed exchange-correlation functional.
We evaluate the generalizability of KSR by training on atomic systems and testing on molecules at equilibrium.
Our nonlocal functional outperforms any existing machine learning functional by predicting the ground-state energies of the test systems with a mean absolute error of 2.7 milli-Hartrees.
- Score: 2.896251429985507
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kohn-Sham regularizer (KSR) is a machine learning approach that optimizes a
physics-informed exchange-correlation functional within a differentiable
Kohn-Sham density functional theory framework. We evaluate the generalizability
of KSR by training on atomic systems and testing on molecules at equilibrium.
We propose a spin-polarized version of KSR with local, semilocal, and nonlocal
approximations for the exchange-correlation functional. The generalization
error from our semilocal approximation is comparable to that of other
differentiable approaches. Our nonlocal functional outperforms any existing
machine learning functional by predicting the ground-state energies of the
test systems with a mean absolute error of 2.7 milli-Hartrees.
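To make the differentiable-programming idea concrete, here is a minimal JAX sketch of the pattern KSR relies on: a small neural network supplies the exchange-correlation energy inside a differentiable total-energy evaluation, so gradients of a training loss flow back to the network parameters. This is an illustrative toy under our own assumptions, not the authors' implementation; the grid, the toy exponential-interaction Hartree term, and all function names are hypothetical, and a real KSR calculation also differentiates through the self-consistent Kohn-Sham iterations.

```python
# Toy differentiable "DFT-like" energy with a neural XC term (illustrative
# sketch only, NOT the KSR code). A real calculation would also solve the
# Kohn-Sham equations self-consistently and differentiate through them.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(1, 16, 16, 1)):
    """Random parameters for a small MLP mapping density -> XC energy density."""
    params = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (n_in, n_out)) / jnp.sqrt(n_in),
                       jnp.zeros(n_out)))
    return params

def xc_energy_density(params, n):
    """Local neural XC approximation: acts pointwise on the density n(x)."""
    h = n[:, None]
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[:, 0]

def total_energy(params, density, grids, dx):
    """Hartree term (toy 1D exponential interaction) plus the neural XC term."""
    e_xc = jnp.sum(xc_energy_density(params, density) * density) * dx
    kernel = jnp.exp(-jnp.abs(grids[:, None] - grids[None, :]))
    e_hartree = 0.5 * jnp.sum(density[:, None] * density[None, :] * kernel) * dx**2
    return e_hartree + e_xc

grids = jnp.linspace(-5.0, 5.0, 201)
dx = grids[1] - grids[0]
density = jnp.exp(-grids**2)          # placeholder training density
params = init_mlp(jax.random.PRNGKey(0))

def loss(params):
    # Squared error against a placeholder reference ground-state energy.
    return (total_energy(params, density, grids, dx) - (-1.0)) ** 2

grads = jax.grad(loss)(params)        # gradients flow through the energy
```

Because the whole energy evaluation is written in JAX, `jax.grad` returns exact gradients of the training loss with respect to the functional's parameters, which is the sense in which the functional is "learned from differentiable programming".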
Related papers
- Tight Stability, Convergence, and Robustness Bounds for Predictive Coding Networks [60.3634789164648]
Energy-based learning algorithms, such as predictive coding (PC), have garnered significant attention in the machine learning community.
We rigorously analyze the stability, robustness, and convergence of PC through the lens of dynamical systems theory.
arXiv Detail & Related papers (2024-10-07T02:57:26Z) - Variational principle to regularize machine-learned density functionals:
the non-interacting kinetic-energy functional [0.0]
We propose a new and efficient regularization method to train density functionals based on deep neural networks.
The method is tested on (effectively) one-dimensional systems, including the hydrogen chain, non-interacting electrons, and atoms of the first two periods.
For the atomic systems, the generalizability of the regularization method is demonstrated by also training an exchange-correlation functional.
arXiv Detail & Related papers (2023-06-30T12:07:26Z) - Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variant of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching, illustrated in the sketch below.
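The sketch runs a textbook SGLD update over shuffled, non-overlapping minibatches, so each example is visited exactly once per epoch (sampling without replacement). The update rule, function names, and toy loss are our own assumptions, not the paper's specific variant.

```python
# Textbook SGLD with without-replacement minibatching (illustration only;
# the paper's SGLD variant may differ in its update rule).
import jax
import jax.numpy as jnp

def sgld_epoch(key, params, data, grad_loss, step=1e-3, batch=32):
    """One pass over the data in shuffled, non-overlapping minibatches."""
    key, perm_key = jax.random.split(key)
    idx = jax.random.permutation(perm_key, data.shape[0])
    for start in range(0, data.shape[0] - batch + 1, batch):
        mb = data[idx[start:start + batch]]   # each point used once per epoch
        key, noise_key = jax.random.split(key)
        noise = jax.random.normal(noise_key, params.shape)
        # Langevin update: gradient step plus sqrt(2 * step) Gaussian noise.
        params = params - step * grad_loss(params, mb) + jnp.sqrt(2.0 * step) * noise
    return key, params

# Toy usage: sample around the minimizer of a quadratic loss.
grad_loss = jax.grad(lambda p, mb: jnp.mean((mb - p) ** 2))
key = jax.random.PRNGKey(0)
data = jax.random.normal(key, (256,)) + 3.0
key, params = sgld_epoch(key, jnp.zeros(()), data, grad_loss)
```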
arXiv Detail & Related papers (2023-06-06T09:12:49Z) - D4FT: A Deep Learning Approach to Kohn-Sham Density Functional Theory [79.50644650795012]
We propose a deep learning approach to solving Kohn-Sham Density Functional Theory (KS-DFT).
We prove that such an approach has the same expressivity as the SCF method, yet reduces the computational complexity.
In addition, we show that our approach enables us to explore more complex neural-based wave functions.
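The contrast with SCF can be sketched in a few lines: parameterize orthonormal orbitals (here through a QR decomposition) and minimize an energy objective directly by gradient descent. The toy single-particle energy below stands in for the full Kohn-Sham energy; it illustrates direct minimization in JAX and is not the D4FT implementation.

```python
# Direct energy minimization over orthonormal orbitals (toy illustration of
# the idea; the actual D4FT objective includes Hartree and XC terms).
import jax
import jax.numpy as jnp

def orbitals(w):
    """Map unconstrained weights to orthonormal orbitals via QR."""
    q, _ = jnp.linalg.qr(w)               # columns of q are orthonormal
    return q

def energy(w, h_core):
    """Toy energy: trace of a fixed Hamiltonian in the occupied subspace."""
    c = orbitals(w)
    return jnp.trace(c.T @ h_core @ c)

n_basis, n_occ = 8, 2
h = jax.random.normal(jax.random.PRNGKey(0), (n_basis, n_basis))
h_core = 0.5 * (h + h.T)                  # symmetric "Hamiltonian"
w = jax.random.normal(jax.random.PRNGKey(1), (n_basis, n_occ))

grad_e = jax.grad(energy)
for _ in range(200):                      # gradient descent replaces SCF
    w = w - 0.1 * grad_e(w, h_core)
# The minimized energy approaches the sum of the two lowest eigenvalues.
```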
arXiv Detail & Related papers (2023-03-01T10:38:10Z) - Kernel-based off-policy estimation without overlap: Instance optimality
beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Electron-Affinity Time-Dependent Density Functional Theory: Formalism
and Applications to Core-Excited States [0.0]
The particle-hole interaction problem is longstanding within time-dependent density functional theory.
We derive a linear-response formalism that uses optimized orbitals of the $(n-1)$-electron system as the reference.
Our approach is an exact generalization of the static-exchange approximation and reduces errors in TDDFT XAS by orders of magnitude.
arXiv Detail & Related papers (2022-05-17T23:05:15Z) - A Simple and General Debiased Machine Learning Theorem with Finite
Sample Guarantees [4.55274575362193]
We provide a nonasymptotic debiased machine learning theorem that encompasses any global or local functional of any machine learning algorithm.
Our results culminate in a simple set of conditions that an analyst can use to translate modern learning theory rates into traditional statistical inference.
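The theorem covers general functionals, but a canonical concrete instance is the doubly robust (AIPW) estimator of an average treatment effect with sample splitting. The sketch below uses constant nuisance fits as stand-ins for arbitrary machine learning models; all names are our own and nothing here is specific to the paper.

```python
# Debiased / doubly robust ATE estimation with cross-fitting (a standard
# concrete instance of debiased machine learning; illustration only).
import jax
import jax.numpy as jnp

def aipw_scores(y, t, mu0, mu1, e):
    """Influence-function scores; their average corrects plug-in bias."""
    return (mu1 - mu0
            + t * (y - mu1) / e
            - (1 - t) * (y - mu0) / (1 - e))

def cross_fit_ate(y, t, x, fit_nuisances, n_folds=2):
    folds = jnp.array_split(jnp.arange(y.shape[0]), n_folds)
    scores = []
    for k in range(n_folds):
        test = folds[k]
        train = jnp.concatenate([folds[j] for j in range(n_folds) if j != k])
        mu0, mu1, e = fit_nuisances(y[train], t[train], x[train], x[test])
        scores.append(aipw_scores(y[test], t[test], mu0, mu1, e))
    return jnp.mean(jnp.concatenate(scores))

def fit_nuisances(y_tr, t_tr, x_tr, x_te):
    """Constant fits as placeholders for real outcome/propensity models."""
    m = x_te.shape[0]
    return (jnp.full(m, y_tr[t_tr == 0].mean()),   # E[Y | T=0, X] stand-in
            jnp.full(m, y_tr[t_tr == 1].mean()),   # E[Y | T=1, X] stand-in
            jnp.full(m, t_tr.mean()))              # propensity stand-in

key0, key1, key2 = jax.random.split(jax.random.PRNGKey(0), 3)
x = jax.random.normal(key0, (200, 3))
t = (jax.random.uniform(key1, (200,)) < 0.5).astype(jnp.float32)
y = t + jax.random.normal(key2, (200,))            # true effect = 1
print(cross_fit_ate(y, t, x, fit_nuisances))
```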
arXiv Detail & Related papers (2021-05-31T17:57:02Z) - Machine Learning Universal Bosonic Functionals [0.0]
A functional theory for bosonic ground states establishes the existence of a universal functional $\mathcal{F}[\gamma]$ that recovers quantum correlations exactly.
For the Bose-Hubbard model, we present a comparison between our approach and Quantum Monte Carlo.
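For orientation, universal functionals of this kind are usually defined through a Levy-style constrained search; in the notation we assume here (see the paper for its precise conventions):

```latex
% Constrained-search form of a universal functional of the one-particle
% reduced density matrix \gamma (notation assumed for illustration).
\mathcal{F}[\gamma] \;=\; \min_{\Gamma \,\mapsto\, \gamma}
  \operatorname{Tr}\!\bigl[\Gamma \hat{W}\bigr],
```

where the minimization runs over $N$-boson states $\Gamma$ whose one-particle reduced density matrix equals $\gamma$, and $\hat{W}$ is the interaction.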
arXiv Detail & Related papers (2021-04-07T15:53:10Z) - eQE 2.0: Subsystem DFT Beyond GGA Functionals [58.720142291102135]
Subsystem DFT (sDFT) can dramatically reduce the computational cost of large-scale electronic structure calculations.
The key ingredients of sDFT are the nonadditive kinetic energy and exchange-correlation functionals, which dominate its accuracy.
eQE 2.0 delivers excellent interaction energies compared to conventional Kohn-Sham DFT and CCSD(T).
arXiv Detail & Related papers (2021-03-12T22:26:36Z) - Reducing charge delocalization error of density functional theory [0.0]
The charge delocalization error, besides nondynamic correlation, has been a major challenge to density functional theory.
We extend a functional designed for nondynamic correlation to treat the charge delocalization error by modifying the nondynamic correlation for parallel spins.
Compared with contemporary functionals, our results are the closest to those of CCSD(T) over the whole dissociation range.
arXiv Detail & Related papers (2021-02-25T16:50:02Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success is still unclear.
We show that multiplicative noise, as it commonly arises due to variance in local rates of convergence, results in heavy-tailed stationary behavior in the parameters.
A detailed analysis describes how key factors, including step size, batch size, and data, all exhibit similar heavy-tailed behavior on state-of-the-art neural network models.
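The mechanism is easy to reproduce with a toy Kesten-type recurrence $x_{t+1} = a_t x_t + b_t$: even when the random factors $a_t$ and $b_t$ are light-tailed, the stationary law of $x_t$ is heavy-tailed whenever $E[\log a_t] < 0$ while $a_t > 1$ occurs with positive probability. The simulation below is our own illustration, not the paper's setup.

```python
# Heavy tails from multiplicative noise: simulate x_{t+1} = a_t * x_t + b_t
# with log-normal a_t satisfying E[log a_t] < 0 (illustration only).
import jax
import jax.numpy as jnp

def kesten_chains(key, steps=5_000, n_chains=1_000):
    ka, kb = jax.random.split(key)
    a = jnp.exp(0.3 * jax.random.normal(ka, (steps, n_chains)) - 0.1)
    b = jax.random.normal(kb, (steps, n_chains))
    def step(x, ab):
        a_t, b_t = ab
        return a_t * x + b_t, None        # multiplicative + additive noise
    x, _ = jax.lax.scan(step, jnp.zeros(n_chains), (a, b))
    return x

x = kesten_chains(jax.random.PRNGKey(0))
# Kurtosis far above the Gaussian value of 3 signals heavy tails.
kurtosis = jnp.mean((x - x.mean()) ** 4) / jnp.var(x) ** 2
print(kurtosis)
```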
arXiv Detail & Related papers (2020-06-11T09:58:01Z)