Deep-Learning Density Functional Theory Hamiltonian for Efficient ab
initio Electronic-Structure Calculation
- URL: http://arxiv.org/abs/2104.03786v2
- Date: Thu, 19 May 2022 03:21:09 GMT
- Title: Deep-Learning Density Functional Theory Hamiltonian for Efficient ab
initio Electronic-Structure Calculation
- Authors: He Li, Zun Wang, Nianlong Zou, Meng Ye, Runzhang Xu, Xiaoxun Gong,
Wenhui Duan, Yong Xu
- Abstract summary: We develop a deep neural network approach to represent DFT Hamiltonian (DeepH) of crystalline materials.
The method provides a solution to the accuracy-efficiency dilemma of DFT and opens opportunities to explore large-scale material systems.
- Score: 13.271547916205675
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The marriage of density functional theory (DFT) and deep learning methods has
the potential to revolutionize modern computational materials science. Here we
develop a deep neural network approach to represent DFT Hamiltonian (DeepH) of
crystalline materials, aiming to bypass the computationally demanding
self-consistent field iterations of DFT and substantially improve the
efficiency of ab initio electronic-structure calculations. A general framework
is proposed to deal with the large dimensionality and gauge (or rotation)
covariance of DFT Hamiltonian matrix by virtue of locality and is realized by
the message passing neural network for deep learning. High accuracy, high
efficiency and good transferability of the DeepH method are generally
demonstrated for various kinds of material systems and physical properties. The
method provides a solution to the accuracy-efficiency dilemma of DFT and opens
opportunities to explore large-scale material systems, as evidenced by a
promising application to study twisted van der Waals materials.
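The rotation (gauge) covariance the abstract refers to can be made concrete with a small sketch. Below is a minimal numpy illustration, not the authors' code: for p-orbital blocks the Wigner-D matrix equals the rotation matrix itself, so a Hamiltonian block between two atoms must satisfy H'_ij = R H_ij R^T when the structure is rotated by R. The `predict_block` function and its distance feature are hypothetical stand-ins for the learned message-passing model; the point is that predicting from invariant features in a bond-aligned local frame and rotating back yields covariance by construction.

```python
import numpy as np

def local_frame(r_ij):
    """Orthonormal frame whose first axis points along the bond r_ij."""
    e1 = r_ij / np.linalg.norm(r_ij)
    # pick any vector not parallel to e1 to complete the frame
    a = np.array([0.0, 0.0, 1.0]) if abs(e1[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    e2 = np.cross(e1, a); e2 /= np.linalg.norm(e2)
    e3 = np.cross(e1, e2)
    return np.stack([e1, e2, e3])            # rows are the frame axes

def predict_block(r_ij, weight):
    """Hypothetical 'network': invariant features (here just the bond
    length) -> p-orbital block in the local frame, rotated back to the
    global frame so the prediction is covariant by construction."""
    d = np.linalg.norm(r_ij)
    F = local_frame(r_ij)
    H_local = weight * np.exp(-d) * np.eye(3)  # isotropic channel
    H_local[0, 0] += 0.5 * np.exp(-d)          # sigma-like channel along bond
    return F.T @ H_local @ F

rng = np.random.default_rng(0)
r = np.array([1.0, 0.4, -0.2])
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1                              # make it a proper rotation

H = predict_block(r, weight=0.7)
H_rot = predict_block(Q @ r, weight=0.7)
print(np.allclose(H_rot, Q @ H @ Q.T))         # True: covariance holds
```

Rotating the input bond vector rotates the predicted block exactly as the transformation law demands, which is the property the DeepH framework enforces for full Hamiltonian matrices via locality.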
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- NeuralSCF: Neural network self-consistent fields for density functional theory [1.7667864049272723]
Kohn-Sham density functional theory (KS-DFT) has found widespread application in accurate electronic structure calculations.
We propose a neural network self-consistent fields (NeuralSCF) framework that establishes the Kohn-Sham density map as a deep learning objective.
arXiv Detail & Related papers (2024-06-22T15:24:08Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Grad DFT: a software library for machine learning enhanced density functional theory [0.0]
Density functional theory (DFT) stands as a cornerstone in computational quantum chemistry and materials science.
Recent work has begun to explore how machine learning can expand the capabilities of DFT.
We present Grad DFT: a fully differentiable JAX-based DFT library, enabling quick prototyping and experimentation with machine learning-enhanced exchange-correlation energy functionals.
arXiv Detail & Related papers (2023-09-23T00:25:06Z)
- D4FT: A Deep Learning Approach to Kohn-Sham Density Functional Theory [79.50644650795012]
We propose a deep learning approach to solve Kohn-Sham Density Functional Theory (KS-DFT)
We prove that such an approach has the same expressivity as the SCF method, yet reduces the computational complexity.
In addition, we show that our approach enables us to explore more complex neural-based wave functions.
arXiv Detail & Related papers (2023-03-01T10:38:10Z)
- A Deep Learning Approach for the solution of Probability Density Evolution of Stochastic Systems [0.0]
DeepPDEM utilizes the concept of physics-informed networks to solve the evolution of the probability density.
DeepPDEM learns the General Density Evolution Equation (GDEE) of structures.
It can also serve as an efficient surrogate for the solution at any point within optimization schemes or real-time applications.
arXiv Detail & Related papers (2022-07-05T09:37:48Z)
- Non-equilibrium molecular geometries in graph neural networks [2.6040244706888998]
Graph neural networks have become a powerful framework for learning complex structure-property relationships.
Recently proposed methods have demonstrated that using 3D geometry information of the molecule along with the bonding structure can lead to more accurate prediction on a wide range of properties.
arXiv Detail & Related papers (2022-03-07T20:20:52Z)
- Credit Assignment in Neural Networks through Deep Feedback Control [59.14935871979047]
Deep Feedback Control (DFC) is a new learning method that uses a feedback controller to drive a deep neural network to match a desired output target; the control signal can then be used for credit assignment.
The resulting learning rule is fully local in space and time and approximates Gauss-Newton optimization for a wide range of connectivity patterns.
To further underline its biological plausibility, we relate DFC to a multi-compartment model of cortical pyramidal neurons with a local voltage-dependent synaptic plasticity rule, consistent with recent theories of dendritic processing.
arXiv Detail & Related papers (2021-06-15T05:30:17Z)
- BIGDML: Towards Exact Machine Learning Force Fields for Materials [55.944221055171276]
Machine-learning force fields (MLFF) should be accurate, computationally and data efficient, and applicable to molecules, materials, and interfaces thereof.
Here, we introduce the Bravais-Inspired Gradient-Domain Machine Learning approach and demonstrate its ability to construct reliable force fields using a training set with just 10-200 atoms.
arXiv Detail & Related papers (2021-06-08T10:14:57Z)
- Influence Estimation and Maximization via Neural Mean-Field Dynamics [60.91291234832546]
We propose a novel learning framework using neural mean-field (NMF) dynamics for inference and estimation problems.
Our framework can simultaneously learn the structure of the diffusion network and the evolution of node infection probabilities.
arXiv Detail & Related papers (2021-06-03T00:02:05Z)
- Accelerating Finite-temperature Kohn-Sham Density Functional Theory with Deep Neural Networks [2.7035666571881856]
We present a numerical modeling workflow based on machine learning (ML) which reproduces the total energies produced by Kohn-Sham density functional theory (DFT) at finite electronic temperature.
Based on deep neural networks, our workflow yields the local density of states (LDOS) for a given atomic configuration.
We demonstrate the efficacy of this approach for both solid and liquid metals and compare results between independent and unified machine-learning models for solid and liquid aluminum.
arXiv Detail & Related papers (2020-10-10T05:38:03Z)
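Once a network predicts the local density of states, finite-temperature quantities follow by standard post-processing. The sketch below is a generic illustration (assumed, not the paper's code): summing the LDOS over space gives the total density of states D(E), and the band energy follows from integrating E · f(E) · D(E) against the Fermi-Dirac occupation f(E). The flat toy DOS is invented purely for demonstration.

```python
import numpy as np

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def fermi(E, mu, T):
    """Fermi-Dirac occupation; argument clipped to avoid overflow."""
    x = np.clip((E - mu) / (K_B * T), -60.0, 60.0)
    return 1.0 / (1.0 + np.exp(x))

def band_energy(E_grid, dos, mu, T):
    """Approximate the integral of E * f(E) * D(E) on a uniform energy grid."""
    dE = E_grid[1] - E_grid[0]
    return float(np.sum(E_grid * fermi(E_grid, mu, T) * dos) * dE)

# toy DOS: a flat band between -5 eV and +5 eV, chemical potential at 0 eV
E = np.linspace(-10.0, 10.0, 2001)
dos = np.where(np.abs(E) <= 5.0, 1.0, 0.0)   # states per eV

e_cold = band_energy(E, dos, mu=0.0, T=100.0)
e_hot = band_energy(E, dos, mu=0.0, T=5000.0)
print(e_cold < e_hot)   # True: thermal smearing raises the band energy
```

At low temperature only states below the chemical potential are occupied, giving a band energy near the zero-temperature value; heating partially occupies states above it, which is exactly the finite-electronic-temperature dependence such workflows aim to capture.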
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences arising from its use.