Infusing Self-Consistency into Density Functional Theory Hamiltonian Prediction via Deep Equilibrium Models
- URL: http://arxiv.org/abs/2406.03794v1
- Date: Thu, 6 Jun 2024 07:05:58 GMT
- Title: Infusing Self-Consistency into Density Functional Theory Hamiltonian Prediction via Deep Equilibrium Models
- Authors: Zun Wang, Chang Liu, Nianlong Zou, He Zhang, Xinran Wei, Lin Huang, Lijun Wu, Bin Shao,
- Abstract summary: We introduce a unified neural network architecture, the Deep Equilibrium Density Functional Theory Hamiltonian (DEQH) model.
The DEQH model inherently captures the self-consistent nature of the Hamiltonian.
We propose a versatile framework that combines DEQ with off-the-shelf machine learning models for predicting Hamiltonians.
- Score: 30.746062388701187
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this study, we introduce a unified neural network architecture, the Deep Equilibrium Density Functional Theory Hamiltonian (DEQH) model, which incorporates Deep Equilibrium Models (DEQs) for predicting Density Functional Theory (DFT) Hamiltonians. The DEQH model inherently captures the self-consistent nature of the Hamiltonian, a critical aspect often overlooked by traditional machine learning approaches to Hamiltonian prediction. By employing DEQ within our model architecture, we circumvent the need for DFT calculations during the training phase to introduce the Hamiltonian's self-consistency, thus addressing computational bottlenecks associated with large or complex systems. We propose a versatile framework that combines DEQ with off-the-shelf machine learning models for predicting Hamiltonians. When benchmarked on the MD17 and QH9 datasets, DEQHNet, an instantiation of the DEQH framework, demonstrates a significant improvement in prediction accuracy. Beyond a predictor, the DEQH model is a Hamiltonian solver, in the sense that it uses the fixed-point solving capability of the deep equilibrium model to iteratively solve for the Hamiltonian. Ablation studies of DEQHNet further elucidate the network's effectiveness, offering insights into the potential of DEQ-integrated networks for Hamiltonian learning.
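The "Hamiltonian solver" view rests on the defining idea of a DEQ: the output is not produced by a fixed stack of layers but is the fixed point of a learned update map, found by iteration. The following is a minimal toy sketch of that fixed-point solve; the contraction map `f` here is a hypothetical stand-in for a trained update network, not the actual DEQHNet architecture.

```python
import numpy as np

def deq_fixed_point(f, h0, tol=1e-8, max_iter=500):
    """Solve h* = f(h*) by plain fixed-point iteration.

    A DEQ defines its output implicitly as the fixed point of a
    learned update map; `f` stands in for the network that refines
    a Hamiltonian guess, and iteration plays the role of the
    self-consistent-field loop.
    """
    h = h0
    for _ in range(max_iter):
        h_next = f(h)
        if np.linalg.norm(h_next - h) < tol:
            return h_next
        h = h_next
    return h

# Toy contraction standing in for a trained update network:
# f(H) = 0.5 * H + B, whose unique fixed point is H* = 2 * B.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
B = 0.5 * (B + B.T)  # keep it symmetric, like a real Hamiltonian

H_star = deq_fixed_point(lambda H: 0.5 * H + B, np.zeros_like(B))
```

In practice DEQs use accelerated root-finders (e.g. Anderson or Broyden iterations) rather than naive iteration, and backpropagate through the fixed point via implicit differentiation, but the structure of the solve is the same.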
Related papers
- CogDPM: Diffusion Probabilistic Models via Cognitive Predictive Coding [62.075029712357]
This work introduces the Cognitive Diffusion Probabilistic Models (CogDPM)
CogDPM features a precision estimation method based on the hierarchical sampling capabilities of diffusion models, and weights the guidance with precision weights estimated from the inherent properties of diffusion models.
We apply CogDPM to real-world prediction tasks using the United Kingdom precipitation and surface wind datasets.
arXiv Detail & Related papers (2024-05-03T15:54:50Z) - Self-Consistency Training for Density-Functional-Theory Hamiltonian Prediction [74.84850523400873]
We show that Hamiltonian prediction possesses a self-consistency principle, based on which we propose self-consistency training.
It enables the model to be trained on a large amount of unlabeled data, thus addressing the data-scarcity challenge.
It is more efficient than running DFT to generate labels for supervised training, since it amortizes DFT calculation over a set of queries.
arXiv Detail & Related papers (2024-03-14T16:52:57Z) - Hamiltonian truncation tensor networks for quantum field theories [42.2225785045544]
We introduce a tensor network method for the classical simulation of continuous quantum field theories.
The method is built on Hamiltonian truncation and tensor network techniques.
One of the key developments is the exact construction of matrix product state representations of global projectors.
arXiv Detail & Related papers (2023-12-19T19:00:02Z) - Separable Hamiltonian Neural Networks [1.8674308456443722]
Hamiltonian neural networks (HNNs) are state-of-the-art models that regress the vector field of a dynamical system under the learning bias of Hamilton's equations.
We propose separable HNNs that embed additive separability within HNNs using observational, learning, and inductive biases.
arXiv Detail & Related papers (2023-09-03T03:54:43Z) - QH9: A Quantum Hamiltonian Prediction Benchmark for QM9 Molecules [69.25826391912368]
We generate a new Quantum Hamiltonian dataset, named QH9, to provide precise Hamiltonian matrices for 999 or 2998 molecular dynamics trajectories.
We show that current machine learning models have the capacity to predict Hamiltonian matrices for arbitrary molecules.
arXiv Detail & Related papers (2023-06-15T23:39:07Z) - Physics-Informed Learning Using Hamiltonian Neural Networks with Output Error Noise Models [0.0]
Hamiltonian Neural Networks (HNNs) implement Hamiltonian theory in deep learning.
This paper introduces an Output Error Hamiltonian Neural Network (OE-HNN) modeling approach to address the modeling of physical systems.
arXiv Detail & Related papers (2023-05-02T11:34:53Z) - Real-time simulations of transmon systems with time-dependent Hamiltonian models [0.0]
We study aspects of Hamiltonian models which can affect the time evolution of transmon systems.
We denote the corresponding computer models as non-ideal gate-based quantum computer (NIGQC) models.
arXiv Detail & Related papers (2023-04-21T14:58:49Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z) - SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred from Vision [73.26414295633846]
A recently proposed class of models attempts to learn latent dynamics from high-dimensional observations.
Existing methods rely on image reconstruction quality, which does not always reflect the quality of the learnt latent dynamics.
We develop a set of new measures, including a binary indicator of whether the underlying Hamiltonian dynamics have been faithfully captured.
arXiv Detail & Related papers (2021-11-10T23:26:58Z) - Learning CHARME models with neural networks [1.5362025549031046]
We consider a model called CHARME (Conditional Heteroscedastic Autoregressive Mixture of Experts).
As an application, we develop a learning theory for the NN-based autoregressive functions of the model.
arXiv Detail & Related papers (2020-02-08T21:51:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.