Machine learning predictions for local electronic properties of
disordered correlated electron systems
- URL: http://arxiv.org/abs/2204.05967v1
- Date: Tue, 12 Apr 2022 17:28:51 GMT
- Title: Machine learning predictions for local electronic properties of
disordered correlated electron systems
- Authors: Yi-Hsuan Liu, Sheng Zhang, Puhan Zhang, Ting-Kuo Lee, Gia-Wei Chern
- Abstract summary: We present a scalable machine learning (ML) model to predict local electronic properties.
Our approach is based on the locality principle, or nearsightedness, of many-electron systems.
Our work underscores the promising potential of ML methods for multi-scale modeling of correlated electron systems.
- Score: 2.984639473379942
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a scalable machine learning (ML) model to predict local electronic
properties such as on-site electron number and double occupation for disordered
correlated electron systems. Our approach is based on the locality principle,
or nearsightedness, of many-electron systems, which means that local
electronic properties depend mainly on the immediate environment. An ML model is
developed to encode this complex dependence of local quantities on the
neighborhood. We demonstrate our approach using the square-lattice
Anderson-Hubbard model, which is a paradigmatic system for studying the
interplay between Mott transition and Anderson localization. We develop a
lattice descriptor based on a group-theoretical method to represent the on-site
random potentials within a finite region. The resultant feature variables are
used as input to a multi-layer fully connected neural network, which is trained
from datasets of variational Monte Carlo (VMC) simulations on small systems. We
show that the ML predictions agree reasonably well with the VMC data. Our work
underscores the promising potential of ML methods for multi-scale modeling of
correlated electron systems.
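The abstract outlines a concrete pipeline: encode the random on-site potentials within a finite neighborhood of each lattice site into symmetry-adapted feature variables, then feed those features to a fully connected network that outputs the local electron number and double occupation, trained against VMC results on small systems. The sketch below is a minimal illustration of that idea, not the authors' implementation: the choice of C4v-orbit power sums as invariants, the cutoff radius R_CUT, the layer widths, and the helper names c4v_orbits and descriptor are all assumptions, and the training targets are random placeholders standing in for VMC data.

```python
# Illustrative sketch only (assumptions noted above), not the paper's code.
import numpy as np
import torch
import torch.nn as nn

R_CUT = 3  # neighborhood radius (assumed for illustration)

def c4v_orbits(r_cut):
    """Group lattice vectors with |dx|, |dy| <= r_cut into orbits of the C4v point group."""
    ops = [(rot, refl) for rot in range(4) for refl in (False, True)]  # 8 group elements
    orbits = {}
    for dx in range(-r_cut, r_cut + 1):
        for dy in range(-r_cut, r_cut + 1):
            images = set()
            for rot, refl in ops:
                x, y = dx, dy
                for _ in range(rot):
                    x, y = -y, x          # 90-degree rotation
                if refl:
                    x, y = y, x           # mirror reflection
                images.add((x, y))
            key = tuple(sorted(images))   # canonical label of the orbit
            orbits.setdefault(key, sorted(images))
    return list(orbits.values())

ORBITS = c4v_orbits(R_CUT)

def descriptor(potential_patch):
    """Symmetry-invariant features of a (2R+1)x(2R+1) patch of on-site potentials:
    low-order power sums of the potentials on each C4v orbit."""
    feats = []
    for orbit in ORBITS:
        vals = np.array([potential_patch[dx + R_CUT, dy + R_CUT] for dx, dy in orbit])
        feats += [vals.sum(), (vals ** 2).sum()]
    return np.array(feats, dtype=np.float32)

# Fully connected network mapping the descriptor to (on-site number, double occupation).
model = nn.Sequential(
    nn.Linear(2 * len(ORBITS), 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),
)

# Dummy training step; in practice the targets would come from VMC data on small systems.
patches = [np.random.uniform(-1.0, 1.0, (2 * R_CUT + 1, 2 * R_CUT + 1)) for _ in range(32)]
x = torch.tensor(np.stack([descriptor(p) for p in patches]))
y = torch.rand(32, 2)                     # placeholder for VMC (n_i, d_i) targets
loss = nn.MSELoss()(model(x), y)
loss.backward()
```

Grouping neighborhood sites into point-group orbits and using symmetric functions of the potentials on each orbit is one simple way to obtain features that respect the square-lattice symmetry, in the spirit of the group-theoretical descriptor described in the abstract.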
Related papers
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion models (DPMs) have rapidly evolved to be one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z) - Machine learning force-field models for metallic spin glass [4.090038845129619]
We present a scalable machine learning framework for dynamical simulations of metallic spin glasses.
A Behler-Parrinello type neural-network model is developed to accurately and efficiently predict electron-induced local magnetic fields.
arXiv Detail & Related papers (2023-11-28T17:12:03Z) - Electronic Structure Prediction of Multi-million Atom Systems Through Uncertainty Quantification Enabled Transfer Learning [5.4875371069660925]
Ground state electron density -- obtainable using Kohn-Sham Density Functional Theory (KS-DFT) simulations -- contains a wealth of material information.
However, the computational expense of KS-DFT scales cubically with system size which tends to stymie training data generation.
Here, we address this fundamental challenge by employing transfer learning to leverage the multi-scale nature of the training data.
arXiv Detail & Related papers (2023-08-24T21:41:29Z) - Cheap and Deterministic Inference for Deep State-Space Models of
Interacting Dynamical Systems [38.23826389188657]
We present a deep state-space model which employs graph neural networks in order to model the underlying interacting dynamical system.
The predictive distribution is multimodal and has the form of a Gaussian mixture model, where the moments of the Gaussian components can be computed via deterministic moment matching rules.
Our moment matching scheme can be exploited for sample-free inference, leading to more efficient and stable training compared to Monte Carlo alternatives.
arXiv Detail & Related papers (2023-05-02T20:30:23Z) - Combining Machine Learning and Agent-Based Modeling to Study Biomedical
Systems [0.0]
Agent-based modeling (ABM) is a well-established paradigm for simulating complex systems via interactions between constituent entities.
Machine learning (ML) refers to approaches whereby statistical algorithms 'learn' from data on their own, without imposing a priori theories of system behavior.
arXiv Detail & Related papers (2022-06-02T15:19:09Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Integrating Expert ODEs into Neural ODEs: Pharmacology and Disease
Progression [71.7560927415706]
The latent hybridisation model (LHM) integrates a system of expert-designed ODEs with machine-learned Neural ODEs to fully describe the dynamics of the system.
We evaluate LHM on synthetic data as well as real-world intensive care data of COVID-19 patients.
arXiv Detail & Related papers (2021-06-05T11:42:45Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z) - Machine learning dynamics of phase separation in correlated electron
magnets [0.0]
We demonstrate machine-learning enabled large-scale dynamical simulations of electronic phase separation in a double-exchange system.
Our work paves the way for large-scale dynamical simulations of correlated electron systems using machine-learning models.
arXiv Detail & Related papers (2020-06-07T17:01:06Z) - Graph Neural Network for Hamiltonian-Based Material Property Prediction [56.94118357003096]
We present and compare several different graph convolution networks that are able to predict the band gap for inorganic materials.
The models are developed to incorporate two different features: information about each orbital itself and the interactions between orbitals.
The results show that our model achieves promising prediction accuracy under cross-validation.
arXiv Detail & Related papers (2020-05-27T13:32:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.