Machine-learning-accelerated simulations to enable automatic surface
reconstruction
- URL: http://arxiv.org/abs/2305.07251v2
- Date: Tue, 21 Nov 2023 16:26:36 GMT
- Authors: Xiaochen Du, James K. Damewood, Jaclyn R. Lunger, Reisel Millan, Bilge
Yildiz, Lin Li and Rafael Gómez-Bombarelli
- Abstract summary: Ab initio simulations can in principle predict the structure of material surfaces as a function of thermodynamic variables.
Here, we present a bi-faceted computational loop to predict surface phase diagrams of multi-component materials.
- Score: 2.9599032866864654
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding material surfaces and interfaces is vital in applications like
catalysis or electronics. By combining energies from electronic structure with
statistical mechanics, ab initio simulations can in principle predict the
structure of material surfaces as a function of thermodynamic variables.
However, accurate energy simulations are prohibitive when coupled to the vast
phase space that must be statistically sampled. Here, we present a bi-faceted
computational loop to predict surface phase diagrams of multi-component
materials that accelerates both the energy scoring and statistical sampling
methods. Fast, scalable, and data-efficient machine learning interatomic
potentials are trained on high-throughput density-functional theory
calculations through closed-loop active learning. Markov-chain Monte Carlo
sampling in the semi-grand canonical ensemble is enabled by using virtual
surface sites. The predicted surfaces for GaN(0001), Si(111), and SrTiO3(001)
are in agreement with past work and suggest that the proposed strategy can
model complex material surfaces and discover previously unreported surface
terminations.
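The sampling step described above can be illustrated with a minimal sketch. This is a toy Metropolis Monte Carlo loop in the semi-grand canonical ensemble over "virtual" surface sites, where each site holds a species label or a vacancy marker; the surrogate energy function, species names, and chemical-potential values are all hypothetical stand-ins (a real implementation would score configurations with the trained machine-learning interatomic potential, not this placeholder):

```python
import math
import random

# Toy semi-grand canonical Metropolis sketch. Each of N virtual surface
# sites holds a species label; "X" marks a vacant site. All energies,
# species, and chemical potentials below are illustrative, not physical.

def toy_energy(config):
    # Hypothetical stand-in for an MLIP: penalize adjacent identical
    # (non-vacant) adsorbates on a 1D chain of sites.
    return sum(1.0 for a, b in zip(config, config[1:]) if a == b != "X")

def sgc_mc(config, mu, kT, steps, rng=random.Random(0)):
    """Propose changing one site's species identity; accept with
    probability min(1, exp(-(dE - dmu) / kT)), where dmu is the
    chemical-potential change of the swapped species."""
    species = list(mu)
    energy = toy_energy(config)
    for _ in range(steps):
        i = rng.randrange(len(config))
        old, new = config[i], rng.choice(species)
        if new == old:
            continue
        trial = config[:i] + [new] + config[i + 1:]
        d_e = toy_energy(trial) - energy
        d_mu = mu[new] - mu[old]  # semi-grand canonical exchange term
        if d_e - d_mu <= 0 or rng.random() < math.exp(-(d_e - d_mu) / kT):
            config, energy = trial, energy + d_e
    return config

sites = ["X"] * 12                       # start from a bare surface
mu = {"X": 0.0, "Ga": 0.3, "N": 0.1}     # hypothetical chemical potentials
final = sgc_mc(sites, mu, kT=0.05, steps=2000)
print(final)
```

Sweeping the chemical potentials and recording the equilibrated compositions is what would trace out a surface phase diagram; the paper's virtual-site construction lets the same fixed set of sites represent surfaces with varying stoichiometry.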
Related papers
- Accelerating the prediction of inorganic surfaces with machine learning
interatomic potentials [0.0]
This review focuses on the application of machine learning, predominantly in the form of learned interatomic potentials, to study complex surfaces.
As machine learning algorithms and large datasets on which to train them become more commonplace in materials science, computational methods are poised to become even more predictive and powerful for modeling the complexities of inorganic surfaces at the nanoscale.
arXiv Detail & Related papers (2023-12-18T21:08:13Z) - Discovering Interpretable Physical Models using Symbolic Regression and
Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - Machine learning of hidden variables in multiscale fluid simulation [77.34726150561087]
Solving fluid dynamics equations often requires the use of closure relations that account for missing microphysics.
In our study, a partial differential equation simulator that is end-to-end differentiable is used to train judiciously placed neural networks.
We show that this method enables an equation-based approach to reproduce non-linear, large-Knudsen-number plasma physics.
arXiv Detail & Related papers (2023-06-19T06:02:53Z) - Reliable machine learning potentials based on artificial neural network
for graphene [2.115174610040722]
The special 2D structure of graphene enables it to exhibit a wide range of peculiar material properties.
Molecular dynamics (MD) simulations are widely adopted for understanding the microscopic origins of these properties.
An artificial neural network based interatomic potential has been developed for graphene to represent the potential energy surface.
arXiv Detail & Related papers (2023-06-12T17:12:08Z) - Towards Complex Dynamic Physics System Simulation with Graph Neural ODEs [75.7104463046767]
This paper proposes a novel learning based simulation model that characterizes the varying spatial and temporal dependencies in particle systems.
We empirically evaluate GNSTODE's simulation performance on two real-world particle systems, Gravity and Coulomb.
arXiv Detail & Related papers (2023-05-21T03:51:03Z) - Ab initio electron-lattice downfolding: potential energy landscapes,
anharmonicity, and molecular dynamics in charge density wave materials [0.0]
Computational challenges arise especially for large systems, long time scales, in nonequilibrium, or in systems with strong correlations.
We show how downfolding approaches facilitate complexity reduction on the electronic side and thereby boost the simulation of electronic properties and nuclear motion.
arXiv Detail & Related papers (2023-03-13T16:41:37Z) - Physical Systems Modeled Without Physical Laws [0.0]
Tree-based machine learning methods can emulate desired outputs without "knowing" the complex physics underlying the simulations.
We focus on predicting spatial-temporal data between two simulation outputs and on increasing spatial resolution to generalize physics predictions to finer test grids without the computational cost of repeating the numerical calculation.
arXiv Detail & Related papers (2022-07-26T20:51:20Z) - Learning Large-scale Subsurface Simulations with a Hybrid Graph Network
Simulator [57.57321628587564]
We introduce Hybrid Graph Network Simulator (HGNS) for learning reservoir simulations of 3D subsurface fluid flows.
HGNS consists of a subsurface graph neural network (SGNN) to model the evolution of fluid flows, and a 3D-U-Net to model the evolution of pressure.
Using an industry-standard subsurface flow dataset (SPE-10) with 1.1 million cells, we demonstrate that HGNS reduces inference time by up to a factor of 18 compared to standard subsurface simulators.
arXiv Detail & Related papers (2022-06-15T17:29:57Z) - Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep Bayesian active learning framework for stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP based models.
The results demonstrate STNP outperforms the baselines in the learning setting and LIG achieves the state-of-the-art for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z) - Machine learning for rapid discovery of laminar flow channel wall
modifications that enhance heat transfer [56.34005280792013]
We present a combination of accurate numerical simulations of arbitrary, flat, and non-flat channels and machine learning models predicting drag coefficient and Stanton number.
We show that convolutional neural networks (CNN) can accurately predict the target properties at a fraction of the time of numerical simulations.
arXiv Detail & Related papers (2021-01-19T16:14:02Z) - Machine learning dynamics of phase separation in correlated electron
magnets [0.0]
We demonstrate machine-learning enabled large-scale dynamical simulations of electronic phase separation in a double-exchange system.
Our work paves the way for large-scale dynamical simulations of correlated electron systems using machine-learning models.
arXiv Detail & Related papers (2020-06-07T17:01:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.