Persistent homology-based descriptor for machine-learning potential of
amorphous structures
- URL: http://arxiv.org/abs/2206.13727v3
- Date: Mon, 22 May 2023 02:11:59 GMT
- Title: Persistent homology-based descriptor for machine-learning potential of
amorphous structures
- Authors: Emi Minamitani, Ippei Obayashi, Koji Shimizu, Satoshi Watanabe
- Abstract summary: High-accuracy prediction of the physical properties of amorphous materials is challenging in condensed-matter physics.
A promising method to achieve this is machine-learning potentials, which are an alternative to computationally demanding ab initio calculations.
We propose a novel descriptor based on a persistence diagram (PD), a two-dimensional representation of persistent homology (PH).
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: High-accuracy prediction of the physical properties of amorphous materials is
challenging in condensed-matter physics. A promising method to achieve this is
machine-learning potentials, which are an alternative to computationally
demanding ab initio calculations. When applying machine-learning potentials,
the construction of descriptors to represent atomic configurations is crucial.
These descriptors should be invariant to symmetry operations. Handcrafted
representations such as the smooth overlap of atomic positions (SOAP), as well as
graph neural networks (GNNs), are examples of methods used for constructing symmetry-invariant
descriptors. In this study, we propose a novel descriptor based on a
persistence diagram (PD), a two-dimensional representation of persistent
homology (PH). First, we demonstrated that the normalized two-dimensional
histogram obtained from PD could predict the average energy per atom of
amorphous carbon (aC) at various densities, even when using a simple model.
Second, an analysis of the dimensional reduction results of the descriptor
spaces revealed that PH can be used to construct descriptors with
characteristics similar to those of a latent space in a GNN. These results
indicate that PH is a promising method for constructing descriptors suitable
for machine-learning potentials without hyperparameter tuning and deep-learning
techniques.
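To make the pipeline above concrete, the sketch below computes a persistence diagram from a set of atomic coordinates, bins its first-homology pairs into a normalized two-dimensional histogram, and fits a simple linear model to per-atom energies. It is a minimal sketch under stated assumptions, not the authors' implementation: the GUDHI alpha complex, the bin count, the histogram range, the ridge regressor, and the placeholder arrays `structures` and `energies` are all illustrative choices.

```python
# Minimal sketch (not the paper's code): persistence-diagram histogram
# descriptor for per-atom energy regression. Assumes GUDHI, NumPy and
# scikit-learn; `structures` and `energies` are hypothetical placeholders.
import numpy as np
import gudhi
from sklearn.linear_model import Ridge

def pd_histogram(points, bins=32, value_range=((0.0, 10.0), (0.0, 10.0))):
    """Normalized 2D histogram of the H1 persistence diagram of `points`."""
    st = gudhi.AlphaComplex(points=points).create_simplex_tree()
    st.persistence()                                  # compute all persistence pairs
    pairs = st.persistence_intervals_in_dimension(1)  # rows of (birth, death)
    if len(pairs) == 0:
        return np.zeros(bins * bins)
    pairs = pairs[np.isfinite(pairs).all(axis=1)]     # drop infinite bars, if any
    # Note: GUDHI alpha-complex filtration values are squared circumradii.
    hist, _, _ = np.histogram2d(pairs[:, 0], pairs[:, 1],
                                bins=bins, range=value_range)
    total = hist.sum()
    return (hist / total).ravel() if total > 0 else hist.ravel()

# Hypothetical data: coordinate arrays of shape (N_atoms, 3) and the
# corresponding average energies per atom.
structures = [np.random.rand(64, 3) * 10.0 for _ in range(20)]
energies = np.random.rand(20)

X = np.stack([pd_histogram(s) for s in structures])
model = Ridge(alpha=1e-3).fit(X, energies)
print("train R^2:", model.score(X, energies))
```

Because the persistence diagram depends only on interatomic geometry, the resulting histogram is unchanged under rotations and translations of the input coordinates, which is the symmetry invariance required of a descriptor.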
Related papers
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Oracle-Preserving Latent Flows
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Transferable E(3) equivariant parameterization for Hamiltonian of molecules and solids [5.512295869673147]
We develop an E(3) equivariant neural network called HamNet to predict the ab initio tight-binding Hamiltonian of molecules and solids.
The proposed framework provides a general transferable model for accelerating electronic structure calculations.
arXiv Detail & Related papers (2022-10-28T14:56:24Z)
- Atomic structure generation from reconstructing structural fingerprints [1.2128971613239876]
We present an end-to-end structure generation approach using atom-centered symmetry functions as the representation and conditional variational autoencoders as the generative model.
We are able to successfully generate novel and valid atomic structures of sub-nanometer Pt nanoparticles as a proof of concept.
arXiv Detail & Related papers (2022-07-27T00:42:59Z)
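The entry just above uses atom-centered symmetry functions as its input representation. For reference only, here is a minimal sketch of a Behler-Parrinello-style radial symmetry function (G2) with a cosine cutoff; the parameters eta, r_s, r_c and the `positions` array are hypothetical example values, and the conditional-VAE generative part of that paper is not reproduced.

```python
# Minimal sketch of a Behler-Parrinello-type radial symmetry function (G2).
# Parameters (eta, r_s, r_c) and the coordinates are hypothetical examples.
import numpy as np

def cutoff(r, r_c):
    """Cosine cutoff: decays smoothly to zero at r = r_c."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def g2(positions, i, eta=0.5, r_s=0.0, r_c=6.0):
    """G2 value for atom i: sum over all neighbors j != i."""
    r = np.linalg.norm(positions - positions[i], axis=1)
    r = np.delete(r, i)                        # exclude the atom itself
    return np.sum(np.exp(-eta * (r - r_s) ** 2) * cutoff(r, r_c))

positions = np.random.rand(30, 3) * 8.0        # hypothetical coordinates (angstroms)
descriptor = np.array([g2(positions, i) for i in range(len(positions))])
print(descriptor.shape)                        # one G2 value per atom
```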
- Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z)
- A deep learning driven pseudospectral PCE based FFT homogenization algorithm for complex microstructures [68.8204255655161]
It is shown that the proposed method is able to predict central moments of interest while being orders of magnitude faster to evaluate than traditional approaches.
arXiv Detail & Related papers (2021-10-26T07:02:14Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ).
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z)
- Optimal radial basis for density-based atomic representations [58.720142291102135]
We discuss how to build an adaptive, optimal numerical basis that is chosen to represent most efficiently the structural diversity of the dataset at hand.
For each training dataset, this optimal basis is unique, and can be computed at no additional cost with respect to the primitive basis.
We demonstrate that this construction yields representations that are accurate and computationally efficient.
arXiv Detail & Related papers (2021-05-18T17:57:08Z)
- OrbNet: Deep Learning for Quantum Chemistry Using Symmetry-Adapted Atomic-Orbital Features [42.96944345045462]
OrbNet is shown to outperform existing methods in terms of learning efficiency and transferability.
For applications to datasets of drug-like molecules, OrbNet predicts energies within chemical accuracy of DFT at a computational cost that is reduced a thousand-fold or more.
arXiv Detail & Related papers (2020-07-15T22:38:41Z)
- Lorentz Group Equivariant Neural Network for Particle Physics [58.56031187968692]
We present a neural network architecture that is fully equivariant with respect to transformations under the Lorentz group.
For classification tasks in particle physics, we demonstrate that such an equivariant architecture leads to drastically simpler models that have relatively few learnable parameters.
arXiv Detail & Related papers (2020-06-08T17:54:43Z)
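As a concrete footnote to the last entry, the following small check (an illustration, not that paper's architecture) verifies numerically that the Minkowski inner product of a four-momentum is preserved under a Lorentz boost; the boost velocity and the four-momentum are arbitrary example values.

```python
# Numerical check that the Minkowski norm p^mu p_mu is invariant under a
# Lorentz boost; illustrative only, not the paper's network.
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])   # Minkowski metric, signature (+, -, -, -)

def boost_x(beta):
    """Lorentz boost along the x-axis with velocity beta (in units of c)."""
    gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

p = np.array([10.0, 3.0, 2.0, 1.0])      # example four-momentum (E, px, py, pz)
Lp = boost_x(0.6) @ p

print(p @ eta @ p, Lp @ eta @ Lp)        # both equal 86.0 up to rounding
```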
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.