Towards End-to-End Structure Solutions from Information-Compromised
Diffraction Data via Generative Deep Learning
- URL: http://arxiv.org/abs/2312.15136v1
- Date: Sat, 23 Dec 2023 02:17:27 GMT
- Authors: Gabe Guo, Judah Goldfeder, Ling Lan, Aniv Ray, Albert Hanming Yang,
Boyuan Chen, Simon JL Billinge, Hod Lipson
- Abstract summary: Machine learning (ML) and deep learning (DL) are promising approaches since they augment information in the degraded input signal with prior knowledge learned from large databases of already known structures.
Here we present a novel ML approach, a variational query-based multi-branch deep neural network that has the promise to be a robust but general tool to address this problem end-to-end.
The system achieves up to $93.4\%$ average similarity with the ground truth on unseen materials, both with known and partially-known chemical composition information.
- Score: 6.617784410952713
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The revolution in materials in the past century was built on a knowledge of
the atomic arrangements and the structure-property relationship. The sine qua
non for obtaining quantitative structural information is single crystal
crystallography. However, increasingly we need to solve structures in cases
where the information content in our input signal is significantly degraded,
for example, due to orientational averaging of grains, finite size effects due
to nanostructure, and mixed signals due to sample heterogeneity. Understanding
the structure-property relationships in such situations is, if anything, more
important and insightful, yet we do not have robust approaches for
accomplishing it. In principle, machine learning (ML) and deep learning (DL)
are promising approaches since they augment information in the degraded input
signal with prior knowledge learned from large databases of already known
structures. Here we present a novel ML approach, a variational query-based
multi-branch deep neural network that has the promise to be a robust but
general tool to address this problem end-to-end. We demonstrate the approach on
computed powder x-ray diffraction (PXRD), along with partial chemical
composition information, as input. We choose as a structural representation a
modified electron density we call the Cartesian mapped electron density (CMED),
which straightforwardly allows our ML model to learn material structures across
different chemistries, symmetries and crystal systems. When evaluated on
theoretically simulated data for the cubic and trigonal crystal systems, the
system achieves up to $93.4\%$ average similarity with the ground truth on
unseen materials, both with known and partially-known chemical composition
information, showing great promise for successful structure solution even from
degraded and incomplete input data.
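The digest does not define the similarity measure behind the $93.4\%$ figure. As a minimal, hedged sketch, one generic way to compare a predicted electron-density grid against the ground truth is voxel-wise cosine similarity; the function name and the toy grids below are illustrative, not the paper's code or metric:

```python
import numpy as np


def density_similarity(pred: np.ndarray, truth: np.ndarray) -> float:
    """Cosine similarity between two flattened density grids.

    A generic stand-in metric; the paper's exact similarity
    measure is not specified in this digest.
    """
    p = pred.ravel().astype(float)
    t = truth.ravel().astype(float)
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    if denom == 0.0:
        return 0.0
    return float(np.dot(p, t) / denom)


# Toy example: two noisy versions of the same density grid.
rng = np.random.default_rng(0)
truth = rng.random((16, 16, 16))
pred = truth + 0.05 * rng.standard_normal(truth.shape)
score = density_similarity(pred, truth)
assert 0.9 < score <= 1.0
```

Any other grid-comparison measure (e.g., one minus a normalized error) would fit the same slot; the point is only that "similarity" here compares two volumetric structural representations.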
Related papers
- Ab Initio Structure Solutions from Nanocrystalline Powder Diffraction Data [4.463003012243322]
A major challenge in materials science is the determination of the structure of nanometer sized objects.
We present a novel approach that uses a generative machine learning model based on diffusion processes that is trained on 45,229 known structures.
We find that our model, PXRDnet, can successfully solve simulated nanocrystals as small as 10 angstroms across 200 materials of varying symmetry and complexity.
arXiv Detail & Related papers (2024-06-16T03:45:03Z)
- Crystalformer: Infinitely Connected Attention for Periodic Structure Encoding [10.170537065646323]
Predicting physical properties of materials from their crystal structures is a fundamental problem in materials science.
Crystal structures are infinitely repeating, periodic arrangements of atoms, so fully connected attention over them becomes infinitely connected attention.
We propose a simple yet effective Transformer-based encoder architecture for crystal structures called Crystalformer.
arXiv Detail & Related papers (2024-03-18T11:37:42Z)
- Assessing Neural Network Representations During Training Using Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- Stoichiometry Representation Learning with Polymorphic Crystal Structures [54.65985356122883]
Stoichiometry descriptors can reveal the ratio of the elements that form a compound without any structural information.
We propose PolySRL, which learns the probabilistic representation of stoichiometry by utilizing the readily available structural information.
arXiv Detail & Related papers (2023-11-17T20:34:28Z)
- Data Distillation for Neural Network Potentials toward Foundational Dataset [6.373914211316965]
Generative models can swiftly propose promising materials for targeted applications.
However, the properties predicted by generative models often do not match those calculated through ab initio methods.
This study utilized extended ensemble molecular dynamics (MD) to secure a broad range of liquid- and solid-phase configurations in one of the metallic systems, nickel.
We found that the neural network potential (NNP) trained on the distilled data could predict different energy-minimized close-packed crystal structures even though those structures were not explicitly part of the initial data.
arXiv Detail & Related papers (2023-11-09T14:41:45Z)
- Scalable Diffusion for Materials Generation [99.71001883652211]
We develop UniMat, a unified crystal representation that can represent any crystal structure.
UniMat can generate high fidelity crystal structures from larger and more complex chemical systems.
We propose additional metrics for evaluating generative models of materials.
arXiv Detail & Related papers (2023-10-18T15:49:39Z)
- Geometric Transformer for End-to-End Molecule Properties Prediction [92.28929858529679]
We introduce a Transformer-based architecture for molecule property prediction, which is able to capture the geometry of the molecule.
We replace the classical positional encoder with an initial encoding of the molecule geometry and add a learned gated self-attention mechanism.
arXiv Detail & Related papers (2021-10-26T14:14:40Z)
- A deep learning driven pseudospectral PCE based FFT homogenization algorithm for complex microstructures [68.8204255655161]
It is shown that the proposed method is able to predict central moments of interest while being orders of magnitude faster to evaluate than traditional approaches.
arXiv Detail & Related papers (2021-10-26T07:02:14Z)
- Analogical discovery of disordered perovskite oxides by crystal structure information hidden in unsupervised material fingerprints [1.7883499160092873]
We show that an unsupervised deep learning strategy can find fingerprints of disordered materials that embed perovskite formability and underlying crystal structure information.
This phenomenon can be capitalized to predict the crystal symmetry of experimental compositions, outperforming several supervised machine learning (ML) algorithms.
The search space of unstudied perovskites is screened from 600,000 feasible compounds, using ML models powered by experimental data and automated web-mining tools, at a 94% success rate.
arXiv Detail & Related papers (2021-05-25T12:25:53Z)
- Meshless physics-informed deep learning method for three-dimensional solid mechanics [0.0]
Deep learning and the collocation method are merged and used to solve partial differential equations describing structures' deformation.
We consider different types of materials: linear elasticity, hyperelasticity (neo-Hookean) with large deformation, and von Mises plasticity with isotropic and kinematic hardening.
arXiv Detail & Related papers (2020-12-02T21:40:37Z)
- Multilinear Compressive Learning with Prior Knowledge [106.12874293597754]
The Multilinear Compressive Learning (MCL) framework combines Multilinear Compressive Sensing and Machine Learning into an end-to-end system.
The key idea behind MCL is the assumption that there exists a tensor subspace which can capture the essential features of the signal for the downstream learning task.
In this paper, we propose a novel solution to the question of how to find tensor subspaces in which the signals of interest are highly separable.
arXiv Detail & Related papers (2020-02-17T19:06:05Z)
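The tensor-subspace idea behind MCL can be sketched as a sequence of mode-wise projections. The following is a minimal illustration with made-up dimensions and random projection matrices, not the authors' implementation:

```python
import numpy as np


def mode_n_product(tensor: np.ndarray, matrix: np.ndarray, mode: int) -> np.ndarray:
    """Multiply `tensor` by `matrix` along axis `mode` (mode-n product)."""
    # Contract the chosen tensor axis with the matrix columns,
    # then move the new axis back into place.
    return np.moveaxis(np.tensordot(tensor, matrix, axes=(mode, 1)), -1, mode)


def multilinear_compress(x: np.ndarray, factors: list) -> np.ndarray:
    """Project x into a smaller tensor subspace, one mode at a time."""
    y = x
    for mode, u in enumerate(factors):
        y = mode_n_product(y, u, mode)
    return y


rng = np.random.default_rng(0)
x = rng.random((32, 32, 3))          # e.g. a small RGB image tensor
factors = [rng.random((8, 32)),      # compress height   32 -> 8
           rng.random((8, 32)),      # compress width    32 -> 8
           rng.random((2, 3))]       # compress channels  3 -> 2
y = multilinear_compress(x, factors)
assert y.shape == (8, 8, 2)
```

In MCL the factor matrices are learned jointly with the downstream task rather than drawn at random; the sketch only shows the multilinear projection structure itself.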
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.