Tensor-reduced atomic density representations
- URL: http://arxiv.org/abs/2210.01705v1
- Date: Sun, 2 Oct 2022 01:08:50 GMT
- Title: Tensor-reduced atomic density representations
- Authors: James P. Darby, Dávid P. Kovács, Ilyes Batatia, Miguel A. Caro,
Gus L. W. Hart, Christoph Ortner and Gábor Csányi
- Abstract summary: Graph neural networks escape scaling by mapping chemical element information into a fixed dimensional space in a learnable way.
We recast this approach as tensor factorisation by exploiting the tensor structure of standard neighbour density based descriptors.
In doing so, we form compact tensor-reduced representations whose size does not depend on the number of chemical elements.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Density based representations of atomic environments that are invariant under
Euclidean symmetries have become a widely used tool in the machine learning of
interatomic potentials, broader data-driven atomistic modelling and the
visualisation and analysis of materials datasets. The standard mechanism used to
incorporate chemical element information is to create separate densities for
each element and form tensor products between them. This leads to a steep
scaling in the size of the representation as the number of elements increases.
Graph neural networks, which do not explicitly use density representations,
escape this scaling by mapping the chemical element information into a fixed
dimensional space in a learnable way. We recast this approach as tensor
factorisation by exploiting the tensor structure of standard neighbour density
based descriptors. In doing so, we form compact tensor-reduced representations
whose size does not depend on the number of chemical elements, but remain
systematically convergeable and are therefore applicable to a wide range of
data analysis and regression tasks.
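The core idea above can be illustrated with a minimal sketch: instead of carrying one density channel per chemical element (whose tensor products blow up with the number of species), the element axis is contracted with a learnable embedding so the representation size is fixed. All dimensions and the embedding matrix `W` below are hypothetical placeholders, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_elements = 10   # number of chemical species (illustrative)
n_radial = 8      # radial basis functions (illustrative)
k = 4             # reduced embedding dimension, k << n_elements

# Standard approach: one density channel per element, so descriptor
# blocks scale as n_elements * n_radial (and worse at higher body order).
per_element_density = rng.normal(size=(n_elements, n_radial))  # size 80

# Tensor-reduced approach (sketch): contract the element axis with a
# learnable embedding W, giving a size of k * n_radial that no longer
# depends on how many elements appear in the dataset.
W = rng.normal(size=(k, n_elements))       # learnable element embedding
reduced_density = W @ per_element_density  # shape (k, n_radial), size 32
```

Adding an eleventh element enlarges `W` by one column but leaves `reduced_density`, and everything built on top of it, the same size.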
Related papers
- Similarity Equivariant Graph Neural Networks for Homogenization of Metamaterials [3.6443770850509423]
Soft, porous mechanical metamaterials exhibit pattern transformations that may have important applications in soft robotics, sound reduction and biomedicine.
We develop a machine learning-based approach that scales favorably to serve as a surrogate model.
We show that this network is more accurate and data-efficient than graph neural networks with fewer symmetries.
arXiv Detail & Related papers (2024-04-26T12:30:32Z)
- Cartesian atomic cluster expansion for machine learning interatomic potentials [0.0]
Machine learning interatomic potentials are revolutionizing atomistic modelling in material science and chemistry.
We propose a Cartesian-coordinates-based atomic density expansion that exhibits good accuracy, stability, and generalizability.
We validate its performance in diverse systems, including bulk water, small molecules, and 25-element high-entropy alloys.
arXiv Detail & Related papers (2024-02-12T08:17:23Z)
- Datacube segmentation via Deep Spectral Clustering [76.48544221010424]
Extended Vision techniques often pose a challenge in their interpretation.
The huge dimensionality of data cube spectra poses a complex task in its statistical interpretation.
In this paper, we explore the possibility of applying unsupervised clustering methods in encoded space.
A statistical dimensional reduction is performed by an ad hoc trained (Variational) AutoEncoder, while the clustering process is performed by a (learnable) iterative K-Means clustering algorithm.
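The clustering step described above can be sketched with plain iterative K-Means (Lloyd's algorithm) applied to encoded features. The autoencoder itself is omitted here; the synthetic latent vectors, their dimensionality, and the deterministic initialisation are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for autoencoder-encoded datacube spectra: two well-separated
# blobs in a 4-dimensional latent space (the encoder is omitted).
z = np.vstack([rng.normal(0.0, 0.3, size=(50, 4)),
               rng.normal(3.0, 0.3, size=(50, 4))])

def kmeans(z, k, n_iter=20):
    """Plain iterative K-Means (Lloyd's algorithm) on encoded features."""
    # Deterministic init for reproducibility: k points spread across the data.
    centroids = z[np.linspace(0, len(z) - 1, k).astype(int)].copy()
    for _ in range(n_iter):
        # Assign each point to its nearest centroid.
        dists = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = z[labels == j].mean(axis=0)
    return labels, centroids

labels, _ = kmeans(z, k=2)
```

On this toy latent space the two blobs separate cleanly; the paper's "learnable" variant would additionally backpropagate through the assignment step, which this sketch does not attempt.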
arXiv Detail & Related papers (2024-01-31T09:31:28Z)
- Towards Symmetry-Aware Generation of Periodic Materials [64.21777911715267]
We propose SyMat, a novel material generation approach that can capture physical symmetries of periodic material structures.
SyMat generates atom types and lattices of materials through generating atom type sets, lattice lengths and lattice angles with a variational auto-encoder model.
We show that SyMat is theoretically invariant to all symmetry transformations on materials and demonstrate that SyMat achieves promising performance on random generation and property optimization tasks.
arXiv Detail & Related papers (2023-07-06T01:05:34Z)
- TensorNet: Cartesian Tensor Representations for Efficient Learning of Molecular Potentials [4.169915659794567]
We introduce TensorNet, an innovative O(3)-equivariant message-passing neural network architecture.
By using tensor atomic embeddings, feature mixing is simplified through matrix product operations.
The accurate prediction of vector and tensor molecular quantities on top of potential energies and forces is possible.
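Why matrix products are a natural mixing operation for Cartesian tensor features: a rank-2 tensor feature transforms under a rotation as X → QXQᵀ, and the matrix product of two such features transforms the same way, so mixing preserves equivariance. The check below is a generic illustration of that algebraic fact, not TensorNet's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(3)

# Random proper rotation via QR decomposition of a random matrix.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1  # enforce det(Q) = +1

# Two rank-2 Cartesian tensor features, transforming as X -> Q X Q^T.
X = rng.normal(size=(3, 3))
Y = rng.normal(size=(3, 3))

# Mixing by matrix product commutes with rotation:
# (Q X Q^T)(Q Y Q^T) = Q (X Y) Q^T, since Q^T Q = I.
mixed_then_rotated = Q @ (X @ Y) @ Q.T
rotated_then_mixed = (Q @ X @ Q.T) @ (Q @ Y @ Q.T)

assert np.allclose(mixed_then_rotated, rotated_then_mixed)
```

This is what lets such architectures mix features with cheap dense matrix operations instead of Clebsch-Gordan contractions over spherical harmonics.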
arXiv Detail & Related papers (2023-06-10T16:41:18Z)
- Atomic and Subgraph-aware Bilateral Aggregation for Molecular Representation Learning [57.670845619155195]
We introduce a new model for molecular representation learning called the Atomic and Subgraph-aware Bilateral Aggregation (ASBA).
ASBA addresses the limitations of previous atom-wise and subgraph-wise models by incorporating both types of information.
Our method offers a more comprehensive way to learn representations for molecular property prediction and has broad potential in drug and material discovery applications.
arXiv Detail & Related papers (2023-05-22T00:56:00Z)
- Wigner kernels: body-ordered equivariant machine learning without a basis [0.0]
We propose a novel density-based method which involves computing "Wigner kernels".
Wigner kernels are fully equivariant and body-ordered kernels that can be computed iteratively with a cost that is independent of the radial-chemical basis.
We present several examples of the accuracy of models based on Wigner kernels in chemical applications.
arXiv Detail & Related papers (2023-03-07T18:34:55Z)
- Optimal radial basis for density-based atomic representations [58.720142291102135]
We discuss how to build an adaptive, optimal numerical basis that is chosen to represent most efficiently the structural diversity of the dataset at hand.
For each training dataset, this optimal basis is unique, and can be computed at no additional cost with respect to the primitive basis.
We demonstrate that this construction yields representations that are accurate and computationally efficient.
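One common way to realise a dataset-adapted basis of the kind described above is a principal-component analysis of the expansion coefficients on a primitive basis: the leading eigenvectors of the coefficient covariance define the linear combinations that capture most of the dataset's structural variance. The sketch below assumes this PCA-style construction and uses made-up coefficient data and dimensions purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical expansion coefficients of many atomic environments on a
# primitive radial basis (e.g. Gaussians): shape (n_samples, n_primitive).
coeffs = rng.normal(size=(500, 12)) @ rng.normal(size=(12, 12))

# Eigendecomposition of the coefficient covariance gives a data-adapted
# basis; the leading eigenvectors are the most informative combinations.
cov = np.cov(coeffs, rowvar=False)            # shape (12, 12)
eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
U = eigvecs[:, order[:4]]                     # keep the top 4 directions

optimal_coeffs = coeffs @ U                   # compact, dataset-specific
```

Because the covariance is built from coefficients that are computed anyway, the adapted basis comes at essentially no extra cost relative to the primitive expansion, matching the claim in the summary.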
arXiv Detail & Related papers (2021-05-18T17:57:08Z)
- The role of feature space in atomistic learning [62.997667081978825]
Physically-inspired descriptors play a key role in the application of machine-learning techniques to atomistic simulations.
We introduce a framework to compare different sets of descriptors, and different ways of transforming them by means of metrics and kernels.
We compare representations built in terms of n-body correlations of the atom density, quantitatively assessing the information loss associated with the use of low-order features.
arXiv Detail & Related papers (2020-09-06T14:12:09Z)
- Graph Neural Network for Hamiltonian-Based Material Property Prediction [56.94118357003096]
We present and compare several different graph convolution networks that are able to predict the band gap for inorganic materials.
The models are developed to incorporate two different features: the information of each orbital itself and the interaction between each other.
The results show that our model can get a promising prediction accuracy with cross-validation.
arXiv Detail & Related papers (2020-05-27T13:32:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.