Edge-based Tensor prediction via graph neural networks
- URL: http://arxiv.org/abs/2201.05770v1
- Date: Sat, 15 Jan 2022 06:43:15 GMT
- Title: Edge-based Tensor prediction via graph neural networks
- Authors: Yang Zhong, Hongyu Yu, Xingao Gong, Hongjun Xiang
- Abstract summary: Message-passing neural networks (MPNN) have shown extremely high efficiency and accuracy in predicting the physical properties of molecules and crystals.
There is currently a lack of a general MPNN framework for directly predicting the tensor properties of crystals.
In this work, we directly designed the edge-based tensor prediction graph neural network (ETGNN) model on the basis of an invariant graph neural network to predict tensors.
- Score: 6.021787236982659
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Message-passing neural networks (MPNN) have shown extremely high efficiency
and accuracy in predicting the physical properties of molecules and crystals,
and are expected to become the next-generation material simulation tool after
density functional theory (DFT). However, there is currently a lack of a
general MPNN framework for directly predicting the tensor properties of
crystals. In this work, a general framework for the prediction of tensor
properties was proposed: the tensor property of a crystal can be decomposed
into the average of the tensor contributions of all the atoms in the crystal,
and the tensor contribution of each atom can be expanded as the sum of the
tensor projections in the directions of the edges connecting the atoms. On this
basis, the edge-based expansions of force vectors, Born effective charges
(BECs), dielectric (DL) and piezoelectric (PZ) tensors were proposed. These
expansions are rotationally equivariant, while the coefficients in these tensor
expansions are rotationally invariant scalars which are similar to physical
quantities such as formation energy and band gap. The advantage of this tensor
prediction framework is that it does not require the network itself to be
equivariant. Therefore, in this work, we directly designed the edge-based
tensor prediction graph neural network (ETGNN) model on the basis of an
invariant graph neural network to predict tensors. The validity and high
precision of this tensor prediction framework were demonstrated by tests of
ETGNN on extended systems, randomly perturbed structures, and the JARVIS-DFT
dataset. This tensor prediction framework applies to nearly all GNNs and can
achieve higher accuracy with more advanced GNNs in the future.
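As a concrete illustration, here is a minimal sketch of this edge-based expansion for a symmetric rank-2 tensor such as the dielectric tensor, assuming each edge projection is the outer product of the unit edge vector with itself; the function name is illustrative, periodic boundary conditions are ignored, and in ETGNN the scalar coefficients would come from an invariant GNN rather than being supplied directly.

```python
import numpy as np

def edge_based_tensor(positions, neighbors, coeffs):
    """Sketch of the edge-based expansion of a symmetric rank-2 tensor.

    positions : (N, 3) array of atomic positions
    neighbors : neighbors[i] lists the atoms j sharing an edge with atom i
    coeffs    : coeffs[(i, j)] is a rotationally invariant scalar; in ETGNN
                these would be predicted by an invariant GNN
    """
    n_atoms = len(positions)
    total = np.zeros((3, 3))
    for i in range(n_atoms):
        contrib = np.zeros((3, 3))  # tensor contribution of atom i
        for j in neighbors[i]:
            edge = positions[j] - positions[i]
            e_hat = edge / np.linalg.norm(edge)  # unit edge direction
            # Tensor projection in the direction of edge (i, j):
            contrib += coeffs[(i, j)] * np.outer(e_hat, e_hat)
        total += contrib
    return total / n_atoms  # average of the atomic contributions
```

Rotating all positions rotates every unit edge vector, so each outer product transforms as a rank-2 tensor while the scalar coefficients stay fixed; this is why the network producing the coefficients does not itself need to be equivariant.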
Related papers
- A Space Group Symmetry Informed Network for O(3) Equivariant Crystal Tensor Prediction [89.38877696273364]
We consider the prediction of general tensor properties of crystalline materials.
We propose a General Materials Network (GMTNet), which is carefully designed to satisfy the required symmetries.
Experimental results show that our GMTNet achieves promising performance on crystal tensors of various orders.
arXiv Detail & Related papers (2024-06-03T16:26:16Z)
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish theoretical guarantees on the optimality of alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
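As a rough, hedged illustration of a graph shift operator built from input-output cross-covariance (the symmetrization and normalization below are assumptions, not the paper's exact construction):

```python
import numpy as np

def cross_covariance_gso(X, Y):
    """Build a graph shift operator (GSO) from sample cross-covariance.

    X : (m, n) array, m samples of an n-node input graph signal
    Y : (m, n) array, the corresponding output signals
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    C = Xc.T @ Yc / X.shape[0]       # sample cross-covariance E[x y^T]
    S = 0.5 * (C + C.T)              # symmetrize so S can act as a GSO
    return S / np.linalg.norm(S, 2)  # scale the spectral norm to 1
```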
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
- StrainTensorNet: Predicting crystal structure elastic properties using SE(3)-equivariant graph neural networks [1.9260081982051918]
We introduce a novel data-driven approach to efficiently predict the elastic properties of crystal structures.
This approach yields important scalar elastic moduli with accuracy comparable to recent data-driven studies.
arXiv Detail & Related papers (2023-06-22T11:34:08Z)
- TensorNet: Cartesian Tensor Representations for Efficient Learning of Molecular Potentials [4.169915659794567]
We introduce TensorNet, an innovative O(3)-equivariant message-passing neural network architecture.
By using Cartesian tensor atomic embeddings, feature mixing is simplified through matrix product operations.
This enables accurate prediction of vector and tensor molecular quantities on top of potential energies and forces.
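A quick way to see why matrix products are a natural mixing operation for Cartesian rank-2 features: an O(3) transformation acts on such a feature as T -> R T R^T, and the matrix product commutes with that action because R^T R = I. A toy check (not TensorNet's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))  # two per-atom features stored as
Y = rng.standard_normal((3, 3))  # Cartesian rank-2 tensors (3x3 matrices)

# A random orthogonal matrix (an O(3) transformation) via QR decomposition.
R, _ = np.linalg.qr(rng.standard_normal((3, 3)))

rotate = lambda T: R @ T @ R.T  # how rank-2 Cartesian tensors transform

# Mixing by matrix product is equivariant: (R X R^T)(R Y R^T) = R (X Y) R^T.
assert np.allclose(rotate(X) @ rotate(Y), rotate(X @ Y))
```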
arXiv Detail & Related papers (2023-06-10T16:41:18Z)
- Isometric tensor network optimization for extensive Hamiltonians is free of barren plateaus [0.0]
We show that there are no barren plateaus in the energy optimization of isometric tensor network states (TNS).
TNS are a promising route for an efficient quantum-computation-based investigation of strongly-correlated quantum matter.
arXiv Detail & Related papers (2023-04-27T16:45:57Z)
- Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z)
- Equivariant message passing for the prediction of tensorial properties and molecular spectra [1.7188280334580197]
We propose the polarizable atom interaction neural network (PaiNN), which improves on previous networks across common molecule benchmarks.
We apply this to the simulation of molecular spectra, achieving speedups of 4-5 orders of magnitude compared to the electronic structure reference.
arXiv Detail & Related papers (2021-02-05T13:00:12Z)
- Tensor-Train Networks for Learning Predictive Modeling of Multidimensional Data [0.0]
A promising strategy is based on tensor networks, which have been very successful in physical and chemical applications.
We show that the weights of a multidimensional regression model can be learned by means of tensor networks, with the aim of obtaining a powerful, compact representation.
An algorithm based on alternating least squares has been proposed for approximating the weights in TT-format with reduced computational cost.
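For reference, here is a minimal sketch of how a weight tensor stored in TT-format is evaluated against a rank-one feature tensor; the core shapes and names are illustrative, and the ALS step itself (a least-squares solve per core with the others held fixed) is omitted.

```python
import numpy as np

def tt_predict(cores, feature_maps):
    """f(x) = <W, phi_1 (x) ... (x) phi_d> with W stored in TT-format.

    cores        : list of d TT cores, cores[k] has shape (r_k, n_k, r_{k+1})
                   with boundary ranks r_0 = r_d = 1
    feature_maps : list of d feature vectors, feature_maps[k] has length n_k
    """
    v = np.ones((1, 1))
    for G, phi in zip(cores, feature_maps):
        # Contract the core with its feature vector, then chain left to right.
        v = v @ np.einsum('rns,n->rs', G, phi)
    return float(v[0, 0])

# Example: 3 modes, 4 features per mode, TT ranks (1, 2, 2, 1).
cores = [np.ones((1, 4, 2)), np.ones((2, 4, 2)), np.ones((2, 4, 1))]
phis = [np.ones(4) for _ in range(3)]
print(tt_predict(cores, phis))  # 256.0
```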
arXiv Detail & Related papers (2021-01-22T16:14:38Z)
- T-Basis: a Compact Representation for Neural Networks [89.86997385827055]
We introduce T-Basis, a concept for a compact representation of a set of tensors, each of an arbitrary shape, as often arises in neural networks.
We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops.
arXiv Detail & Related papers (2020-07-13T19:03:22Z)
- Lorentz Group Equivariant Neural Network for Particle Physics [58.56031187968692]
We present a neural network architecture that is fully equivariant with respect to transformations under the Lorentz group.
For classification tasks in particle physics, we demonstrate that such an equivariant architecture leads to drastically simpler models that have relatively few learnable parameters.
arXiv Detail & Related papers (2020-06-08T17:54:43Z)
- Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks, but explicit interaction terms quickly become intractable.
To alleviate this issue, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
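By contrast with the TT sketch above, here is a hedged sketch of prediction with a weight tensor in canonical polyadic (CP) format, where the score reduces to per-mode inner products (shapes and names are illustrative):

```python
import numpy as np

def cp_predict(factors, feature_maps):
    """f(x) = sum_r prod_k <factors[k][:, r], feature_maps[k]>.

    factors      : list of d factor matrices, factors[k] has shape (n_k, R)
    feature_maps : list of d feature vectors, feature_maps[k] has length n_k
    """
    prods = np.ones(factors[0].shape[1])  # one running product per rank-1 term
    for A, phi in zip(factors, feature_maps):
        prods *= A.T @ phi  # per-mode inner products
    return float(prods.sum())
```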
arXiv Detail & Related papers (2020-01-27T22:38:40Z)