TensorNet: Cartesian Tensor Representations for Efficient Learning of
Molecular Potentials
- URL: http://arxiv.org/abs/2306.06482v2
- Date: Mon, 30 Oct 2023 09:15:00 GMT
- Title: TensorNet: Cartesian Tensor Representations for Efficient Learning of
Molecular Potentials
- Authors: Guillem Simeon, Gianni de Fabritiis
- Abstract summary: We introduce TensorNet, an innovative O(3)-equivariant message-passing neural network architecture.
By using tensor atomic embeddings, feature mixing is simplified through matrix product operations.
The accurate prediction of vector and tensor molecular quantities on top of potential energies and forces is possible.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The development of efficient machine learning models for molecular systems
representation is becoming crucial in scientific research. We introduce
TensorNet, an innovative O(3)-equivariant message-passing neural network
architecture that leverages Cartesian tensor representations. By using
Cartesian tensor atomic embeddings, feature mixing is simplified through matrix
product operations. Furthermore, the cost-effective decomposition of these
tensors into rotation group irreducible representations allows for the separate
processing of scalars, vectors, and tensors when necessary. Compared to
higher-rank spherical tensor models, TensorNet demonstrates state-of-the-art
performance with significantly fewer parameters. For small molecule potential
energies, this can be achieved even with a single interaction layer. As a
result of all these properties, the model's computational cost is substantially
decreased. Moreover, the accurate prediction of vector and tensor molecular
quantities on top of potential energies and forces is possible. In summary,
TensorNet's framework opens up a new space for the design of state-of-the-art
equivariant models.
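The decomposition the abstract refers to splits each rank-2 Cartesian tensor into three O(3)-irreducible parts: an isotropic (scalar) piece, an antisymmetric (vector-like) piece, and a symmetric traceless piece. A minimal NumPy sketch of that decomposition (the function name and layout are ours for illustration, not TensorNet's API):

```python
import numpy as np

def decompose(X):
    """Split a rank-2 Cartesian tensor X (3x3) into its three
    O(3)-irreducible parts: isotropic (l=0), antisymmetric (l=1),
    and symmetric traceless (l=2)."""
    I = (np.trace(X) / 3.0) * np.eye(3)   # carries the scalar content
    A = 0.5 * (X - X.T)                   # carries the vector content
    S = 0.5 * (X + X.T) - I               # carries the rank-2 content
    return I, A, S                        # I + A + S == X exactly
```

Each part transforms independently under a rotation R as R(·)Rᵀ, which is what allows scalars, vectors, and tensors to be processed separately when necessary.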
Related papers
- Higher-Rank Irreducible Cartesian Tensors for Equivariant Message Passing [23.754664894759234]
Atomistic simulations are crucial for advancing the chemical sciences.
Machine-learned interatomic potentials achieve accuracy on par with ab initio and first-principles methods at a fraction of their computational cost.
arXiv Detail & Related papers (2024-05-23T07:31:20Z)
- Similarity Equivariant Graph Neural Networks for Homogenization of Metamaterials [3.6443770850509423]
Soft, porous mechanical metamaterials exhibit pattern transformations that may have important applications in soft robotics, sound reduction and biomedicine.
We develop a machine learning-based approach that scales favorably to serve as a surrogate model.
We show that this network is more accurate and data-efficient than graph neural networks with fewer symmetries.
arXiv Detail & Related papers (2024-04-26T12:30:32Z)
- Quantized Fourier and Polynomial Features for more Expressive Tensor Network Models [9.18287948559108]
We exploit the tensor structure present in the features by constraining the model weights to be an underparametrized tensor network.
We show that, for the same number of model parameters, the resulting quantized models have a higher bound on the VC-dimension than their non-quantized counterparts.
arXiv Detail & Related papers (2023-09-11T13:18:19Z)
- Low-Rank Tensor Function Representation for Multi-Dimensional Data Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method as compared with state-of-the-art methods.
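The low-rank tensor function idea can be illustrated with a CP-style continuous factorization: a multi-dimensional field is represented by small per-axis factor functions that can be evaluated at arbitrary real coordinates, not just on a meshgrid. A hedged sketch under our own parameterization (the cosine-basis factors and function names are illustrative, not the paper's model):

```python
import numpy as np

def factor(w, t):
    # One per-axis factor function g_r(t): a tiny cosine-basis model.
    # w: (R, F) coefficients; returns the R factor values at scalar t.
    freqs = np.arange(1, w.shape[1] + 1)
    return w @ np.cos(freqs * t)

def tensor_function(ws, coords):
    """Evaluate f(x, y, z) = sum_r g1_r(x) * g2_r(y) * g3_r(z) at any
    real coordinates -- off-meshgrid evaluation at 'infinite resolution'."""
    vals = np.ones(ws[0].shape[0])
    for w, t in zip(ws, coords):
        vals = vals * factor(w, t)
    return vals.sum()
```

In a data-recovery setting the coefficients `ws` would be fit to observed entries; here they only demonstrate the separable, coordinate-continuous representation.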
arXiv Detail & Related papers (2022-12-01T04:00:38Z)
- Edge-based Tensor prediction via graph neural networks [6.021787236982659]
Message-passing neural networks (MPNN) have shown extremely high efficiency and accuracy in predicting the physical properties of molecules and crystals.
However, a general MPNN framework for directly predicting the tensor properties of crystals is still lacking.
In this work, we directly designed the edge-based tensor prediction graph neural network (ETGNN) model on the basis of the invariant graph neural network to predict tensors.
arXiv Detail & Related papers (2022-01-15T06:43:15Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- GeoMol: Torsional Geometric Generation of Molecular 3D Conformer Ensembles [60.12186997181117]
Prediction of a molecule's 3D conformer ensemble from the molecular graph holds a key role in areas of cheminformatics and drug discovery.
Existing generative models have several drawbacks, including a failure to model important elements of molecular geometry.
We propose GeoMol, an end-to-end, non-autoregressive and SE(3)-invariant machine learning approach to generate 3D conformers.
arXiv Detail & Related papers (2021-06-08T14:17:59Z)
- E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces E(n)-Equivariant Graph Neural Networks (EGNNs), a new model for learning graph neural networks equivariant to rotations, translations, reflections, and permutations.
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
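The EGNN update keeps coordinate changes equivariant by weighting relative position vectors with scalars computed from invariant quantities. A schematic NumPy sketch of one such layer, simplified from the paper (the tiny `tanh` stand-ins for the learned MLPs and the weight shapes are our assumptions, not the authors' implementation):

```python
import numpy as np

def mlp(w, x):
    # Tiny stand-in for the paper's learned MLPs (phi_e, phi_x, phi_h).
    return np.tanh(x @ w)

def egnn_layer(h, x, We, Wx, Wh):
    """One E(n)-equivariant message-passing step (schematic).
    h: (n, d) invariant node features, x: (n, 3) coordinates.
    We: (2d+1, k), Wx: (k, 1), Wh: (d+k, d)."""
    n, _ = h.shape
    m = np.zeros((n, We.shape[1]))
    dx = np.zeros_like(x)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = x[i] - x[j]
            dist2 = np.array([diff @ diff])        # invariant edge input
            m_ij = mlp(We, np.concatenate([h[i], h[j], dist2]))
            m[i] += m_ij                           # aggregate messages
            dx[i] += diff * mlp(Wx, m_ij)[0]       # scalar-weighted vector
    x_new = x + dx / (n - 1)                       # equivariant coord update
    h_new = mlp(Wh, np.concatenate([h, m], axis=1))
    return h_new, x_new
```

Because only squared distances enter the messages and coordinates change only along relative vectors, rotating or reflecting the input rotates the output coordinates identically while leaving the features unchanged, with no higher-order intermediate representations needed.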
arXiv Detail & Related papers (2021-02-19T10:25:33Z)
- Lorentz Group Equivariant Neural Network for Particle Physics [58.56031187968692]
We present a neural network architecture that is fully equivariant with respect to transformations under the Lorentz group.
For classification tasks in particle physics, we demonstrate that such an equivariant architecture leads to drastically simpler models that have relatively few learnable parameters.
arXiv Detail & Related papers (2020-06-08T17:54:43Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.