Wigner kernels: body-ordered equivariant machine learning without a basis
- URL: http://arxiv.org/abs/2303.04124v1
- Date: Tue, 7 Mar 2023 18:34:55 GMT
- Title: Wigner kernels: body-ordered equivariant machine learning without a basis
- Authors: Filippo Bigi and Sergey N. Pozdnyakov and Michele Ceriotti
- Abstract summary: We propose a novel density-based method which involves computing "Wigner kernels".
Wigner kernels are fully equivariant and body-ordered kernels that can be computed iteratively with a cost that is independent of the radial-chemical basis.
We present several examples of the accuracy of models based on Wigner kernels in chemical applications.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine-learning models based on a point-cloud representation of a physical
object are ubiquitous in scientific applications and particularly well-suited
to the atomic-scale description of molecules and materials. Among the many
different approaches that have been pursued, the description of local atomic
environments in terms of their neighbor densities has been used widely and very
successfully. We propose a novel density-based method which involves computing
``Wigner kernels''. These are fully equivariant and body-ordered kernels that
can be computed iteratively with a cost that is independent of the
radial-chemical basis and grows only linearly with the maximum body-order
considered. This is in marked contrast to feature-space models, which comprise
an exponentially-growing number of terms with increasing order of correlations.
We present several examples of the accuracy of models based on Wigner kernels
in chemical applications, for both scalar and tensorial targets, reaching
state-of-the-art accuracy on the popular QM9 benchmark dataset, and we discuss
the broader relevance of these ideas to equivariant geometric machine-learning.
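The abstract's key computational claim is that body-ordered kernels can be built iteratively, with each step costing the same regardless of the underlying radial-chemical basis, so the total cost grows only linearly with the maximum body order. A minimal invariant-only sketch of this idea is shown below: once a base pair-similarity matrix between atoms has been computed, elementwise products raise the correlation (body) order one step at a time. This is a hedged illustration under simplifying assumptions, not the paper's actual method, which additionally carries angular (equivariant) channels coupled through Wigner symbols; the function name and array shapes are hypothetical.

```python
import numpy as np

def body_ordered_kernels(C, nu_max):
    """Iteratively build invariant body-ordered structure kernels from a
    base pair-similarity matrix C[i, j] between atom i of structure A and
    atom j of structure B (illustrative sketch, not the paper's
    equivariant algorithm).  Each iteration is an elementwise product
    costing O(n_A * n_B): independent of any radial-chemical basis,
    and the loop length is linear in nu_max."""
    K_nu = np.ones_like(C)           # order-0 kernel (constant)
    kernels = []
    for _ in range(nu_max):
        K_nu = K_nu * C              # raise the body order by one
        kernels.append(K_nu.sum())   # sum over atom pairs -> structure kernel
    return kernels

# Toy base kernel between a 3-atom and a 2-atom structure (random values).
rng = np.random.default_rng(0)
C = rng.random((3, 2))
print(body_ordered_kernels(C, nu_max=3))
```

Note that the cost-per-iteration here never touches a basis expansion: all basis-dependent work is confined to constructing `C` once, which is the structural point the abstract makes against feature-space models.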
Related papers
- Higher-Rank Irreducible Cartesian Tensors for Equivariant Message Passing [23.754664894759234]
Atomistic simulations are crucial for advancing the chemical sciences.
Machine-learned interatomic potentials achieve accuracy on par with ab initio and first-principles methods at a fraction of their computational cost.
arXiv Detail & Related papers (2024-05-23T07:31:20Z)
- Higher-order topological kernels via quantum computation [68.8204255655161]
Topological data analysis (TDA) has emerged as a powerful tool for extracting meaningful insights from complex data.
We propose a quantum approach to defining Betti kernels, which is based on constructing Betti curves with increasing order.
arXiv Detail & Related papers (2023-07-14T14:48:52Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Tensor-reduced atomic density representations [0.0]
Graph neural networks avoid scaling with the number of chemical elements by mapping element information into a fixed-dimensional space in a learnable way.
We recast this approach as tensor factorisation by exploiting the tensor structure of standard neighbour density based descriptors.
In doing so, we form compact tensor-reduced representations whose size does not depend on the number of chemical elements.
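The summary above describes contracting the chemical-element axis of standard neighbour-density descriptors with a learnable mixing matrix, so that the representation size is set by the mixing dimension rather than by the number of elements. A small sketch of that contraction is given below; the function name, array shapes, and random weights are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def tensor_reduce(coeffs, W):
    """Contract the chemical-element axis of density-expansion
    coefficients c[a, n, m] (element a, radial n, angular m) with a
    learnable mixing matrix W[k, a].  The result has a fixed element
    dimension k, independent of how many chemical elements appear
    (sketch of the tensor-factorisation idea; shapes are illustrative)."""
    return np.einsum('ka,anm->knm', W, coeffs)

n_elements, n_radial, n_angular, n_mixed = 10, 8, 9, 4
coeffs = np.random.default_rng(1).random((n_elements, n_radial, n_angular))
W = np.random.default_rng(2).random((n_mixed, n_elements))  # learnable in practice
reduced = tensor_reduce(coeffs, W)
print(reduced.shape)  # leading axis is n_mixed, not n_elements
```

Adding more elements grows only the input `coeffs` and the columns of `W`; the reduced representation keeps the same shape, which is what makes the descriptor size element-count independent.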
arXiv Detail & Related papers (2022-10-02T01:08:50Z)
- Electronic-structure properties from atom-centered predictions of the electron density [0.0]
The electron density of a molecule or material has recently received major attention as a target quantity for machine-learning models.
We propose a gradient-based approach to minimize the loss function of the regression problem in an optimized and highly sparse feature space.
We show that starting from the predicted density a single Kohn-Sham diagonalization step can be performed to access total energy components that carry an error of just 0.1 meV/atom.
arXiv Detail & Related papers (2022-06-28T15:35:55Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- Optimal radial basis for density-based atomic representations [58.720142291102135]
We discuss how to build an adaptive, optimal numerical basis that is chosen to represent most efficiently the structural diversity of the dataset at hand.
For each training dataset, this optimal basis is unique, and can be computed at no additional cost with respect to the primitive basis.
We demonstrate that this construction yields representations that are accurate and computationally efficient.
arXiv Detail & Related papers (2021-05-18T17:57:08Z)
- The role of feature space in atomistic learning [62.997667081978825]
Physically-inspired descriptors play a key role in the application of machine-learning techniques to atomistic simulations.
We introduce a framework to compare different sets of descriptors, and different ways of transforming them by means of metrics and kernels.
We compare representations built in terms of n-body correlations of the atom density, quantitatively assessing the information loss associated with the use of low-order features.
arXiv Detail & Related papers (2020-09-06T14:12:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.