On Representing Electronic Wave Functions with Sign Equivariant Neural Networks
- URL: http://arxiv.org/abs/2403.05249v1
- Date: Fri, 8 Mar 2024 12:13:11 GMT
- Title: On Representing Electronic Wave Functions with Sign Equivariant Neural Networks
- Authors: Nicholas Gao, Stephan Günnemann
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent neural networks demonstrated impressively accurate approximations of
electronic ground-state wave functions. Such neural networks typically consist
of a permutation-equivariant neural network followed by a
permutation-antisymmetric operation to enforce the electronic exchange
symmetry. While accurate, such neural networks are computationally expensive.
In this work, we explore the flipped approach, where we first compute
antisymmetric quantities based on the electronic coordinates and then apply
sign equivariant neural networks to preserve the antisymmetry. While this
approach promises acceleration thanks to the lower-dimensional representation,
we demonstrate that it reduces to a Jastrow factor, a commonly used
permutation-invariant multiplicative factor in the wave function. Our empirical
results support this further, finding little to no improvements over baselines.
Within the scope of this evaluation, we find neither theoretical nor empirical
advantages of sign equivariant functions for representing electronic wave
functions.
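The reduction described in the abstract can be illustrated with a minimal numerical sketch. This is not the paper's architecture; the odd function below is a hypothetical choice made only for illustration. The point is that any sign equivariant (odd) scalar map applied to an antisymmetric quantity, such as a Slater determinant, factors into the determinant times an even, permutation-invariant factor, i.e. a Jastrow-like multiplicative factor:

```python
import numpy as np

def slater_det(orbitals):
    """Conventional antisymmetrization: determinant of an n x n orbital matrix.

    Swapping two electrons swaps two rows, which flips the sign."""
    return np.linalg.det(orbitals)

def odd_fn(x):
    """A sign equivariant scalar function, f(-x) = -f(x).

    Hypothetical choice: any odd map factors as x * g(|x|) with g even,
    which is exactly a Jastrow-like multiplicative factor on top of x."""
    return x * np.exp(-np.abs(x))

rng = np.random.default_rng(0)
# Hypothetical 3-electron orbital matrix (rows indexed by electrons).
phi = rng.standard_normal((3, 3))

d = slater_det(phi)
phi_swapped = phi[[1, 0, 2]]                     # exchange electrons 0 and 1
assert np.isclose(slater_det(phi_swapped), -d)   # determinant is antisymmetric

# Flipped approach: antisymmetric quantity first, sign equivariant map second.
psi = odd_fn(d)
assert np.isclose(odd_fn(slater_det(phi_swapped)), -psi)  # antisymmetry kept

# The odd map decomposes into the determinant times an even, permutation
# invariant factor -- a Jastrow factor:
jastrow = np.exp(-np.abs(d))
assert np.isclose(psi, d * jastrow)
```

The assertions check the two facts the abstract relies on: the sign equivariant map preserves antisymmetry, and it contributes nothing beyond a permutation-invariant multiplicative factor on the determinant.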
Related papers
- Equivariant neural networks and piecewise linear representation theory [0.0]
Equivariant neural networks are neural networks with symmetry.
Motivated by the theory of group representations, we decompose the layers of an equivariant neural network into simple representations.
arXiv Detail & Related papers (2024-08-01T23:08:37Z)
- Neural Pfaffians: Solving Many Many-Electron Schrödinger Equations [58.130170155147205]
Neural wave functions accomplished unprecedented accuracies in approximating the ground state of many-electron systems, though at a high computational cost.
Recent works proposed amortizing the cost by learning generalized wave functions across different structures and compounds instead of solving each problem independently.
This work tackles the problem by defining overparametrized, fully learnable neural wave functions suitable for generalization across molecules.
arXiv Detail & Related papers (2024-05-23T16:30:51Z)
- Non Commutative Convolutional Signal Models in Neural Networks: Stability to Small Deformations [111.27636893711055]
We study the filtering and stability properties of non-commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks, and quaternion neural networks.
arXiv Detail & Related papers (2023-10-05T20:27:22Z)
- A Lifted Bregman Formulation for the Inversion of Deep Neural Networks [28.03724379169264]
We propose a novel framework for the regularised inversion of deep neural networks.
The framework lifts the parameter space into a higher dimensional space by introducing auxiliary variables.
We present theoretical results and support their practical application with numerical examples.
arXiv Detail & Related papers (2023-03-01T20:30:22Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
A promising recent trend in machine learning (ML) builds on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs equivariant with respect to the two-dimensional Euclidean group with vector-valued neuron activations and the corresponding, independently introduced, equivariant Gaussian processes (GPs).
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non-commutative convolutional neural networks.
We show that non-commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- Determinant-free fermionic wave function using feed-forward neural networks [0.0]
We propose a framework for finding the ground state of many-body fermionic systems by using feed-forward neural networks.
We show that the accuracy of the approximation can be improved by optimizing the "variance" of the energy simultaneously with the energy itself.
These improvements can be applied to other approaches based on variational Monte Carlo methods.
arXiv Detail & Related papers (2021-08-19T11:51:36Z)
- Training End-to-End Analog Neural Networks with Equilibrium Propagation [64.0476282000118]
We introduce a principled method to train end-to-end analog neural networks by gradient descent.
We show mathematically that a class of analog neural networks (called nonlinear resistive networks) are energy-based models.
Our work can guide the development of a new generation of ultra-fast, compact and low-power neural networks supporting on-chip learning.
arXiv Detail & Related papers (2020-06-02T23:38:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.