A quantum inspired neural network for geometric modeling
- URL: http://arxiv.org/abs/2401.01801v2
- Date: Sun, 28 Jan 2024 16:13:37 GMT
- Title: A quantum inspired neural network for geometric modeling
- Authors: Weitao Du, Shengchao Liu, Xuecang Zhang
- Abstract summary: We introduce an innovative equivariant Matrix Product State (MPS)-based message-passing strategy.
Our method effectively models complex many-body relationships, surpassing mean-field approximations.
It seamlessly replaces the standard message-passing and layer-aggregation modules intrinsic to geometric GNNs.
- Score: 14.214656118952178
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: By conceiving physical systems as 3D many-body point clouds, geometric graph
neural networks (GNNs), such as SE(3)/E(3) equivariant GNNs, have showcased
promising performance. In particular, their effective message-passing mechanisms
make them adept at modeling molecules and crystalline materials. However,
current geometric GNNs only offer a mean-field approximation of the many-body
system, encapsulated within two-body message passing, thus falling short in
capturing intricate relationships within these geometric graphs. To address
this limitation, tensor networks, widely employed in computational physics to
handle many-body systems using high-order tensors, have been introduced.
Nevertheless, integrating these tensorized networks into the message-passing
framework of GNNs faces scalability and symmetry conservation (e.g.,
permutation and rotation) challenges. In response, we introduce an innovative
equivariant Matrix Product State (MPS)-based message-passing strategy, through
achieving an efficient implementation of the tensor contraction operation. Our
method effectively models complex many-body relationships, surpassing
mean-field approximations, and captures symmetries within geometric graphs.
Importantly, it seamlessly replaces the standard message-passing and
layer-aggregation modules intrinsic to geometric GNNs. We empirically validate
the superior accuracy of our approach on benchmark tasks, including predicting
classical Newton systems and quantum tensor Hamiltonian matrices. To our
knowledge, our approach represents the first use of parameterized
geometric tensor networks.
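The abstract's key operation, contracting a Matrix Product State against per-neighbor features to build a many-body message, can be sketched in plain NumPy. This is a minimal illustration of an MPS contraction sweep under assumed shapes and ranks, not the paper's equivariant implementation; the function name `mps_contract` and all dimensions are chosen for clarity only.

```python
import numpy as np

def mps_contract(cores, features):
    """Contract an MPS with one feature vector per neighbor.

    cores:    list of N tensors of shape (r_left, d, r_right),
              with boundary ranks r_0 = r_N = 1.
    features: list of N vectors of shape (d,).
    Returns a scalar many-body message.
    """
    # Start from the trivial left boundary vector.
    left = np.ones(1)
    for core, x in zip(cores, features):
        # Contract the physical index with the feature: (r_l, d, r_r) x (d,) -> (r_l, r_r)
        mat = np.einsum('ldr,d->lr', core, x)
        # Sweep the boundary vector through; cost is O(r^2) per site,
        # so the full contraction stays linear in the number of neighbors.
        left = left @ mat
    return left.item()

rng = np.random.default_rng(0)
d, r, n = 4, 3, 5
cores = [rng.standard_normal((1 if i == 0 else r, d, r if i < n - 1 else 1))
         for i in range(n)]
feats = [rng.standard_normal(d) for _ in range(n)]
msg = mps_contract(cores, feats)
```

Because the contraction is multilinear in the neighbor features, it captures joint (many-body) dependence rather than a sum of independent two-body terms, which is the limitation of mean-field message passing the abstract points to.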
Related papers
- Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z) - Neural P$^3$M: A Long-Range Interaction Modeling Enhancer for Geometric GNNs [66.98487644676906]
We introduce Neural P$^3$M, a versatile enhancer of geometric GNNs to expand the scope of their capabilities.
It exhibits flexibility across a wide range of molecular systems and demonstrates remarkable accuracy in predicting energies and forces.
It also achieves an average improvement of 22% on the OE62 dataset while integrating with various architectures.
arXiv Detail & Related papers (2024-09-26T08:16:59Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - On the Completeness of Invariant Geometric Deep Learning Models [22.43250261702209]
Invariant models are capable of generating meaningful geometric representations by leveraging informative geometric features in point clouds.
We show that GeoNGNN, the geometric counterpart of one of the simplest subgraph graph neural networks (subgraph GNNs), can effectively break these corner cases' symmetry.
By leveraging GeoNGNN as a theoretical tool, we further prove that: 1) most subgraph GNNs developed in traditional graph learning can be seamlessly extended to geometric scenarios with E(3)-completeness.
arXiv Detail & Related papers (2024-02-07T13:32:53Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Torsion Graph Neural Networks [21.965704710488232]
We propose TorGNN, an analytic torsion enhanced Graph Neural Network model.
In our TorGNN, for each edge, a corresponding local simplicial complex is identified, then the analytic torsion is calculated.
It has been found that our TorGNN can achieve superior performance on both tasks, and outperform various state-of-the-art models.
arXiv Detail & Related papers (2023-06-23T15:02:23Z) - A new perspective on building efficient and expressive 3D equivariant graph neural networks [39.0445472718248]
We propose a hierarchy of 3D isomorphism to evaluate the expressive power of equivariant GNNs.
Our work leads to two crucial modules for designing expressive and efficient geometric GNNs.
To demonstrate the applicability of our theory, we propose LEFTNet which effectively implements these modules.
arXiv Detail & Related papers (2023-04-07T18:08:27Z) - E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations, called E(n)-Equivariant Graph Neural Networks (EGNNs).
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
arXiv Detail & Related papers (2021-02-19T10:25:33Z) - Tensor-Train Networks for Learning Predictive Modeling of Multidimensional Data [0.0]
A promising strategy is based on tensor networks, which have been very successful in physical and chemical applications.
We show that the weights of a multidimensional regression model can be learned by means of tensor networks, yielding a powerful compact representation.
An algorithm based on alternating least squares is proposed for approximating the weights in TT-format at reduced computational cost.
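The alternating-least-squares idea in this summary can be sketched for the simplest tensor-train weight, two cores, where each core update is an ordinary linear least-squares problem. This is an illustrative toy, not the cited paper's algorithm; the function name `als_tt_regression` and the synthetic setup are assumptions.

```python
import numpy as np

def als_tt_regression(X1, X2, y, rank=2, sweeps=10, seed=0):
    """Fit y ~ x1.T @ G1 @ G2 @ x2 (a two-core TT weight) by ALS."""
    rng = np.random.default_rng(seed)
    d1, d2 = X1.shape[1], X2.shape[1]
    G1 = rng.standard_normal((d1, rank)) * 0.1
    G2 = rng.standard_normal((rank, d2)) * 0.1
    for _ in range(sweeps):
        # With G2 fixed, each target y_k is linear in vec(G1): solve by lstsq.
        V = X2 @ G2.T                                      # (n, rank)
        A = np.einsum('ni,nr->nir', X1, V).reshape(len(y), -1)
        G1 = np.linalg.lstsq(A, y, rcond=None)[0].reshape(d1, rank)
        # With G1 fixed, each target y_k is linear in vec(G2).
        U = X1 @ G1                                        # (n, rank)
        B = np.einsum('nr,nj->nrj', U, X2).reshape(len(y), -1)
        G2 = np.linalg.lstsq(B, y, rcond=None)[0].reshape(rank, d2)
    return G1, G2

# Synthetic low-rank ground truth to check that ALS recovers the weights.
rng = np.random.default_rng(1)
n, d1, d2 = 200, 6, 5
W = rng.standard_normal((d1, 2)) @ rng.standard_normal((2, d2))
X1, X2 = rng.standard_normal((n, d1)), rng.standard_normal((n, d2))
y = np.einsum('ni,ij,nj->n', X1, W, X2)
G1, G2 = als_tt_regression(X1, X2, y, rank=2)
err = np.linalg.norm(G1 @ G2 - W) / np.linalg.norm(W)
```

The computational saving is the one the summary names: each sweep solves small linear systems in `d * rank` unknowns instead of fitting all `d1 * d2` entries of the full weight tensor at once.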
arXiv Detail & Related papers (2021-01-22T16:14:38Z) - Primal-Dual Mesh Convolutional Neural Networks [62.165239866312334]
We apply a primal-dual framework drawn from the graph-neural-network literature to triangle meshes.
Our method takes features for both edges and faces of a 3D mesh as input and dynamically aggregates them.
We provide theoretical insights of our approach using tools from the mesh-simplification literature.
arXiv Detail & Related papers (2020-10-23T14:49:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.