A new perspective on building efficient and expressive 3D equivariant
graph neural networks
- URL: http://arxiv.org/abs/2304.04757v1
- Date: Fri, 7 Apr 2023 18:08:27 GMT
- Title: A new perspective on building efficient and expressive 3D equivariant
graph neural networks
- Authors: Weitao Du, Yuanqi Du, Limei Wang, Dieqiao Feng, Guifeng Wang, Shuiwang
Ji, Carla Gomes, Zhi-Ming Ma
- Abstract summary: We propose a hierarchy of 3D isomorphism to evaluate the expressive power of equivariant GNNs.
Our work leads to two crucial modules for designing expressive and efficient geometric GNNs.
To demonstrate the applicability of our theory, we propose LEFTNet which effectively implements these modules.
- Score: 39.0445472718248
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Geometric deep learning enables the encoding of physical symmetries in
modeling 3D objects. Despite rapid progress in encoding 3D symmetries into
Graph Neural Networks (GNNs), a comprehensive evaluation of the expressiveness
of these networks through a local-to-global analysis is still lacking. In this
paper, we propose a local hierarchy of 3D isomorphism to evaluate the
expressive power of equivariant GNNs and investigate the process of
representing global geometric information from local patches. Our work leads to
two crucial modules for designing expressive and efficient geometric GNNs;
namely local substructure encoding (LSE) and frame transition encoding (FTE).
To demonstrate the applicability of our theory, we propose LEFTNet which
effectively implements these modules and achieves state-of-the-art performance
on both scalar-valued and vector-valued molecular property prediction tasks. We
further point out the design space for future developments of equivariant graph
neural networks. Our codes are available at
\url{https://github.com/yuanqidu/LeftNet}.
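The repository above contains the authors' implementation of LEFTNet. As a rough, self-contained illustration of the frame transition idea behind FTE, the NumPy sketch below builds an equivariant local frame for an edge and "scalarizes" a vector feature against it; the frame construction and the helper names (edge_frame, scalarize) are assumptions made for this example, not the paper's exact design.

```python
# Minimal sketch of frame-based scalarization, the mechanism that frame
# transition encoding (FTE) relies on.  Illustrative only; LEFTNet's actual
# frame construction lives in the linked repository.
import numpy as np

def edge_frame(x_i, x_j):
    """Orthonormal frame for edge (i, j) that rotates together with the input."""
    a = x_j - x_i
    e1 = a / np.linalg.norm(a)
    b = (x_i + x_j) / 2.0                  # a second equivariant vector (assumed here)
    b = b - np.dot(b, e1) * e1             # Gram-Schmidt step
    e2 = b / np.linalg.norm(b)
    e3 = np.cross(e1, e2)                  # completes a right-handed frame
    return np.stack([e1, e2, e3])          # rows are the frame axes, shape (3, 3)

def scalarize(vec, frame):
    """Project an equivariant vector onto the frame, yielding invariant scalars."""
    return frame @ vec

# Invariance check: the scalarized features do not change under a global rotation.
rng = np.random.default_rng(0)
x_i, x_j, v = rng.normal(size=(3, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:                   # make Q a proper rotation (det = +1)
    Q[:, 0] *= -1

s1 = scalarize(v, edge_frame(x_i, x_j))
s2 = scalarize(Q @ v, edge_frame(Q @ x_i, Q @ x_j))
print(np.allclose(s1, s2))                 # True
```

Because the scalars are rotation-invariant, ordinary MLPs can process them without breaking equivariance; mapping the results back through the same frame recovers equivariant vector outputs.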
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach builds on a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- A quantum inspired neural network for geometric modeling [14.214656118952178]
We introduce an innovative equivariant Matrix Product State (MPS)-based message-passing strategy.
Our method effectively models complex many-body relationships, suppressing mean-field approximations.
It seamlessly replaces the standard message-passing and layer-aggregation modules intrinsic to geometric GNNs.
arXiv Detail & Related papers (2024-01-03T15:59:35Z)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- Torsion Graph Neural Networks [21.965704710488232]
We propose TorGNN, an analytic torsion enhanced Graph Neural Network model.
In our TorGNN, for each edge, a corresponding local simplicial complex is identified and its analytic torsion is then calculated.
Our TorGNN achieves superior performance on both tasks and outperforms various state-of-the-art models.
arXiv Detail & Related papers (2023-06-23T15:02:23Z)
- Dense Graph Convolutional Neural Networks on 3D Meshes for 3D Object Segmentation and Classification [0.0]
We present new designs of graph convolutional neural networks (GCNs) on 3D meshes for 3D object classification and segmentation.
We use the faces of the mesh as basic processing units and represent a 3D mesh as a graph where each node corresponds to a face.
arXiv Detail & Related papers (2021-06-30T02:17:16Z)
- E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces E(n)-Equivariant Graph Neural Networks (EGNNs), a new model for learning graph neural networks equivariant to rotations, translations, reflections, and permutations (a minimal sketch of the update rule appears after this list).
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers, while still achieving competitive or better performance.
arXiv Detail & Related papers (2021-02-19T10:25:33Z)
- Learning Local Neighboring Structure for Robust 3D Shape Representation [143.15904669246697]
Representation learning for 3D meshes is important in many computer vision and graphics applications.
We propose a local structure-aware anisotropic convolutional operation (LSA-Conv).
Our model produces significant improvement in 3D shape reconstruction compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-04-21T13:40:03Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
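To make the E(n) Equivariant Graph Neural Networks entry concrete, here is the minimal NumPy sketch referenced above: one EGNN-style layer with invariant messages computed from squared distances and equivariant coordinate updates along relative positions, following the update rule described in that paper. The tiny random MLPs (phi_e, phi_x, phi_h) are stand-ins for illustration, not the authors' trained networks.

```python
# Minimal sketch of one EGNN layer (Satorras et al., 2021): messages depend on
# invariant squared distances; coordinates are updated along relative positions.
# The random two-layer "MLPs" below are illustrative stand-ins only.
import numpy as np

rng = np.random.default_rng(0)
N, F = 5, 8                          # nodes, feature width
x = rng.normal(size=(N, 3))          # 3D coordinates
h = rng.normal(size=(N, F))          # invariant node features

def mlp(dim_in, dim_out):
    W1, W2 = rng.normal(size=(dim_in, 16)), rng.normal(size=(16, dim_out))
    return lambda z: np.tanh(z @ W1) @ W2

phi_e = mlp(2 * F + 1, F)            # edge / message network
phi_x = mlp(F, 1)                    # scalar weight for coordinate updates
phi_h = mlp(2 * F, F)                # node update network

def egnn_layer(x, h):
    x_new, h_new = np.empty_like(x), np.empty_like(h)
    for i in range(N):
        msgs, delta = np.zeros(F), np.zeros(3)
        for j in range(N):
            if i == j:
                continue
            d2 = np.sum((x[i] - x[j]) ** 2)                   # invariant
            m_ij = phi_e(np.concatenate([h[i], h[j], [d2]]))  # message
            msgs += m_ij
            delta += (x[i] - x[j]) * phi_x(m_ij)              # equivariant
        x_new[i] = x[i] + delta / (N - 1)
        h_new[i] = phi_h(np.concatenate([h[i], msgs]))
    return x_new, h_new

x1, h1 = egnn_layer(x, h)
# Equivariance check: rotate + translate the input, the output follows suit.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
t = rng.normal(size=3)
x2, h2 = egnn_layer(x @ Q.T + t, h)
print(np.allclose(x2, x1 @ Q.T + t), np.allclose(h2, h1))     # True True
```

The final check verifies the defining property: rotating and translating the input coordinates rotates and translates the output coordinates while the invariant node features are unchanged.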