Is Distance Matrix Enough for Geometric Deep Learning?
- URL: http://arxiv.org/abs/2302.05743v6
- Date: Sat, 19 Oct 2024 04:15:10 GMT
- Title: Is Distance Matrix Enough for Geometric Deep Learning?
- Authors: Zian Li, Xiyuan Wang, Yinan Huang, Muhan Zhang
- Abstract summary: We show that Vanilla DisGNN is geometrically incomplete.
We then propose $k$-DisGNNs, which can effectively exploit the rich geometry contained in the distance matrix.
Our $k$-DisGNNs achieve many new state-of-the-art results on MD17.
- Score: 24.307433184938127
- Abstract: Graph Neural Networks (GNNs) are often used for tasks involving the 3D geometry of a given graph, such as molecular dynamics simulation. While incorporating Euclidean distance into Message Passing Neural Networks (referred to as Vanilla DisGNN) is a straightforward way to learn the geometry, it has been demonstrated that Vanilla DisGNN is geometrically incomplete. In this work, we first construct families of novel and symmetric geometric graphs that Vanilla DisGNN cannot distinguish even when considering all-pair distances, which greatly expands the existing counterexample families. Our counterexamples show the inherent limitation of Vanilla DisGNN to capture symmetric geometric structures. We then propose $k$-DisGNNs, which can effectively exploit the rich geometry contained in the distance matrix. We demonstrate the high expressive power of $k$-DisGNNs from three perspectives: 1. They can learn high-order geometric information that cannot be captured by Vanilla DisGNN. 2. They can unify some existing well-designed geometric models. 3. They are universal function approximators from geometric graphs to scalars (when $k\geq 2$) and vectors (when $k\geq 3$). Most importantly, we establish a connection between geometric deep learning (GDL) and traditional graph representation learning (GRL), showing that those highly expressive GNN models originally designed for GRL can also be applied to GDL with impressive performance, and that existing complicated, equivariant models are not the only solution. Experiments verify our theory. Our $k$-DisGNNs achieve many new state-of-the-art results on MD17.
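To make the setting concrete, the sketch below illustrates the kind of distance-based message passing the abstract refers to as Vanilla DisGNN: each node aggregates all-pair messages gated by a learned function of Euclidean distance. It is a minimal NumPy toy under assumed names and shapes, not the paper's $k$-DisGNN architecture.

```python
# Illustrative toy (not the paper's k-DisGNN): one "Vanilla DisGNN"-style
# message-passing step in which all-pair messages are gated by a learned
# function of Euclidean distance. Names, shapes, and the gating form are
# assumptions made for this sketch.
import numpy as np

def dis_message_passing(X, pos, W_msg, W_dist):
    """X: (n, d) node features; pos: (n, 3) coordinates;
    W_msg: (d, d) message weights; W_dist: (1, d) distance-to-gate weights."""
    n = X.shape[0]
    # All-pair Euclidean distances: invariant to rotations, translations and
    # reflections, which is the invariance the paper studies.
    diff = pos[:, None, :] - pos[None, :, :]               # (n, n, 3)
    dist = np.linalg.norm(diff, axis=-1, keepdims=True)    # (n, n, 1)
    gate = np.tanh(dist @ W_dist)                          # (n, n, d)
    msgs = gate * (X @ W_msg)[None, :, :]                  # message j -> i
    msgs = msgs * (1.0 - np.eye(n)[:, :, None])            # drop self-messages
    return np.maximum(X + msgs.sum(axis=1), 0.0)           # residual + ReLU

rng = np.random.default_rng(0)
X, pos = rng.normal(size=(5, 8)), rng.normal(size=(5, 3))
out = dis_message_passing(X, pos, 0.1 * rng.normal(size=(8, 8)),
                          0.1 * rng.normal(size=(1, 8)))
print(out.shape)  # (5, 8)
```

Because only pairwise distances enter the update, such a layer is invariant to rigid motions by construction; the paper's counterexamples show that this style of update, even with all-pair distances, cannot distinguish certain highly symmetric geometric graphs, which is what motivates $k$-DisGNNs.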
Related papers
- On the Expressive Power of Sparse Geometric MPNNs [3.396731589928944]
We study the expressive power of message-passing neural networks for geometric graphs.
We show that generic pairs of non-isomorphic geometric graphs can be separated by message-passing networks.
arXiv Detail & Related papers (2024-07-02T07:48:22Z) - A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications [67.33002207179923]
This paper presents a survey of data structures, models, and applications related to geometric GNNs.
We provide a unified view of existing models from the geometric message passing perspective.
We also summarize the applications as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
arXiv Detail & Related papers (2024-03-01T12:13:04Z) - On the Completeness of Invariant Geometric Deep Learning Models [22.43250261702209]
Invariant models are capable of generating meaningful geometric representations by leveraging informative geometric features in point clouds.
We show that GeoNGNN, the geometric counterpart of one of the simplest subgraph graph neural networks (subgraph GNNs), can effectively break the symmetry of the corner cases on which simpler invariant models fail.
By leveraging GeoNGNN as a theoretical tool, we further prove that most subgraph GNNs developed in traditional graph learning can be seamlessly extended to geometric scenarios with E(3)-completeness.
arXiv Detail & Related papers (2024-02-07T13:32:53Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Graph Neural Networks and 3-Dimensional Topology [0.0]
We consider the class of 3-manifolds described by plumbing graphs and use Graph Neural Networks (GNNs) for the problem of deciding whether two such graphs represent the same 3-manifold.
We use supervised learning to train a GNN that answers this question with high accuracy.
We also consider reinforcement learning with a GNN to find a sequence of Neumann moves relating the pair of graphs when the answer is positive.
arXiv Detail & Related papers (2023-05-10T08:18:10Z) - 3D Molecular Geometry Analysis with 2D Graphs [79.47097907673877]
Ground-state 3D geometries of molecules are essential for many molecular analysis tasks.
Modern quantum mechanical methods can compute accurate 3D geometries but are computationally prohibitive.
We propose a novel deep learning framework to predict 3D geometries from molecular graphs.
arXiv Detail & Related papers (2023-05-01T19:00:46Z) - On the Expressive Power of Geometric Graph Neural Networks [18.569063436109044]
We propose a geometric version of the WL test (GWL) for discriminating geometric graphs while respecting the underlying physical symmetries.
We unpack how key design choices influence geometric GNN expressivity.
arXiv Detail & Related papers (2023-01-23T08:08:10Z) - Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
In summary, we treat the manifold model as the limit of large graphs when constructing MNNs, and we can recover graph neural networks by discretizing MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z) - MGNN: Graph Neural Networks Inspired by Distance Geometry Problem [28.789684784093048]
Graph Neural Networks (GNNs) have emerged as a prominent research topic in the field of machine learning.
In this paper, we propose a GNN model inspired by the congruent-insensitivity property of the classifiers in the classification phase of GNNs.
We extensively evaluate the effectiveness of our model through experiments conducted on both synthetic and real-world datasets.
arXiv Detail & Related papers (2022-01-31T04:15:42Z) - Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, as in Transformers.
We observe a performance increase for molecular datasets, from 2.87% up to 64.14% when considering learnable PE for both GNN classes.
arXiv Detail & Related papers (2021-10-15T05:59:15Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are computationally inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning (a minimal illustrative sketch follows this list).
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
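For the Distance Encoding entry above, the sketch below shows the basic idea in its simplest form: augment each node with its shortest-path distances to a target node set, so that nodes a plain GNN would embed identically (e.g. all nodes of a cycle) receive distinct features. It is a minimal pure-Python illustration under assumed function names, not the paper's full DE framework.

```python
# Minimal sketch of the Distance Encoding (DE) idea: per-node extra features
# given by shortest-path distances to a chosen target node set. Function names
# are assumptions; the paper develops DE far more generally.
from collections import deque

def shortest_path_distances(adj, source):
    """BFS distances from `source` to every node 0..n-1; -1 means unreachable."""
    dist = [-1] * len(adj)
    dist[source] = 0
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] == -1:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def distance_encoding(adj, target_set):
    """Per-node DE feature vector: distances to each node in `target_set`."""
    cols = [shortest_path_distances(adj, t) for t in target_set]
    return [list(row) for row in zip(*cols)]

# Usage: on a 5-cycle every node looks the same to a plain message-passing GNN,
# but DE w.r.t. the target set {0, 2} gives each node a distinct feature.
adj = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
print(distance_encoding(adj, [0, 2]))
# [[0, 2], [1, 1], [2, 0], [2, 1], [1, 2]]
```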