Incompleteness of graph convolutional neural networks for points clouds in three dimensions
- URL: http://arxiv.org/abs/2201.07136v1
- Date: Tue, 18 Jan 2022 17:18:26 GMT
- Title: Incompleteness of graph convolutional neural networks for points clouds in three dimensions
- Authors: Sergey N. Pozdnyakov and Michele Ceriotti
- Abstract summary: We show that even for the restricted case of graphs induced by 3D atom clouds, dGCNNs are not complete.
We construct pairs of distinct point clouds that generate graphs that, for any cutoff radius, are equivalent based on a first-order Weisfeiler-Lehman test.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional neural networks (GCNNs) are very popular methods in machine learning and have been applied very successfully to the prediction of the properties of molecules and materials. First-order GCNNs are well known to be incomplete, i.e., there exist graphs that are distinct but appear identical when seen through the lens of the GCNN. More complicated schemes have thus been designed to increase their resolving power. Applications to molecules (and, more generally, point clouds), however, add a geometric dimension to the problem. The most straightforward and prevalent approach to constructing a graph representation of a molecule regards atoms as vertices and draws a bond between each pair of atoms within a certain preselected cutoff. Bonds can be decorated with the distance between atoms, and the resulting "distance graph convolution NNs" (dGCNNs) have empirically demonstrated excellent resolving power and are widely used in chemical ML. Here we show that even for the restricted case of graphs induced by 3D atom clouds, dGCNNs are not complete. We construct pairs of distinct point clouds that generate graphs that, for any cutoff radius, are equivalent based on a first-order Weisfeiler-Lehman test. This class of degenerate structures includes chemically plausible configurations, setting an ultimate limit on the expressive power of some well-established GCNN architectures for atomistic machine learning. Models that explicitly use angular information in the description of atomic environments can resolve these degeneracies.
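The pipeline the abstract describes can be sketched in two steps: build a distance-decorated graph from a 3D point cloud using a cutoff radius, then run first-order Weisfeiler-Lehman (1-WL) colour refinement and compare the resulting colour multisets. The sketch below is an illustrative reconstruction under stated assumptions, not the authors' code; the function names are hypothetical, and the two example clouds are ordinary configurations (which 1-WL happens to distinguish), not the degenerate pairs the paper constructs.

```python
# Illustrative sketch only: distance graph + 1-WL refinement.
# The example clouds are NOT the paper's degenerate pairs.
import itertools
import math

def distance_graph(points, cutoff):
    """Edges between all pairs within `cutoff`, labelled by rounded distance."""
    edges = {}
    for i, j in itertools.combinations(range(len(points)), 2):
        d = math.dist(points[i], points[j])
        if d <= cutoff:
            edges[(i, j)] = round(d, 6)  # round so equal distances compare equal
    return edges

def wl_colours(n, edges, iterations=3):
    """First-order WL refinement; returns the sorted multiset of node colours."""
    colours = {v: 0 for v in range(n)}
    for _ in range(iterations):
        new = {}
        for v in range(n):
            # Multiset of (neighbour colour, edge distance) pairs around v.
            neigh = sorted(
                (colours[u], w)
                for (a, b), w in edges.items()
                for u in ((b,) if a == v else (a,) if b == v else ())
            )
            new[v] = hash((colours[v], tuple(neigh)))
        colours = new
    return sorted(colours.values())

# Two clouds are 1-WL-equivalent (at this cutoff) iff the multisets match.
cloud_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
cloud_b = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
same = wl_colours(3, distance_graph(cloud_a, 2.5)) == wl_colours(3, distance_graph(cloud_b, 2.5))
print(same)  # → False: these two clouds have different distance multisets
```

The paper's result is that pairs of distinct clouds exist for which this comparison returns True at every cutoff radius, so any dGCNN whose resolving power is bounded by 1-WL must assign them identical representations.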
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Convolutional Neural Networks on Manifolds: From Graphs and Back [122.06927400759021]
We propose a manifold neural network (MNN) composed of a bank of manifold convolutional filters and point-wise nonlinearities.
To sum up, we focus on the manifold model as the limit of large graphs and construct MNNs, while we can still bring back graph neural networks by the discretization of MNNs.
arXiv Detail & Related papers (2022-10-01T21:17:39Z)
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that works directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
arXiv Detail & Related papers (2022-07-25T11:30:44Z)
- FunQG: Molecular Representation Learning Via Quotient Graphs [0.0]
We propose a novel molecular graph coarsening framework named FunQG.
FunQG uses functional groups as influential building blocks of a molecule to determine its properties.
We show that the resulting informative graphs are much smaller than the molecular graphs and thus are good candidates for training GNNs.
arXiv Detail & Related papers (2022-07-18T13:36:20Z)
- MolNet: A Chemically Intuitive Graph Neural Network for Prediction of Molecular Properties [1.231476564107544]
The graph neural network (GNN) has been a powerful deep-learning tool in the chemistry domain.
The MolNet model is chemically intuitive, accommodating the 3D non-bond information in a molecule.
MolNet achieves state-of-the-art performance on the BACE classification task and the ESOL regression task.
arXiv Detail & Related papers (2022-02-01T20:47:28Z)
- Molecular Graph Generation via Geometric Scattering [7.796917261490019]
Graph neural networks (GNNs) have been used extensively for addressing problems in drug design and discovery.
We propose a representation-first approach to molecular graph generation.
We show that our architecture learns meaningful representations of drug datasets and provides a platform for goal-directed drug synthesis.
arXiv Detail & Related papers (2021-10-12T18:00:23Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.