Image-Like Graph Representations for Improved Molecular Property
Prediction
- URL: http://arxiv.org/abs/2111.10695v1
- Date: Sat, 20 Nov 2021 22:39:11 GMT
- Title: Image-Like Graph Representations for Improved Molecular Property
Prediction
- Authors: Toni Sagayaraj, Carsten Eickhoff
- Abstract summary: We propose CubeMol, a new intrinsic molecular representation that bypasses the need for GNNs entirely.
Our fixed-dimensional representation, when paired with a transformer model, exceeds the performance of state-of-the-art GNN models and provides a path for scalability.
- Score: 7.119677737397071
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Research into deep learning models for molecular property prediction has
primarily focused on the development of better Graph Neural Network (GNN)
architectures. Though new GNN variants continue to improve performance, their
modifications share a common theme of alleviating problems intrinsic to their
fundamental graph-to-graph nature. In this work, we examine these limitations
and propose a new molecular representation that bypasses the need for GNNs
entirely, dubbed CubeMol. Our fixed-dimensional stochastic representation, when
paired with a transformer model, exceeds the performance of state-of-the-art
GNN models and provides a path for scalability.
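The abstract does not describe how CubeMol's fixed-dimensional representation is constructed. As a purely illustrative sketch of what an image-like, fixed-size molecular encoding can look like, the toy featurizer below maps 3D atom coordinates onto a voxel grid; the scheme, function names, and parameters are our assumptions, not the paper's method:

```python
# Toy sketch of an image-like, fixed-dimensional molecular encoding.
# The voxel-grid scheme below is our illustration of the general idea,
# not CubeMol's actual construction (the abstract does not specify it).

def voxelize(atoms, grid=4, extent=2.0):
    """Map atoms [(x, y, z, atomic_number), ...] onto a grid^3 occupancy
    tensor, flattened to a fixed-length vector regardless of atom count."""
    cube = [[[0.0] * grid for _ in range(grid)] for _ in range(grid)]
    for x, y, z, num in atoms:
        # Shift each coordinate from [-extent, extent] into a bin in [0, grid).
        ix = min(grid - 1, max(0, int((x + extent) / (2 * extent) * grid)))
        iy = min(grid - 1, max(0, int((y + extent) / (2 * extent) * grid)))
        iz = min(grid - 1, max(0, int((z + extent) / (2 * extent) * grid)))
        cube[ix][iy][iz] += num  # accumulate nuclear charge per voxel
    return [v for plane in cube for row in plane for v in row]

# Approximate water geometry (angstroms): one O, two H atoms.
water = [(0.0, 0.0, 0.0, 8), (0.96, 0.0, 0.0, 1), (-0.24, 0.93, 0.0, 1)]
vec = voxelize(water)
print(len(vec))  # 64: the length is fixed regardless of molecule size
```

Unlike a graph, this vector has the same dimensionality for every molecule, which is what makes a standard transformer (or any fixed-input model) directly applicable.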
Related papers
- A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z)
- On the Scalability of GNNs for Molecular Graphs [7.402389334892391]
Graph Neural Networks (GNNs) are yet to show the benefits of scale due to the lower efficiency of sparse operations, large data requirements, and lack of clarity about the effectiveness of various architectures.
We analyze message-passing networks, graph Transformers, and hybrid architectures on the largest public collection of 2D molecular graphs.
For the first time, we observe that GNNs benefit tremendously from the increasing scale of depth, width, number of molecules, number of labels, and the diversity in the pretraining datasets.
arXiv Detail & Related papers (2024-04-17T17:11:31Z)
- GNN-VPA: A Variance-Preserving Aggregation Strategy for Graph Neural Networks [11.110435047801506]
We propose a variance-preserving aggregation function (VPA) that maintains expressivity, but yields improved forward and backward dynamics.
Our results could pave the way towards normalizer-free or self-normalizing GNNs.
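The variance-preserving idea can be sketched in a few lines: dividing a sum of n i.i.d. messages by sqrt(n) keeps the output variance at the input level, whereas mean aggregation shrinks it by 1/n and sum aggregation inflates it by n. The Monte-Carlo check below is our toy illustration, not the paper's code:

```python
import random

# Compare sum, mean, and variance-preserving (sum / sqrt(n)) aggregation
# of n i.i.d. unit-variance messages. Only the last keeps the output
# variance near 1, which is the point of a variance-preserving scheme.

def aggregate(messages, mode):
    n = len(messages)
    s = sum(messages)
    return {"sum": s, "mean": s / n, "vpa": s / n ** 0.5}[mode]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

random.seed(0)
trials = [[random.gauss(0, 1) for _ in range(100)] for _ in range(5000)]
for mode in ("sum", "mean", "vpa"):
    print(mode, round(variance([aggregate(t, mode) for t in trials]), 2))
```

With n = 100, the empirical output variance is roughly 100 for sum, 0.01 for mean, and 1 for the variance-preserving rule, which is why the latter keeps forward and backward signal magnitudes stable across layers.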
arXiv Detail & Related papers (2024-03-07T18:52:27Z)
- Will More Expressive Graph Neural Networks do Better on Generative Tasks? [27.412913421460388]
In graph generative models, the choice of Graph Neural Network (GNN) architecture is often underexplored.
We replace the underlying GNNs of graph generative models with more expressive GNNs.
Advanced GNNs can achieve state-of-the-art results compared with 17 non-GNN-based graph generative approaches.
arXiv Detail & Related papers (2023-08-23T07:57:45Z)
- EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
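An even-polynomial filter applies only even powers of the propagation matrix, so information flows only along even-hop paths and odd-hop neighbors are ignored. A minimal sketch, with a toy path graph and filter weights of our own choosing:

```python
# Sketch of an even-polynomial spectral filter in the spirit of EvenNet:
# y = w0*x + w1*P^2 x + w2*P^4 x + ... uses only even powers of the
# propagation matrix P, i.e. only even-hop neighborhoods.
# The toy graph and weights are ours, for illustration.

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def even_filter(P, x, weights):
    """Apply y = weights[0]*x + weights[1]*P^2 x + weights[2]*P^4 x + ..."""
    y = [weights[0] * xi for xi in x]
    h = x
    for w in weights[1:]:
        h = matvec(P, matvec(P, h))  # advance two hops at a time
        y = [yi + w * hi for yi, hi in zip(y, h)]
    return y

# Path graph 0-1-2 with a symmetric normalized adjacency (toy example).
P = [[0.0, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.5, 0.0]]
x = [1.0, 0.0, 0.0]          # signal concentrated on node 0
y = even_filter(P, x, [0.5, 0.5])
print(y)  # node 1, an odd-hop neighbor of node 0, receives exactly 0
```

Note that node 1 (one hop from the source) gets no signal while node 2 (two hops away) does, which is the "ignore odd-hop neighbors" behavior in miniature.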
arXiv Detail & Related papers (2022-05-27T10:48:14Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space, for the first time, aiming to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
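Geometric scattering builds band-pass features from diffusion wavelets of the form Psi_j = P^(2^(j-1)) - P^(2^j) over a lazy random-walk matrix P; LEGS, as we read the abstract, makes the combination of such diffusion scales learnable rather than fixed. A minimal fixed-scale sketch (the graph and scales are our illustrative choices):

```python
# Sketch of the diffusion-wavelet features underlying geometric
# scattering: Psi_j = P^(2^(j-1)) - P^(2^j) is a band-pass filter built
# from powers of a lazy random-walk matrix P. LEGS (as we read it) learns
# how these diffusion scales are mixed; here the scales are fixed.

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def diffusion_powers(P, x, tmax):
    """Return [x, P x, P^2 x, ..., P^tmax x]."""
    out = [x]
    for _ in range(tmax):
        out.append(matvec(P, out[-1]))
    return out

def wavelet_coeffs(P, x, scales):
    """|Psi_j x| with Psi_j = P^(2^(j-1)) - P^(2^j), for j in scales."""
    pw = diffusion_powers(P, x, 2 ** max(scales))
    return [
        [abs(a - b) for a, b in zip(pw[2 ** (j - 1)], pw[2 ** j])]
        for j in scales
    ]

# Lazy random walk on a triangle graph: P = 0.5*I + 0.5*D^-1 A.
P = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]
x = [1.0, 0.0, 0.0]          # impulse on node 0
print(wavelet_coeffs(P, x, [1, 2]))
```

Each scale captures signal variation at a different diffusion rate, which is the band-pass behavior the LEGS module tunes adaptively.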
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
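The denoising view can be made concrete: one gradient-descent step on the objective E(F) = ||F - X||^2 + c * tr(F^T L F), starting from F = X, yields F <- (I - 2*eta*c*L) X, a Laplacian-smoothing update of the kind GNN aggregation performs. A toy sketch, with our own choice of graph, c, and step size:

```python
# One gradient step on the graph-denoising objective
#   E(F) = ||F - X||^2 + c * tr(F' L F)
# starting from F = X. The gradient at F = X is 2*c*L X, so the update is
# F <- X - 2*eta*c*L X, i.e. Laplacian smoothing of the input signal.
# Graph, c, and eta below are illustrative choices of ours.

def laplacian(adj):
    n = len(adj)
    deg = [sum(row) for row in adj]
    return [[(deg[i] if i == j else 0) - adj[i][j] for j in range(n)]
            for i in range(n)]

def denoise_step(adj, x, c=1.0, eta=0.25):
    L = laplacian(adj)
    Lx = [sum(L[i][j] * x[j] for j in range(len(x))) for i in range(len(x))]
    return [xi - 2 * eta * c * lxi for xi, lxi in zip(x, Lx)]

adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # path graph 0-1-2
x = [1.0, 0.0, 1.0]                        # "noisy" signal on the nodes
print(denoise_step(adj, x))
```

The update pulls each node toward its neighbors' values, exactly what a message-passing aggregation does; ADA-UGNN's contribution, per the summary, is letting the implied smoothness strength vary across nodes.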
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs ability in preserving graph structures.
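As we read the abstract, the plug-in augments node features with leading eigenvectors of the graph so that even shallow GNN layers see global structure. The power-iteration solver, toy graph, and single-eigenvector simplification below are our illustrative choices, not the paper's exact procedure:

```python
# Sketch of the Eigen-GNN idea as we read it: concatenate leading
# eigenvector entries of the adjacency matrix onto each node's features,
# giving shallow downstream layers access to global graph structure.
# Power iteration, the toy graph, and k=1 are our simplifications.

def power_iteration(A, steps=200):
    """Leading eigenvector of A by repeated multiplication + renormalization."""
    n = len(A)
    v = [1.0] * n
    for _ in range(steps):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def eigen_augment(A, X):
    """Append the leading-eigenvector entry to each node's feature list.
    (The real plug-in uses a top-k eigen-subspace; we keep k=1 here.)"""
    v = power_iteration(A)
    return [feats + [v[i]] for i, feats in enumerate(X)]

# Triangle 0-1-2 with a pendant node 3 attached to node 2.
A = [[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]]
X = [[1.0], [1.0], [1.0], [1.0]]           # trivial input features
print(eigen_augment(A, X))
```

The appended entries act as a structural coordinate: the best-connected node (2) gets the largest value, the pendant node (3) the smallest, so the downstream model can distinguish roles that identical raw features would hide.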
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.