Multiresolution Graph Transformers and Wavelet Positional Encoding for
Learning Hierarchical Structures
- URL: http://arxiv.org/abs/2302.08647v4
- Date: Fri, 21 Jul 2023 14:59:16 GMT
- Authors: Nhat Khang Ngo, Truong Son Hy, Risi Kondor
- Abstract summary: We propose Multiresolution Graph Transformers (MGT), the first graph transformer architecture that can learn to represent large molecules at multiple scales.
MGT can learn to produce representations for the atoms and group them into meaningful functional groups or repeating units.
Our proposed model achieves competitive results on two macromolecule datasets consisting of polymers and peptides, and one drug-like molecule dataset.
- Score: 6.875312133832078
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Contemporary graph learning algorithms are not well-defined for large
molecules since they do not consider the hierarchical interactions among the
atoms, which are essential to determine the molecular properties of
macromolecules. In this work, we propose Multiresolution Graph Transformers
(MGT), the first graph transformer architecture that can learn to represent
large molecules at multiple scales. MGT can learn to produce representations
for the atoms and group them into meaningful functional groups or repeating
units. We also introduce Wavelet Positional Encoding (WavePE), a new positional
encoding method that can guarantee localization in both spectral and spatial
domains. Our proposed model achieves competitive results on two macromolecule
datasets consisting of polymers and peptides, and one drug-like molecule
dataset. Importantly, our model outperforms other state-of-the-art methods and
achieves chemical accuracy in estimating molecular properties (e.g., GAP, HOMO
and LUMO) calculated by Density Functional Theory (DFT) in the polymers
dataset. Furthermore, the visualizations, including clustering results on
macromolecules and low-dimensional spaces of their representations, demonstrate
the capability of our methodology in learning to represent long-range and
hierarchical structures. Our PyTorch implementation is publicly available at
https://github.com/HySonLab/Multires-Graph-Transformer
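The abstract describes WavePE only at a high level (localization in both spectral and spatial domains). As an illustration of the general idea, the sketch below builds node positional features from heat-kernel wavelet filters on the normalized graph Laplacian. The filter choice, scales, and use of the operator diagonal are assumptions for illustration, not the paper's exact construction:

```python
import numpy as np

def wavelet_positional_encoding(A, scales=(1.0, 2.0, 4.0)):
    """Hypothetical sketch of a spectral graph wavelet PE.

    For each diffusion scale s, build the wavelet operator
    psi_s = U diag(g(s * lam)) U^T from the eigendecomposition of the
    normalized graph Laplacian, here with the heat kernel g(x) = exp(-x).
    The diagonal of psi_s gives one localized feature per node and scale.
    """
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    lam, U = np.linalg.eigh(L)          # eigenvalues lie in [0, 2]
    feats = []
    for s in scales:
        g = np.exp(-s * lam)            # low-pass (heat kernel) filter
        psi = (U * g) @ U.T             # wavelet operator at scale s
        feats.append(np.diag(psi))      # node-wise localized energy
    return np.stack(feats, axis=1)      # shape: (num_nodes, num_scales)

# Example: a 4-node path graph
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
pe = wavelet_positional_encoding(A)
print(pe.shape)  # (4, 3)
```

Because the heat kernel is a decaying function of the Laplacian eigenvalues, each feature lies in (0, 1] and varies smoothly with a node's local connectivity, which is the spatial-localization property the abstract attributes to WavePE.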
Related papers
- Molecular topological deep learning for polymer property prediction [18.602659324026934]
We develop molecular topological deep learning (Mol-TDL) for polymer property analysis.
Mol-TDL incorporates both high-order interactions and multiscale properties into topological deep learning architecture.
arXiv Detail & Related papers (2024-10-07T05:44:02Z)
- Molecular Property Prediction Based on Graph Structure Learning [29.516479802217205]
We propose a graph structure learning (GSL) based MPP approach, called GSL-MPP.
Specifically, we first apply graph neural network (GNN) over molecular graphs to extract molecular representations.
With molecular fingerprints, we construct a molecular similarity graph (MSG).
arXiv Detail & Related papers (2023-12-28T06:45:13Z)
- Geometry-aware Line Graph Transformer Pre-training for Molecular Property Prediction [4.598522704308923]
Geometry-aware line graph transformer (Galformer) pre-training is a novel self-supervised learning framework.
Galformer consistently outperforms all baselines on both classification and regression tasks.
arXiv Detail & Related papers (2023-09-01T14:20:48Z)
- Bi-level Contrastive Learning for Knowledge-Enhanced Molecule Representations [55.42602325017405]
We propose a novel method called GODE, which takes into account the two-level structure of individual molecules.
By pre-training two graph neural networks (GNNs) on different graph structures, combined with contrastive learning, GODE fuses molecular structures with their corresponding knowledge graph substructures.
When fine-tuned across 11 chemical property tasks, our model outperforms existing benchmarks, with an average ROC-AUC improvement of 13.8% on classification tasks and an average RMSE/MAE improvement of 35.1% on regression tasks.
arXiv Detail & Related papers (2023-06-02T15:49:45Z)
- Atomic and Subgraph-aware Bilateral Aggregation for Molecular Representation Learning [57.670845619155195]
We introduce a new model for molecular representation learning called the Atomic and Subgraph-aware Bilateral Aggregation (ASBA).
ASBA addresses the limitations of previous atom-wise and subgraph-wise models by incorporating both types of information.
Our method offers a more comprehensive way to learn representations for molecular property prediction and has broad potential in drug and material discovery applications.
arXiv Detail & Related papers (2023-05-22T00:56:00Z)
- MUDiff: Unified Diffusion for Complete Molecule Generation [104.7021929437504]
We present a new model for generating a comprehensive representation of molecules, including atom features, 2D discrete molecule structures, and 3D continuous molecule coordinates.
We propose a novel graph transformer architecture to denoise the diffusion process.
Our model is a promising approach for designing stable and diverse molecules and can be applied to a wide range of tasks in molecular modeling.
arXiv Detail & Related papers (2023-04-28T04:25:57Z)
- Learning Harmonic Molecular Representations on Riemannian Manifold [18.49126496517951]
Molecular representation learning plays a crucial role in AI-assisted drug discovery research.
We propose a Harmonic Molecular Representation learning framework, which represents a molecule using the Laplace-Beltrami eigenfunctions of its molecular surface.
arXiv Detail & Related papers (2023-03-27T18:02:47Z)
- Molecular Geometry-aware Transformer for accurate 3D Atomic System modeling [51.83761266429285]
We propose a novel Transformer architecture that takes nodes (atoms) and edges (bonds and nonbonding atom pairs) as inputs and models the interactions among them.
Moleformer achieves state-of-the-art on the initial state to relaxed energy prediction of OC20 and is very competitive in QM9 on predicting quantum chemical properties.
arXiv Detail & Related papers (2023-02-02T03:49:57Z)
- Graph-based Molecular Representation Learning [59.06193431883431]
Molecular representation learning (MRL) is a key step to build the connection between machine learning and chemical science.
Recently, MRL has achieved considerable progress, especially in methods based on deep molecular graph learning.
arXiv Detail & Related papers (2022-07-08T17:43:20Z)
- Geometric Transformer for End-to-End Molecule Properties Prediction [92.28929858529679]
We introduce a Transformer-based architecture for molecule property prediction, which is able to capture the geometry of the molecule.
We replace the classical positional encoder with an initial encoding of the molecule geometry and add a learned gated self-attention mechanism.
arXiv Detail & Related papers (2021-10-26T14:14:40Z)
- Molecular Mechanics-Driven Graph Neural Network with Multiplex Graph for Molecular Structures [20.276492931562036]
A growing number of Graph Neural Networks (GNNs) have been proposed to model molecular structures.
In this work, we aim to design a GNN that is both powerful and efficient for molecular structures.
We build the Multiplex Molecular Graph Neural Network (MXMNet).
arXiv Detail & Related papers (2020-11-15T05:55:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.