Directed Weight Neural Networks for Protein Structure Representation
Learning
- URL: http://arxiv.org/abs/2201.13299v1
- Date: Fri, 28 Jan 2022 13:41:56 GMT
- Title: Directed Weight Neural Networks for Protein Structure Representation
Learning
- Authors: Jiahan Li, Shitong Luo, Congyue Deng, Chaoran Cheng, Jiaqi Guan,
Leonidas Guibas, Jian Peng, Jianzhu Ma
- Abstract summary: We propose the Directed Weight Neural Network for better capturing geometric relations among different amino acids.
Our new framework supports a rich set of geometric operations on both classical and SO(3)-representation features.
It achieves state-of-the-art performance on various computational biology applications related to protein 3D structures.
- Score: 16.234990522729348
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A protein performs biological functions by folding to a particular 3D
structure. To accurately model the protein structures, both the overall
geometric topology and local fine-grained relations between amino acids (e.g.
side-chain torsion angles and inter-amino-acid orientations) should be
carefully considered. In this work, we propose the Directed Weight Neural
Network for better capturing geometric relations among different amino acids.
Extending a single weight from a scalar to a 3D directed vector, our new
framework supports a rich set of geometric operations on both classical and
SO(3)-representation features, on top of which we construct a perceptron unit
for processing amino-acid information. In addition, we introduce an equivariant
message passing paradigm on proteins for plugging the directed weight
perceptrons into existing Graph Neural Networks, showing superior versatility
in maintaining SO(3)-equivariance at the global scale. Experiments show that
our network has remarkably better expressiveness in representing geometric
relations in comparison to classical neural networks and (globally)
equivariant networks. It also achieves state-of-the-art performance on various
computational biology applications related to protein 3D structures.
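
To make the abstract's core idea more concrete, the following is a minimal, self-contained sketch (in Python, not the authors' implementation) of a perceptron that carries both scalar and 3D vector channels and updates them only through rotation-equivariant operations. The channel sizes, the norm-based gating, and the nonlinearities are illustrative assumptions rather than the paper's exact directed-weight operations.

import numpy as np

class ScalarVectorPerceptron:
    """Toy layer with scalar channels s and 3D vector channels V.

    Vector channels are mixed only by learned linear combinations over the
    channel axis and gated by invariant norms, so the layer commutes with
    global rotations; scalar channels are updated from invariant quantities.
    """

    def __init__(self, n_s_in, n_v_in, n_s_out, n_v_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_v = rng.normal(size=(n_v_out, n_v_in)) / np.sqrt(n_v_in)
        self.W_s = rng.normal(size=(n_s_out, n_s_in + n_v_out)) / np.sqrt(n_s_in + n_v_out)
        self.b_s = np.zeros(n_s_out)

    def __call__(self, s, V):
        # s: (n_s_in,) scalar features; V: (n_v_in, 3) vector features.
        V_mix = self.W_v @ V                       # (n_v_out, 3), rotation-equivariant
        norms = np.linalg.norm(V_mix, axis=-1)     # (n_v_out,), rotation-invariant
        s_out = np.tanh(self.W_s @ np.concatenate([s, norms]) + self.b_s)
        gate = 1.0 / (1.0 + np.exp(-norms))        # invariant gate per vector channel
        return s_out, gate[:, None] * V_mix

# Sanity check: rotating the input vectors rotates the output vectors and
# leaves the scalar outputs unchanged (up to floating-point error).
layer = ScalarVectorPerceptron(n_s_in=8, n_v_in=4, n_s_out=8, n_v_out=4)
rng = np.random.default_rng(1)
s, V = rng.normal(size=8), rng.normal(size=(4, 3))
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))       # random orthogonal matrix
s_rot, V_rot = layer(s, V @ R.T)
s_ref, V_ref = layer(s, V)
assert np.allclose(s_rot, s_ref) and np.allclose(V_rot, V_ref @ R.T)

In the paper itself the weights are extended to 3D directed vectors, enabling richer dot- and cross-product style operations; the sketch above only mirrors the broader design constraint of keeping scalar updates invariant and vector updates equivariant.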
Related papers
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- A new perspective on building efficient and expressive 3D equivariant graph neural networks [39.0445472718248]
We propose a hierarchy of 3D isomorphism to evaluate the expressive power of equivariant GNNs.
Our work leads to two crucial modules for designing expressive and efficient geometric GNNs.
To demonstrate the applicability of our theory, we propose LEFTNet which effectively implements these modules.
arXiv Detail & Related papers (2023-04-07T18:08:27Z)
- EquiPocket: an E(3)-Equivariant Geometric Graph Neural Network for Ligand Binding Site Prediction [49.674494450107005]
Predicting the binding sites of target proteins plays a fundamental role in drug discovery.
Most existing deep-learning methods consider a protein as a 3D image by spatially clustering its atoms into voxels.
This work proposes EquiPocket, an E(3)-equivariant Graph Neural Network (GNN) for binding site prediction.
arXiv Detail & Related papers (2023-02-23T17:18:26Z)
- Predicting Protein-Ligand Binding Affinity with Equivariant Line Graph Network [22.396125176265997]
Existing approaches transform a 3D protein-ligand complex to a two-dimensional (2D) graph, and then use graph neural networks (GNNs) to predict its binding affinity.
We propose a novel Equivariant Line Graph Network (ELGN) for affinity prediction of 3D protein-ligand complexes.
Experimental results on two real datasets demonstrate the effectiveness of ELGN over several state-of-the-art baselines.
arXiv Detail & Related papers (2022-10-27T02:15:52Z)
- Graph Spectral Embedding using the Geodesic Betweenness Centrality [76.27138343125985]
We introduce the Graph Sylvester Embedding (GSE), an unsupervised graph representation of local similarity, connectivity, and global structure.
GSE uses the solution of the Sylvester equation to capture both network structure and neighborhood proximity in a single representation; a generic Sylvester-equation sketch appears after this list.
arXiv Detail & Related papers (2022-05-07T04:11:23Z)
- Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, as in Transformers.
We observe a performance increase on molecular datasets, from 2.87% up to 64.14%, when considering learnable PE for both GNN classes.
arXiv Detail & Related papers (2021-10-15T05:59:15Z)
- Orthogonal Graph Neural Networks [53.466187667936026]
Graph neural networks (GNNs) have received tremendous attention due to their superiority in learning node representations.
However, stacking more convolutional layers significantly decreases the performance of GNNs.
We propose Ortho-GConv, a novel module that can augment existing GNN backbones to stabilize model training and improve generalization performance.
arXiv Detail & Related papers (2021-09-23T12:39:01Z)
- G-VAE, a Geometric Convolutional VAE for Protein Structure Generation [41.66010308405784]
We introduce a joint geometric and neural-network approach for comparing, deforming, and generating 3D protein structures.
Our method is able to generate plausible structures that differ from those in the training data.
arXiv Detail & Related papers (2021-06-22T16:52:48Z)
- Spherical convolutions on molecular graphs for protein model quality assessment [0.0]
In this work, we propose Spherical Graph Convolutional Network (S-GCN) that processes 3D models of proteins represented as molecular graphs.
Within the framework of the protein model quality assessment problem, we demonstrate that the proposed spherical convolution method significantly improves the quality of model assessment.
arXiv Detail & Related papers (2020-11-16T14:22:36Z)
- Learning from Protein Structure with Geometric Vector Perceptrons [6.5360079597553025]
We introduce geometric vector perceptrons, which extend standard dense layers to operate on collections of Euclidean vectors.
We demonstrate our approach on two important problems in learning from protein structure: model quality assessment and computational protein design.
arXiv Detail & Related papers (2020-09-03T01:54:25Z)
- BERTology Meets Biology: Interpreting Attention in Protein Language Models [124.8966298974842]
We demonstrate methods for analyzing protein Transformer models through the lens of attention.
We show that attention captures the folding structure of proteins, connecting amino acids that are far apart in the underlying sequence, but spatially close in the three-dimensional structure.
We also present a three-dimensional visualization of the interaction between attention and protein structure.
arXiv Detail & Related papers (2020-06-26T21:50:17Z)
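
A generic illustration of the Sylvester equation referenced in the Graph Sylvester Embedding entry above: the sketch below solves AX + XB = Q with SciPy on a toy graph. The choice of a shifted Laplacian for A and B and of an adjacency-based Q is an assumption for demonstration, not the construction used by that paper.

import numpy as np
from scipy.linalg import solve_sylvester

# Small undirected example graph given by its adjacency matrix (4 nodes).
adj = np.array([[0., 1., 1., 0.],
                [1., 0., 1., 0.],
                [1., 1., 0., 1.],
                [0., 0., 1., 0.]])
laplacian = np.diag(adj.sum(axis=1)) - adj

# Shift by the identity so every eigenvalue is strictly positive, which keeps
# the Sylvester operator A X + X B nonsingular when A = B.
A = B = laplacian + np.eye(4)
Q = adj + np.eye(4)                     # illustrative pairwise-similarity matrix

X = solve_sylvester(A, B, Q)            # solves A X + X B = Q
print(np.round(X, 3))                   # rows of X serve as toy node features here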