Higher-Order Message Passing for Glycan Representation Learning
- URL: http://arxiv.org/abs/2409.13467v2
- Date: Tue, 22 Oct 2024 11:32:23 GMT
- Title: Higher-Order Message Passing for Glycan Representation Learning
- Authors: Roman Joeres, Daniel Bojar
- Abstract summary: Graph Neural Networks (GNNs) are deep learning models designed to process and analyze graph-structured data.
This work presents a new model architecture based on combinatorial complexes and higher-order message passing that extracts features from glycan structures into a latent space representation.
We envision that these improvements will spur further advances in computational glycosciences and reveal the roles of glycans in biology.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Glycans are the most complex class of biological sequences, with monosaccharides forming extended, non-linear sequences. As post-translational modifications, they modulate protein structure, function, and interactions. Due to their diversity and complexity, predictive models of glycan properties and functions are still insufficient. Graph Neural Networks (GNNs) are deep learning models designed to process and analyze graph-structured data. These architectures leverage the connectivity and relational information in graphs to learn effective representations of nodes, edges, and entire graphs. By iteratively aggregating information from neighboring nodes, GNNs capture complex patterns within graph data, making them particularly well-suited for tasks such as link prediction and graph classification across domains. This work presents a new model architecture based on combinatorial complexes and higher-order message passing to extract features from glycan structures into a latent space representation. The architecture is evaluated on an improved GlycanML benchmark suite, establishing new state-of-the-art performance. We envision that these improvements will spur further advances in computational glycosciences and reveal the roles of glycans in biology.
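The abstract describes two ingredients: standard message passing, in which each node iteratively aggregates information from its neighbors, and its higher-order generalization, in which messages also flow between cells of different ranks in a combinatorial complex. The sketch below illustrates one plausible form of such a higher-order layer, assuming PyTorch; the class name, feature sizes, toy incidence matrix, and the shared update function are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class HigherOrderMPLayer(nn.Module):
    """One round of message passing between cells of two adjacent ranks.

    Rank-0 cells ~ monosaccharides, rank-1 cells ~ glycosidic linkages,
    higher ranks ~ larger substructures such as branches or motifs.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.up = nn.Linear(dim, dim)      # messages from lower- to higher-rank cells
        self.down = nn.Linear(dim, dim)    # messages from higher- to lower-rank cells
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, x_lo, x_hi, incidence):
        # incidence: (num_lo, num_hi) 0/1 matrix; entry (i, j) = 1 when
        # low-rank cell i is contained in high-rank cell j.
        B = incidence.float()
        msg_to_hi = B.t() @ self.up(x_lo)    # sum messages over incident lower cells
        msg_to_lo = B @ self.down(x_hi)      # sum messages over incident higher cells
        new_lo = torch.relu(self.update(torch.cat([x_lo, msg_to_lo], dim=-1)))
        new_hi = torch.relu(self.update(torch.cat([x_hi, msg_to_hi], dim=-1)))
        return new_lo, new_hi


# Toy glycan fragment: 3 monosaccharides joined by 2 glycosidic linkages.
dim = 16
x_nodes = torch.randn(3, dim)                # rank-0 cell features
x_links = torch.randn(2, dim)                # rank-1 cell features
incidence = torch.tensor([[1, 0],            # linkage 0 joins monosaccharides 0 and 1,
                          [1, 1],            # linkage 1 joins monosaccharides 1 and 2
                          [0, 1]])

layer = HigherOrderMPLayer(dim)
x_nodes, x_links = layer(x_nodes, x_links, incidence)
print(x_nodes.shape, x_links.shape)          # torch.Size([3, 16]) torch.Size([2, 16])
```

A full model would stack several such layers across all ranks of the complex and pool the resulting cell features into a single glycan-level embedding for downstream property prediction.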
Related papers
- Learning From Graph-Structured Data: Addressing Design Issues and Exploring Practical Applications in Graph Representation Learning [2.492884361833709]
We present an exhaustive review of the latest advancements in graph representation learning and Graph Neural Networks (GNNs).
GNNs, tailored to handle graph-structured data, excel in deriving insights and predictions from intricate relational information.
Our work delves into the capabilities of GNNs, examining their foundational designs and their application in addressing real-world challenges.
arXiv Detail & Related papers (2024-11-09T19:10:33Z)
- Self-Supervised Graph Neural Networks for Enhanced Feature Extraction in Heterogeneous Information Networks [16.12856816023414]
This paper explores the applications and challenges of graph neural networks (GNNs) in processing complex graph data brought about by the rapid development of the Internet.
By introducing a self-supervised mechanism, the approach is expected to improve the adaptability of existing models to the diversity and complexity of graph data.
arXiv Detail & Related papers (2024-10-23T07:14:37Z)
- Harnessing Collective Structure Knowledge in Data Augmentation for Graph Neural Networks [25.12261412297796]
Graph neural networks (GNNs) have achieved state-of-the-art performance in graph representation learning.
We propose a novel approach, namely the collective structure knowledge-augmented graph neural network (CoS-GNN).
arXiv Detail & Related papers (2024-05-17T08:50:00Z)
- Learning Invariant Representations of Graph Neural Networks via Cluster Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when a structure shift occurs.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
arXiv Detail & Related papers (2024-03-06T10:36:56Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, HGCL, which introduces two views guided respectively by node attributes and graph topology (a generic two-view contrastive sketch follows this list).
In this approach, distinct attribute and topology fusion mechanisms are adopted in the two views, which helps mine the relevant information in attributes and topology separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Structural Landmarking and Interaction Modelling: on Resolution Dilemmas in Graph Classification [50.83222170524406]
We study the intrinsic difficulty in graph classification under the unified concept of "resolution dilemmas".
We propose "SLIM", an inductive neural network model for Structural Landmarking and Interaction Modelling.
arXiv Detail & Related papers (2020-06-29T01:01:42Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
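Several of the entries above, HGCL in particular, contrast two views of the same graph. The sketch below shows a standard NT-Xent-style objective between per-node embeddings from an attribute-oriented and a topology-oriented encoder; the encoder outputs z_attr and z_topo, the temperature, and the tensor shapes are hypothetical and do not reproduce the cited paper's method.

```python
import torch
import torch.nn.functional as F


def two_view_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Treat node i in view 1 and node i in view 2 as a positive pair and all
    other cross-view nodes as negatives, symmetrized over both directions."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                 # (N, N) cross-view similarities
    targets = torch.arange(z1.size(0))         # positives lie on the diagonal
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))


# Hypothetical outputs of an attribute-guided and a topology-guided encoder.
z_attr = torch.randn(8, 32)
z_topo = torch.randn(8, 32)
print(float(two_view_contrastive_loss(z_attr, z_topo)))
```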