A Novel Graph Transformer Framework for Gene Regulatory Network Inference
- URL: http://arxiv.org/abs/2504.16961v1
- Date: Wed, 23 Apr 2025 06:24:26 GMT
- Title: A Novel Graph Transformer Framework for Gene Regulatory Network Inference
- Authors: Binon Teji, Swarup Roy
- Abstract summary: Inference of gene regulatory networks (GRNs) from gene coexpression data may not always reflect true biological interactions. Most GRN inference methods face several challenges in the network reconstruction phase. We employ autoencoder embeddings to capture gene expression patterns directly from raw data, and we embed the prior knowledge from known GRN structures by transforming them into a text-like representation.
- Score: 0.27624021966289597
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The inference of gene regulatory networks (GRNs) is a foundational stride towards deciphering the fundamentals of complex biological systems. Inferring a possible regulatory link between two genes can be formulated as a link prediction problem. Inference of GRNs from gene coexpression profiling data may not always reflect true biological interactions, owing to its susceptibility to noise and its tendency to misrepresent true regulatory relationships. Most GRN inference methods face several challenges in the network reconstruction phase. It is therefore important to encode gene expression values and to leverage both the prior knowledge available in previously inferred network structures and the positional information of the input network nodes, so as to arrive at a better and more confident GRN reconstruction. In this paper, we explore the integration of multiple inferred networks to enhance the inference of Gene Regulatory Networks (GRNs). First, we employ autoencoder embeddings to capture gene expression patterns directly from raw data, preserving intricate biological signals. Next, we embed the prior knowledge contained in known GRN structures by transforming them into a text-like representation using random walks, which is then encoded with a masked language model, BERT, to generate global embeddings for each gene across all networks. Additionally, we embed positional encodings of the input gene networks to better identify the position of each unique gene within the graph. These embeddings are integrated into a graph transformer-based model, termed GT-GRN, for GRN inference. GT-GRN effectively utilizes the topological structure of the ground-truth network while incorporating the enriched encoded information. Experimental results demonstrate that GT-GRN significantly outperforms existing GRN inference methods, achieving superior accuracy and highlighting the robustness of our approach.
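The random-walk step described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the gene names, walk parameters, and uniform neighbour sampling are all assumptions, and in the paper the resulting "sentences" would then be tokenized and encoded by BERT.

```python
import random

def random_walk_sentences(adjacency, walk_length=5, walks_per_node=2, seed=0):
    """Turn a directed GRN into text-like 'sentences' of gene names.

    adjacency: dict mapping each gene to a list of its target genes.
    Returns one space-separated string per walk, suitable as input
    to a masked language model such as BERT.
    """
    rng = random.Random(seed)
    sentences = []
    for start in adjacency:
        for _ in range(walks_per_node):
            walk = [start]
            current = start
            for _ in range(walk_length - 1):
                neighbours = adjacency.get(current, [])
                if not neighbours:  # dead end: stop this walk early
                    break
                current = rng.choice(neighbours)
                walk.append(current)
            sentences.append(" ".join(walk))
    return sentences

# Hypothetical toy GRN: regulator -> regulated targets
grn = {
    "TF1": ["GeneA", "GeneB"],
    "GeneA": ["GeneC"],
    "GeneB": ["GeneC"],
    "GeneC": [],
}
for sentence in random_walk_sentences(grn):
    print(sentence)
```

Each walk respects edge direction, so the resulting sentences preserve regulatory order; running several walks per gene gives the language model multiple contexts for the same gene.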
Related papers
- GRNFormer: A Biologically-Guided Framework for Integrating Gene Regulatory Networks into RNA Foundation Models [39.58414436685698]
We propose a new framework that integrates multi-scale Gene Regulatory Networks (GRNs) inferred from multi-omics data into RNA foundation model training.
GRNFormer achieves consistent improvements over state-of-the-art (SoTA) baselines.
arXiv Detail & Related papers (2025-03-03T15:56:39Z)
- Cross-Attention Graph Neural Networks for Inferring Gene Regulatory Networks with Skewed Degree Distribution [9.919024883502322]
We propose the Cross-Attention Complex Dual Graph Embedding Model (XATGRN).
Our model consistently outperforms existing state-of-the-art methods across various datasets.
arXiv Detail & Related papers (2024-12-18T10:56:40Z)
- Analysis of Gene Regulatory Networks from Gene Expression Using Graph Neural Networks [0.4369058206183195]
This study explores the use of Graph Neural Networks (GNNs), a powerful approach for modeling graph-structured data like Gene Regulatory Networks (GRNs).
The model's adeptness in accurately predicting regulatory interactions and pinpointing key regulators is attributed to advanced attention mechanisms.
The integration of GNNs in GRN research is set to pioneer developments in personalized medicine, drug discovery, and our grasp of biological systems.
arXiv Detail & Related papers (2024-09-20T17:16:14Z)
- Gene Regulatory Network Inference from Pre-trained Single-Cell Transcriptomics Transformer with Joint Graph Learning [10.44434676119443]
Inferring gene regulatory networks (GRNs) from single-cell RNA sequencing (scRNA-seq) data is a complex challenge.
In this study, we tackle this challenge by leveraging the single-cell BERT-based pre-trained transformer model (scBERT).
We introduce a novel joint graph learning approach that combines the rich contextual representations learned by single-cell language models with the structured knowledge encoded in GRNs.
arXiv Detail & Related papers (2024-07-25T16:42:08Z)
- Stability Analysis of Non-Linear Classifiers using Gene Regulatory Neural Network for Biological AI [2.0755366440393743]
We develop a mathematical model of gene-perceptron using a dual-layered transcription-translation chemical reaction model.
We perform stability analysis for each gene-perceptron within the fully-connected GRNN sub network to determine temporal as well as stable concentration outputs.
arXiv Detail & Related papers (2023-09-14T21:37:38Z)
- Domain-adaptive Message Passing Graph Neural Network [67.35534058138387]
Cross-network node classification (CNNC) aims to classify nodes in a label-deficient target network by transferring the knowledge from a source network with abundant labels.
We propose a domain-adaptive message passing graph neural network (DM-GNN), which integrates graph neural network (GNN) with conditional adversarial domain adaptation.
arXiv Detail & Related papers (2023-08-31T05:26:08Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
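The one-scalar-per-relation idea lends itself to a compact sketch: each edge type (and the self-loop) gets a single scalar weight on its messages before a shared linear transform. Everything below (function names, the toy graph, the chosen weights) is illustrative, not taken from the RE-GNN paper.

```python
import numpy as np

def re_gnn_layer(x, edges_by_relation, rel_weight, self_weight, W):
    """One heterogeneous message-passing layer with a scalar per relation.

    x: (n, d) node features.
    edges_by_relation: {relation: [(src, dst), ...]} directed edges.
    rel_weight: {relation: scalar} -- the single parameter per edge type.
    self_weight: scalar for the self-loop contribution.
    W: shared (d, d_out) weight matrix used for every relation.
    """
    agg = self_weight * x.copy()                    # self-loop contribution
    for rel, edges in edges_by_relation.items():
        for src, dst in edges:
            agg[dst] += rel_weight[rel] * x[src]    # one scalar per relation
    return np.maximum(agg @ W, 0.0)                 # shared transform + ReLU

x = np.eye(3)  # 3 nodes with one-hot features
edges = {"activates": [(0, 1)], "represses": [(1, 2)]}
out = re_gnn_layer(x, edges,
                   rel_weight={"activates": 1.0, "represses": 0.5},
                   self_weight=1.0, W=np.eye(3))
```

The design point is parameter efficiency: a homogeneous GNN layer is reused across edge types, and only the per-relation scalars distinguish them.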
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose an adaptive multi-channel graph convolutional network for semi-supervised classification (AM-GCN).
Our experiments show that AM-GCN effectively extracts the most correlated information from both node features and topological structures.
arXiv Detail & Related papers (2020-07-05T08:16:03Z)
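The adaptive multi-channel fusion described for AM-GCN can be sketched as per-node attention over two embedding channels, one learned from the topology graph and one from a feature-similarity graph. The function names, toy inputs, and attention parameterization below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def fuse_channels(z_topo, z_feat, q, W):
    """Fuse two embedding channels with per-node attention weights.

    z_topo, z_feat: (n, d) node embeddings from the two channels.
    q: (d,) shared attention vector; W: (d, d) shared projection.
    Returns the fused (n, d) embeddings and the (n, 2) attention weights.
    """
    s_topo = np.tanh(z_topo @ W) @ q                  # per-node channel score
    s_feat = np.tanh(z_feat @ W) @ q
    alpha = softmax(np.stack([s_topo, s_feat], axis=1), axis=1)
    fused = alpha[:, :1] * z_topo + alpha[:, 1:] * z_feat
    return fused, alpha

rng = np.random.default_rng(0)
z_t = rng.normal(size=(4, 8))                         # topology-channel embeddings
z_f = rng.normal(size=(4, 8))                         # feature-channel embeddings
fused, alpha = fuse_channels(z_t, z_f, q=rng.normal(size=8), W=np.eye(8))
```

Because the attention weights are computed per node, each node can lean on whichever channel carries more signal for it, which is the adaptivity the entry above refers to.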
This list is automatically generated from the titles and abstracts of the papers on this site.