Prediction of gene expression time series and structural analysis of
gene regulatory networks using recurrent neural networks
- URL: http://arxiv.org/abs/2109.05849v1
- Date: Mon, 13 Sep 2021 10:30:21 GMT
- Title: Prediction of gene expression time series and structural analysis of
gene regulatory networks using recurrent neural networks
- Authors: Michele Monti, Jonathan Fiorentino, Edoardo Milanetti, Giorgio Gosti,
Gian Gaetano Tartaglia
- Abstract summary: This work provides a way to understand and exploit the attention mechanism of RNNs.
It paves the way to RNN-based methods for time series prediction and inference of GRNs from gene expression data.
- Score: 0.0
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Methods for time series prediction and classification of gene regulatory
networks (GRNs) from gene expression data have been treated separately so far.
The recent emergence of attention-based recurrent neural network (RNN) models
boosted the interpretability of RNN parameters, making them appealing for the
understanding of gene interactions. In this work, we generated synthetic time
series gene expression data from a range of archetypal GRNs and we relied on a
dual attention RNN to predict the gene temporal dynamics. We show that the
prediction is extremely accurate for GRNs with different architectures. Next,
we focused on the attention mechanism of the RNN and, using tools from graph
theory, we found that its graph properties allow us to hierarchically distinguish
different architectures of the GRN. We show that the GRNs respond differently
to the addition of noise in the RNN prediction, and we relate the noise
response to the analysis of the attention mechanism. In conclusion, this work
provides a way to understand and exploit the attention mechanism of RNNs and
it paves the way to RNN-based methods for time series prediction and inference
of GRNs from gene expression data.
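A minimal sketch of the attention-as-graph analysis described in the abstract, assuming the trained model's gene-to-gene attention weights are available as a matrix; the random matrix, threshold, and metric choices below are illustrative assumptions, not the authors' exact pipeline:

```python
# Treat a dual-attention RNN's learned gene-to-gene attention weights as a
# weighted directed graph and compute graph properties that may separate GRN
# architectures. The attention matrix here is random; in practice it would
# come from a trained model.
import numpy as np
import networkx as nx

n_genes = 10
rng = np.random.default_rng(0)
attention = rng.random((n_genes, n_genes))          # stand-in for learned weights
attention /= attention.sum(axis=1, keepdims=True)   # row-normalize, as attention is

# Keep only the strongest edges to obtain a sparse putative GRN.
threshold = np.quantile(attention, 0.8)
adj = np.where(attention >= threshold, attention, 0.0)

G = nx.from_numpy_array(adj, create_using=nx.DiGraph)

# Illustrative graph-theoretic descriptors of the inferred structure.
print("density:", nx.density(G))
print("top hub genes:", sorted(nx.degree_centrality(G).items(),
                               key=lambda kv: -kv[1])[:3])
```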
Related papers
- Analysis of Gene Regulatory Networks from Gene Expression Using Graph Neural Networks [0.4369058206183195]
This study explores the use of Graph Neural Networks (GNNs), a powerful approach for modeling graph-structured data like Gene Regulatory Networks (GRNs).
The model's adeptness in accurately predicting regulatory interactions and pinpointing key regulators is attributed to advanced attention mechanisms.
The integration of GNNs in GRN research is set to pioneer developments in personalized medicine, drug discovery, and our grasp of biological systems.
arXiv Detail & Related papers (2024-09-20T17:16:14Z)
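A hedged sketch of how an attention-based GNN might score candidate regulatory interactions, as in the study above; the PyTorch Geometric architecture, layer sizes, and dot-product decoder are assumptions, not the study's actual model:

```python
# Illustrative only: an attention-based GNN that scores candidate
# regulator -> target edges. Layer sizes and the dot-product decoder
# are assumptions, not the paper's model.
import torch
from torch_geometric.nn import GATConv

class GRNEdgeScorer(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim=32, heads=4):
        super().__init__()
        self.conv1 = GATConv(in_dim, hidden_dim, heads=heads)
        self.conv2 = GATConv(hidden_dim * heads, hidden_dim, heads=1)

    def forward(self, x, edge_index, candidate_edges):
        h = torch.relu(self.conv1(x, edge_index))
        h = self.conv2(h, edge_index)
        src, dst = candidate_edges            # pairs of genes to score
        return (h[src] * h[dst]).sum(dim=-1)  # one logit per candidate edge
```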
- Inference of dynamical gene regulatory networks from single-cell data with physics informed neural networks [0.0]
We show how physics-informed neural networks (PINNs) can be used to infer the parameters of predictive, dynamical GRNs.
Specifically we study GRNs that exhibit bifurcation behavior and can therefore model cell differentiation.
arXiv Detail & Related papers (2024-01-14T21:43:10Z)
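A minimal sketch of the PINN idea above, assuming a hypothetical toggle-switch ODE with one trainable parameter; the network, loss weighting, and dynamics are illustrative, not the paper's setup:

```python
# A network maps time t to expression levels x(t); the loss combines a data
# fit with the residual of an assumed toggle-switch ODE whose parameter
# alpha is learned. The ODE form and all names are illustrative assumptions.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 2)
)
log_alpha = torch.nn.Parameter(torch.zeros(()))  # trainable ODE parameter
opt = torch.optim.Adam(list(net.parameters()) + [log_alpha], lr=1e-3)

t_data = torch.linspace(0.0, 10.0, 50).unsqueeze(1)
x_data = torch.rand(50, 2)  # placeholder for single-cell measurements

for _ in range(1000):
    t = t_data.clone().requires_grad_(True)
    x = net(t)
    # time derivative of each gene via autograd
    dxdt = torch.stack(
        [torch.autograd.grad(x[:, i].sum(), t, create_graph=True)[0][:, 0]
         for i in range(2)], dim=1)
    alpha = log_alpha.exp()
    # mutual-repression (toggle switch) dynamics with Hill coefficient 2
    rhs = torch.stack([alpha / (1 + x[:, 1] ** 2) - x[:, 0],
                       alpha / (1 + x[:, 0] ** 2) - x[:, 1]], dim=1)
    loss = ((net(t_data) - x_data) ** 2).mean() + ((dxdt - rhs) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```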
- Neural Tangent Kernels Motivate Graph Neural Networks with Cross-Covariance Graphs [94.44374472696272]
We investigate NTKs and alignment in the context of graph neural networks (GNNs).
Our results establish theoretical guarantees on the optimality of the alignment for a two-layer GNN.
These guarantees are characterized by the graph shift operator being a function of the cross-covariance between the input and the output data.
arXiv Detail & Related papers (2023-10-16T19:54:21Z)
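For intuition, a small sketch of using the empirical input-output cross-covariance as the graph shift operator of a polynomial graph filter; this simplified, untrained construction is an assumption, not the paper's exact one:

```python
# Estimate the input-output cross-covariance and use it as the graph shift
# operator S of a polynomial graph filter, the basic GNN building block.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_nodes = 200, 8
X = rng.normal(size=(n_samples, n_nodes))          # input graph signals
Y = X @ rng.normal(size=(n_nodes, n_nodes)) * 0.5  # correlated outputs

# Empirical cross-covariance between input and output signals.
S = (X - X.mean(0)).T @ (Y - Y.mean(0)) / (n_samples - 1)
S = (S + S.T) / 2                                  # symmetrize for a valid shift

w = rng.normal(size=(n_nodes,)) * 0.01             # filter taps (untrained here)
def graph_filter(x):
    # Polynomial graph filter: sum_k w[k] * S^k x.
    out, Sk = np.zeros_like(x), x.copy()
    for wk in w:
        out += wk * Sk
        Sk = S @ Sk
    return out

y_hat = graph_filter(X[0])  # one-shot filtering of a single graph signal
```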
- Stability Analysis of Non-Linear Classifiers using Gene Regulatory Neural Network for Biological AI [2.0755366440393743]
We develop a mathematical model of gene-perceptron using a dual-layered transcription-translation chemical reaction model.
We perform stability analysis for each gene-perceptron within the fully-connected GRNN sub-network to determine temporal as well as stable concentration outputs.
arXiv Detail & Related papers (2023-09-14T21:37:38Z)
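A rough sketch of this kind of stability analysis for a single hypothetical gene-perceptron, judging the steady state by the Jacobian's eigenvalues; all rate constants and the Hill-type input are assumed for illustration:

```python
# Two-stage transcription-translation ODE for one gene-perceptron; the
# steady state is stable if all Jacobian eigenvalues have negative real part.
import numpy as np
from scipy.optimize import fsolve

k_m, d_m, k_p, d_p, u = 2.0, 0.5, 1.0, 0.2, 1.5  # assumed rates and input

def rhs(z):
    m, p = z
    f_u = u**2 / (1 + u**2)            # Hill-type activation by inputs
    return [k_m * f_u - d_m * m,       # mRNA transcription / degradation
            k_p * m - d_p * p]         # protein translation / degradation

steady = fsolve(rhs, [1.0, 1.0])
J = np.array([[-d_m, 0.0],
              [k_p, -d_p]])            # Jacobian of the dynamics above
print("steady state (mRNA, protein):", steady)
print("stable:", np.all(np.linalg.eigvals(J).real < 0))
```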
- On Neural Networks as Infinite Tree-Structured Probabilistic Graphical Models [47.91322568623835]
We propose an innovative solution by constructing infinite tree-structured PGMs that correspond exactly to neural networks.
Our research reveals that DNNs, during forward propagation, indeed perform approximations of PGM inference that are precise in this alternative PGM structure.
arXiv Detail & Related papers (2023-05-27T21:32:28Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
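A toy sketch of decomposition-based attribution in the spirit of DEGREE, restricted to a linear message-passing layer where the decomposition is exact; DEGREE itself handles full nonlinear GNNs, so this is only an illustration:

```python
# For a linear message-passing layer H' = A_hat @ H @ W, the update of node
# i splits exactly into additive contributions from each source node j.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3
A_hat = rng.random((n, n)); A_hat /= A_hat.sum(1, keepdims=True)  # norm. adjacency
H = rng.normal(size=(n, d))                                       # node features
W = rng.normal(size=(d, d))                                       # layer weights

i = 0
contrib = {j: A_hat[i, j] * (H[j] @ W) for j in range(n)}  # per-source terms
total = sum(contrib.values())
assert np.allclose(total, (A_hat @ H @ W)[i])  # the decomposition is exact here
```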
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- The Surprising Power of Graph Neural Networks with Random Node Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI).
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
arXiv Detail & Related papers (2020-10-02T19:53:05Z)
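The core of random node initialization is simple enough to sketch directly: append freshly drawn random features to each node before running an otherwise standard GNN (the helper name and feature width below are arbitrary):

```python
# Random node initialization (RNI): the extra random features let the model
# break symmetries between structurally identical nodes that defeat the
# Weisfeiler-Leman test. The GNN itself is omitted; the augmentation is the trick.
import torch

def add_rni(x: torch.Tensor, n_random: int = 8) -> torch.Tensor:
    """x: [num_nodes, num_features] -> [num_nodes, num_features + n_random]."""
    r = torch.rand(x.size(0), n_random)   # resampled every forward pass
    return torch.cat([x, r], dim=1)

x = torch.ones(4, 3)       # four structurally identical nodes...
print(add_rni(x))          # ...now individually distinguishable
```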
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
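A loose sketch of that stochastic augmentation: propagating random node embeddings over the graph gives nearby nodes correlated codes that encode proximity; the propagation depth and dimensions here are illustrative assumptions, not the paper's construction:

```python
# Propagate random node embeddings through the graph so nearby nodes end up
# with similar codes, then use these alongside the usual permutation-
# equivariant message-passing features downstream.
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 4
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                          # undirected adjacency
A_hat = A / np.maximum(A.sum(1, keepdims=True), 1)

E = rng.normal(size=(n, k))                     # stochastic node representations
for _ in range(2):                              # short propagation encodes proximity
    E = A_hat @ E

print(np.round(E @ E.T, 2))                     # proximity shows up as similar rows
```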
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Gated Graph Recurrent Neural Networks [176.3960927323358]
We introduce Graph Recurrent Neural Networks (GRNNs) as a general learning framework for graph processes.
To address the problem of vanishing gradients, we put forward GRNNs with three different gating mechanisms: time, node and edge gates.
The numerical results also show that GRNNs outperform GNNs and RNNs, highlighting the importance of taking both the temporal and graph structures of a graph process into account.
arXiv Detail & Related papers (2020-02-03T22:35:14Z)
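A minimal graph recurrent cell with a single time gate, in the spirit of the gating described above; the paper proposes time, node, and edge gates, so this reduced variant is an assumption, not its exact formulation:

```python
# Graph recurrent cell: neighbors' hidden states enter through the
# normalized adjacency A_hat, and a learned gate controls memory over time.
import torch

class TimeGatedGRNNCell(torch.nn.Module):
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.W_in = torch.nn.Linear(in_dim, hid_dim)
        self.W_h = torch.nn.Linear(hid_dim, hid_dim, bias=False)
        self.gate = torch.nn.Linear(in_dim + hid_dim, 1)

    def forward(self, A_hat, x_t, h_prev):
        # Graph-aware candidate state: mix each node's input with its
        # neighbors' previous states.
        h_tilde = torch.tanh(self.W_in(x_t) + A_hat @ self.W_h(h_prev))
        g = torch.sigmoid(self.gate(torch.cat([x_t, h_prev], dim=-1)))
        return g * h_tilde + (1 - g) * h_prev  # time gate vs. retained memory
```

Unrolling such a cell over a sequence of graph signals yields the recurrent architecture the summary describes, with both temporal and graph structure taken into account.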
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.