Inference of dynamical gene regulatory networks from single-cell data
with physics informed neural networks
- URL: http://arxiv.org/abs/2401.07379v1
- Date: Sun, 14 Jan 2024 21:43:10 GMT
- Title: Inference of dynamical gene regulatory networks from single-cell data
with physics informed neural networks
- Authors: Maria Mircea, Diego Garlaschelli, Stefan Semrau
- Abstract summary: We show how physics-informed neural networks (PINNs) can be used to infer the parameters of predictive, dynamical GRNs.
Specifically, we study GRNs that exhibit bifurcation behavior and can therefore model cell differentiation.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: One of the main goals of developmental biology is to reveal the gene
regulatory networks (GRNs) underlying the robust differentiation of multipotent
progenitors into precisely specified cell types. Most existing methods to infer
GRNs from experimental data have limited predictive power as the inferred GRNs
merely reflect gene expression similarity or correlation. Here, we demonstrate
how physics-informed neural networks (PINNs) can be used to infer the
parameters of predictive, dynamical GRNs that provide mechanistic understanding
of biological processes. Specifically, we study GRNs that exhibit bifurcation
behavior and can therefore model cell differentiation. We show that PINNs
outperform regular feed-forward neural networks on the parameter inference task
and analyze two relevant experimental scenarios: (1) a system with cell
communication for which gene expression trajectories are available, and (2)
snapshot measurements of a cell population in which cell communication is
absent. Our analysis will inform the design of future experiments to be
analyzed with PINNs and provides a starting point to explore this powerful
class of neural network models further.
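The abstract does not specify the exact GRN model, so below is a minimal sketch of the general setup it describes, assuming a standard two-gene toggle switch (mutual repression with Hill kinetics) as the bifurcating GRN. A neural network maps time to gene expression, and the loss combines a data-fit term on observed trajectories with the ODE residual; the GRN parameters are optimized jointly with the network weights and are thereby inferred. All names and parameters (TrajectoryNet, grn_rhs, alpha, k, n, t_obs, x_obs, t_col) are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch (not the paper's implementation): PINN-based inference of the
# parameters of a bifurcating two-gene toggle-switch GRN from expression data.
import torch
import torch.nn as nn

class TrajectoryNet(nn.Module):
    """Maps time t (N, 1) to the expression of two genes x(t) (N, 2)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 2), nn.Softplus(),  # expression levels stay non-negative
        )

    def forward(self, t):
        return self.net(t)

def grn_rhs(x, log_alpha, log_k, n=4.0):
    """Toggle switch with mutual repression: dx_i/dt = alpha / (1 + x_j^n) - k * x_i.
    The Hill coefficient n is kept fixed for simplicity; alpha and k are inferred."""
    alpha, k = log_alpha.exp(), log_k.exp()  # log-parameterization keeps them positive
    x1, x2 = x[:, :1], x[:, 1:]
    dx1 = alpha / (1.0 + x2 ** n) - k * x1
    dx2 = alpha / (1.0 + x1 ** n) - k * x2
    return torch.cat([dx1, dx2], dim=1)

def pinn_loss(model, log_alpha, log_k, t_obs, x_obs, t_col):
    # Data term: fit the observed gene-expression trajectories.
    data_loss = ((model(t_obs) - x_obs) ** 2).mean()
    # Physics term: the network's time derivative must satisfy the GRN ODE
    # at the collocation points t_col.
    t_col = t_col.clone().requires_grad_(True)
    x = model(t_col)
    dxdt = torch.stack([
        torch.autograd.grad(x[:, i].sum(), t_col, create_graph=True)[0].squeeze(-1)
        for i in range(2)
    ], dim=1)
    ode_loss = ((dxdt - grn_rhs(x, log_alpha, log_k)) ** 2).mean()
    return data_loss + ode_loss

# The GRN parameters are trained jointly with the network weights, so the
# fitted alpha and k constitute the inferred dynamical GRN.
model = TrajectoryNet()
log_alpha = torch.tensor(0.0, requires_grad=True)
log_k = torch.tensor(0.0, requires_grad=True)
optimizer = torch.optim.Adam(list(model.parameters()) + [log_alpha, log_k], lr=1e-3)
t_col = torch.linspace(0.0, 10.0, 200).unsqueeze(-1)  # collocation times
# With measured trajectories (t_obs, x_obs) in hand:
# for step in range(10_000):
#     optimizer.zero_grad()
#     loss = pinn_loss(model, log_alpha, log_k, t_obs, x_obs, t_col)
#     loss.backward()
#     optimizer.step()
```

This sketch covers only the trajectory-based scenario; for the snapshot scenario mentioned in the abstract, the data term would have to compare population-level measurements rather than per-cell trajectories.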
Related papers
- Analysis of Gene Regulatory Networks from Gene Expression Using Graph Neural Networks [0.4369058206183195]
This study explores the use of Graph Neural Networks (GNNs), a powerful approach for modeling graph-structured data like Gene Regulatory Networks (GRNs).
The model's adeptness in accurately predicting regulatory interactions and pinpointing key regulators is attributed to advanced attention mechanisms.
The integration of GNNs in GRN research is set to pioneer developments in personalized medicine, drug discovery, and our grasp of biological systems.
arXiv Detail & Related papers (2024-09-20T17:16:14Z)
- Gene Regulatory Network Inference from Pre-trained Single-Cell Transcriptomics Transformer with Joint Graph Learning [10.44434676119443]
Inferring gene regulatory networks (GRNs) from single-cell RNA sequencing (scRNA-seq) data is a complex challenge.
In this study, we tackle this challenge by leveraging the single-cell BERT-based pre-trained transformer model (scBERT).
We introduce a novel joint graph learning approach that combines the rich contextual representations learned by single-cell language models with the structured knowledge encoded in GRNs.
arXiv Detail & Related papers (2024-07-25T16:42:08Z)
- Stability Analysis of Non-Linear Classifiers using Gene Regulatory Neural Network for Biological AI [2.0755366440393743]
We develop a mathematical model of gene-perceptron using a dual-layered transcription-translation chemical reaction model.
We perform stability analysis for each gene-perceptron within the fully-connected GRNN sub-network to determine temporal as well as stable concentration outputs.
arXiv Detail & Related papers (2023-09-14T21:37:38Z)
- Inferring Gene Regulatory Neural Networks for Bacterial Decision Making in Biofilms [4.459301404374565]
Bacterial cells are sensitive to a range of external signals, which they use to learn about their environment.
An inherited Gene Regulatory Neural Network (GRNN) behavior enables cellular decision-making.
GRNNs can perform computational tasks for bio-hybrid computing systems.
arXiv Detail & Related papers (2023-01-10T22:07:33Z)
- Granger causal inference on DAGs identifies genomic loci regulating transcription [77.58911272503771]
GrID-Net is a framework based on graph neural networks with lagged message passing for Granger causal inference on DAG-structured systems.
Our application is the analysis of single-cell multimodal data to identify genomic loci that mediate the regulation of specific genes.
arXiv Detail & Related papers (2022-10-18T21:15:10Z)
- Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [55.12108376616355]
Studies of the neural tangent kernel (NTK) have focused on typical neural network architectures but remain incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which expands the application scope of NTK.
arXiv Detail & Related papers (2022-09-16T06:36:06Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Prediction of gene expression time series and structural analysis of gene regulatory networks using recurrent neural networks [0.0]
This work provides a way to understand and exploit the attention mechanism of RNNs.
It paves the way for RNN-based methods for time-series prediction and inference of GRNs from gene expression data.
arXiv Detail & Related papers (2021-09-13T10:30:21Z)
- How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks [80.55378250013496]
We study how neural networks trained by gradient descent extrapolate what they learn outside the support of the training distribution.
Graph Neural Networks (GNNs) have shown some success in more complex tasks.
arXiv Detail & Related papers (2020-09-24T17:48:59Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)