Interpretable Neural ODEs for Gene Regulatory Network Discovery under Perturbations
- URL: http://arxiv.org/abs/2501.02409v2
- Date: Sat, 01 Feb 2025 05:30:20 GMT
- Title: Interpretable Neural ODEs for Gene Regulatory Network Discovery under Perturbations
- Authors: Zaikang Lin, Sei Chang, Aaron Zweig, Minseo Kang, Elham Azizi, David A. Knowles
- Abstract summary: We propose PerturbODE, a novel framework that incorporates biologically informative neural ordinary differential equations (neural ODEs) to model cell state trajectories under perturbations.
We demonstrate PerturbODE's efficacy in trajectory prediction and GRN inference across simulated and real over-expression datasets.
- Score: 4.34315395377214
- License:
- Abstract: Modern high-throughput biological datasets with thousands of perturbations provide the opportunity for large-scale discovery of causal graphs that represent the regulatory interactions between genes. Differentiable causal graphical models have been proposed to infer a gene regulatory network (GRN) from large-scale interventional datasets, capturing the causal gene regulatory relationships from genetic perturbations. However, existing models are limited in their expressivity and scalability while failing to address the dynamic nature of biological processes such as cellular differentiation. We propose PerturbODE, a novel framework that incorporates biologically informative neural ordinary differential equations (neural ODEs) to model cell state trajectories under perturbations and derive the causal GRN from the neural ODE's parameters. We demonstrate PerturbODE's efficacy in trajectory prediction and GRN inference across simulated and real over-expression datasets.
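As a rough illustration of the modeling idea in the abstract (not PerturbODE's exact parameterization), the sketch below treats gene expression as a state vector whose velocity is a saturating function of a gene-gene weight matrix W plus a perturbation input, with the candidate GRN read off the learned W; the matrix, decay rates, and perturbation encoding here are invented for the example.

```python
# Minimal sketch of a neural-ODE-style GRN model (illustrative only; not
# PerturbODE's exact parameterization). Gene expression x evolves as
#     dx/dt = sigmoid(W @ x + b + u) - gamma * x
# where W is a gene-gene interaction matrix read out as the candidate GRN,
# u encodes an over-expression perturbation, and gamma models decay.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
n_genes = 10

# Stand-in "learned" parameters; in practice they would be fit by
# backpropagating a trajectory loss through the ODE solver.
W = rng.normal(scale=0.5, size=(n_genes, n_genes))   # candidate GRN adjacency
b = rng.normal(scale=0.1, size=n_genes)
gamma = np.full(n_genes, 0.8)                        # per-gene decay rates

u = np.zeros(n_genes)
u[3] = 2.0  # assumed encoding of an over-expression perturbation of gene 3

def velocity(t, x):
    """ODE right-hand side: saturating production minus degradation."""
    return 1.0 / (1.0 + np.exp(-(W @ x + b + u))) - gamma * x

x0 = rng.uniform(0.0, 1.0, size=n_genes)             # initial cell state
sol = solve_ivp(velocity, (0.0, 10.0), x0, t_eval=np.linspace(0.0, 10.0, 50))

# Read the GRN off the parameters: large |W[i, j]| suggests gene j regulates gene i.
edges = [(int(j), int(i)) for i, j in np.argwhere(np.abs(W) > 1.0)]
print("trajectory shape:", sol.y.shape)
print("strong candidate edges (regulator -> target):", edges)
```

A fitted model of this kind would usually add sparsity or stability regularization on W so that only well-supported edges survive the final threshold.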
Related papers
- GENERator: A Long-Context Generative Genomic Foundation Model [66.46537421135996]
We present a generative genomic foundation model featuring a context length of 98k base pairs (bp) and 1.2B parameters.
The model adheres to the central dogma of molecular biology, accurately generating protein-coding sequences.
It also shows significant promise in sequence optimization, particularly through the prompt-responsive generation of promoter sequences.
arXiv Detail & Related papers (2025-02-11T05:39:49Z)
- Diffusion-Based Generation of Neural Activity from Disentangled Latent Codes [1.9544534628180867]
We propose a new approach to neural data analysis that leverages advances in conditional generative modeling.
We apply our model, called Generating Neural Observations Conditioned on Codes with High Information (GNOCCHI), to time-series neural data.
In comparison to a VAE-based sequential autoencoder, GNOCCHI learns higher-quality latent spaces that are more clearly structured and more disentangled with respect to key behavioral variables.
arXiv Detail & Related papers (2024-07-30T21:07:09Z)
- Inference of dynamical gene regulatory networks from single-cell data with physics informed neural networks [0.0]
We show how physics-informed neural networks (PINNs) can be used to infer the parameters of predictive, dynamical GRNs.
Specifically, we study GRNs that exhibit bifurcation behavior and can therefore model cell differentiation.
arXiv Detail & Related papers (2024-01-14T21:43:10Z)
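To make the physics-informed idea in the entry above concrete, the sketch below scores a candidate trajectory with a data-misfit term plus an ODE-residual term for a toy two-gene system; the Hill-kinetics dynamics, parameter values, and finite-difference residual are illustrative assumptions, not the cited paper's implementation.

```python
# Toy physics-informed loss for a two-gene system (illustrative; not the cited
# paper's implementation). A candidate trajectory is scored on (i) fit to a few
# observations and (ii) the ODE residual dx/dt - f(x; k), so kinetic parameters
# k can be inferred jointly with the trajectory by an outer optimizer.
import numpy as np

def f(x, k):
    """Assumed dynamics: mutual repression with Hill kinetics."""
    a, n, d = k
    x1, x2 = x
    return np.array([a / (1.0 + x2**n) - d * x1,
                     a / (1.0 + x1**n) - d * x2])

def pinn_loss(traj, t, obs_idx, obs, k):
    """Data misfit plus physics residual; finite differences stand in for the
    automatic differentiation a real PINN would use."""
    dt = t[1] - t[0]
    dxdt = np.gradient(traj, dt, axis=0)
    physics = np.mean((dxdt - np.array([f(x, k) for x in traj])) ** 2)
    fit = np.mean((traj[obs_idx] - obs) ** 2)
    return fit + physics

# Toy usage: the candidate trajectory and parameters would come from training.
t = np.linspace(0.0, 5.0, 101)
k = (2.0, 2.0, 1.0)                                    # (a, n, d), assumed
traj = np.column_stack([1.0 + 0.1 * t, 1.0 - 0.1 * t]) # crude candidate x(t)
obs_idx = np.array([0, 50, 100])
obs = traj[obs_idx] + 0.05                             # pretend observations
print("physics-informed loss:", round(pinn_loss(traj, t, obs_idx, obs, k), 3))
```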
- Causal machine learning for single-cell genomics [94.28105176231739]
We discuss the application of machine learning techniques to single-cell genomics and their challenges.
We first present the model that underlies most current causal approaches to single-cell biology.
We then identify open problems in the application of causal approaches to single-cell data.
arXiv Detail & Related papers (2023-10-23T13:35:24Z)
- Causal Inference in Gene Regulatory Networks with GFlowNet: Towards Scalability in Large Systems [87.45270862120866]
We introduce Swift-DynGFN as a novel framework that enhances causal structure learning in GRNs.
Specifically, Swift-DynGFN exploits gene-wise independence to boost parallelization and to lower computational cost.
arXiv Detail & Related papers (2023-10-05T14:59:19Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match a cortical neuron's input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- DiscoGen: Learning to Discover Gene Regulatory Networks [30.83574314774383]
Accurately inferring Gene Regulatory Networks (GRNs) is a critical and challenging task in biology.
Recent advances in neural network-based methods have significantly improved causal discovery.
Applying state-of-the-art causal discovery methods in biology poses challenges, such as noisy data and a large number of samples.
We introduce DiscoGen, a neural network-based GRN discovery method that can denoise gene expression measurements and handle interventional data.
arXiv Detail & Related papers (2023-04-12T13:02:49Z)
- Granger causal inference on DAGs identifies genomic loci regulating transcription [77.58911272503771]
GrID-Net is a framework based on graph neural networks with lagged message passing for Granger causal inference on DAG-structured systems.
Our application is the analysis of single-cell multimodal data to identify genomic loci that mediate the regulation of specific genes.
arXiv Detail & Related papers (2022-10-18T21:15:10Z)
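The entry above builds on Granger causality; the sketch below shows only the classical two-series, lagged-regression form of the test to convey the underlying idea, not GrID-Net's graph-neural-network generalization. The lag order and the synthetic series are assumptions.

```python
# Classical two-series Granger test (sketch of the idea GrID-Net generalizes to
# DAG-structured data with graph neural networks; not the paper's method).
# X "Granger-causes" Y if lagged X improves prediction of Y beyond Y's own lags.
import numpy as np

def lagged(v, lag, T):
    """Columns v[t-1], ..., v[t-lag] for t = lag, ..., T-1."""
    return np.column_stack([v[lag - k: T - k] for k in range(1, lag + 1)])

def granger_f_stat(y, x, lag=2):
    """F-statistic comparing Y ~ lags(Y) against Y ~ lags(Y) + lags(X)."""
    T = len(y)
    target = y[lag:]
    Zr = np.column_stack([np.ones(T - lag), lagged(y, lag, T)])  # restricted
    Zf = np.column_stack([Zr, lagged(x, lag, T)])                # full
    rss = lambda Z: np.sum((target - Z @ np.linalg.lstsq(Z, target, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(Zr), rss(Zf)
    df1, df2 = lag, T - lag - Zf.shape[1]
    return ((rss_r - rss_f) / df1) / (rss_f / df2)

# Synthetic example: y is driven by lagged x, but not the other way around.
rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
print("F(x -> y):", round(granger_f_stat(y, x), 1))
print("F(y -> x):", round(granger_f_stat(x, y), 1))
```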
- Neural network facilitated ab initio derivation of linear formula: A case study on formulating the relationship between DNA motifs and gene expression [8.794181445664243]
We propose a framework for ab initio derivation of sequence motifs and a linear formula, using a new approach based on an interpretable neural network model.
We showed that this linear model could predict gene expression levels using promoter sequences with a performance comparable to deep neural network models.
arXiv Detail & Related papers (2022-08-19T22:29:30Z)
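The end product described in the entry above is a linear formula over sequence motifs; the sketch below fits such a formula to motif counts extracted from synthetic promoter sequences. The motifs, sequences, and coefficients are made up, and the interpretable-network step that derives the motifs in the cited work is omitted.

```python
# Sketch of the final linear readout described above: a linear formula mapping
# promoter motif counts to expression. Motifs and sequences are synthetic; the
# interpretable neural network that derives the motifs in the cited work is
# not shown here.
import numpy as np

motifs = ["TATA", "GCGC", "CAAT"]            # assumed example motifs

def motif_counts(promoter, motifs):
    """Feature vector: number of (possibly overlapping) motif occurrences."""
    return np.array([sum(promoter[i:i + len(m)] == m
                         for i in range(len(promoter) - len(m) + 1))
                     for m in motifs], dtype=float)

rng = np.random.default_rng(0)
bases = list("ACGT")
promoters = ["".join(rng.choice(bases, size=200)) for _ in range(500)]
X = np.stack([motif_counts(p, motifs) for p in promoters])

true_w = np.array([1.5, -0.7, 0.4])          # synthetic ground-truth weights
y = X @ true_w + 0.1 * rng.normal(size=len(X))

# Least-squares fit recovers the "linear formula": expression ~ X w + intercept.
A = np.column_stack([X, np.ones(len(X))])
w_hat = np.linalg.lstsq(A, y, rcond=None)[0]
print("learned coefficient per motif:", dict(zip(motifs, np.round(w_hat[:3], 2))))
```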
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Gene Regulatory Network Inference with Latent Force Models [1.2691047660244335]
Delays in protein synthesis cause a confounding effect when constructing Gene Regulatory Networks (GRNs) from RNA-sequencing time-series data.
We present a model which incorporates translation delays by combining mechanistic equations and Bayesian approaches to fit to experimental data.
arXiv Detail & Related papers (2020-10-06T09:03:34Z)
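For intuition about the translation delay mentioned in the entry above, the sketch below forward-simulates a toy mRNA/protein pair in which protein production depends on mRNA abundance tau time units earlier; the rates and delay are invented, and the cited work infers such parameters with a Bayesian latent force model rather than simulating them.

```python
# Toy forward simulation of a translation delay: protein production at time t
# depends on mRNA abundance at time t - tau. All rates and the delay are
# invented for illustration; the cited work infers such parameters with a
# Bayesian latent force model rather than fixing and simulating them.
import numpy as np

dt, T = 0.01, 20.0
steps = int(T / dt)
tau = 2.0                       # translation delay (time units), assumed
lag = int(tau / dt)

s_m, d_m = 1.0, 0.4             # mRNA synthesis / degradation rates, assumed
s_p, d_p = 0.8, 0.2             # translation / protein degradation rates, assumed

m = np.zeros(steps)             # mRNA abundance over time
p = np.zeros(steps)             # protein abundance over time
for t in range(steps - 1):
    m_delayed = m[t - lag] if t >= lag else 0.0   # history before t = 0 is zero
    m[t + 1] = m[t] + dt * (s_m - d_m * m[t])
    p[t + 1] = p[t] + dt * (s_p * m_delayed - d_p * p[t])

print("at t = 20: mRNA ~", round(m[-1], 2), ", protein ~", round(p[-1], 2))
```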