Inferring probabilistic Boolean networks from steady-state gene data samples
- URL: http://arxiv.org/abs/2211.05935v1
- Date: Fri, 11 Nov 2022 00:39:00 GMT
- Title: Inferring probabilistic Boolean networks from steady-state gene data samples
- Authors: Vytenis Šliogeris, Leandros Maglaras, Sotiris Moschoyiannis
- Abstract summary: We present a method for inferring probabilistic Boolean networks (PBNs) directly from real gene expression measurements taken when the system was at a steady state.
The proposed approach does not rely on reconstructing the state evolution of the network.
We demonstrate the method on samples of real gene expression profiling data from a well-known study on metastatic melanoma.
- Score: 0.6882042556551611
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Probabilistic Boolean Networks (PBNs) have been proposed for estimating the behaviour of dynamical systems, as they combine rule-based modelling with uncertainty principles. Inferring PBNs directly from gene data is challenging, however, especially when the data is costly to collect and/or noisy, as in the case of gene expression profile data. In this paper, we present a reproducible method for inferring PBNs directly from real gene expression measurements taken when the system was at a steady state. The steady-state dynamics of PBNs is of special interest in the analysis of biological machinery. The proposed approach does not rely on reconstructing the state evolution of the network, which is computationally intractable for larger networks. We demonstrate the method on samples of real gene expression profiling data from a well-known study on metastatic melanoma. The pipeline is implemented in Python and we make it publicly available.
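The abstract points to a public Python pipeline but does not reproduce the algorithm here. Purely as an illustration of the general idea, below is a minimal sketch of steady-state PBN inference by per-gene predictor scoring, loosely in the spirit of coefficient-of-determination approaches; the function `infer_pbn`, the majority-vote truth tables, and the normalisation of scores into selection probabilities are assumptions of this sketch, not the authors' released code.

```python
import itertools

import numpy as np


def infer_pbn(samples: np.ndarray, k: int = 2, top: int = 3) -> dict:
    """Sketch: infer a PBN from binarized steady-state samples.

    samples -- (n_samples, n_genes) 0/1 array, one steady-state profile per row
    k       -- number of regulator inputs per candidate Boolean function
    top     -- number of predictors kept per gene

    Returns {gene: [(regulators, truth_table, selection_probability), ...]}.
    """
    n_samples, n_genes = samples.shape
    weights = 2 ** np.arange(k)  # encode a k-bit input pattern as an integer
    pbn = {}
    for target in range(n_genes):
        scored = []
        others = [g for g in range(n_genes) if g != target]
        for regs in itertools.combinations(others, k):
            keys = samples[:, list(regs)] @ weights
            table = np.zeros(2 ** k, dtype=int)
            correct = 0
            for pattern in range(2 ** k):
                mask = keys == pattern
                if mask.any():
                    ones = int(samples[mask, target].sum())
                    # Majority vote: the Boolean rule outputs whichever value
                    # the target gene takes most often under this input pattern.
                    table[pattern] = int(2 * ones >= mask.sum())
                    correct += max(ones, int(mask.sum()) - ones)
            scored.append((correct / n_samples, regs, table))
        # Keep the best-fitting predictors; normalize their scores into
        # the selection probabilities of the PBN.
        best = sorted(scored, key=lambda t: t[0], reverse=True)[:top]
        total = sum(s for s, _, _ in best)
        pbn[target] = [(regs, table, s / total) for s, regs, table in best]
    return pbn


# Toy usage on random stand-in data (real input would be binarized
# steady-state gene expression profiles, e.g. from the melanoma study).
rng = np.random.default_rng(0)
toy = (rng.random((200, 6)) > 0.5).astype(int)
for regs, table, prob in infer_pbn(toy)[0]:
    print(regs, table.tolist(), round(prob, 3))
```

The point mirrored from the abstract is that each steady-state sample is scored directly against candidate rules, treating the samples as draws from the network's stationary behaviour, with no attempt to reconstruct state trajectories.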
Related papers
- Inferring biological processes with intrinsic noise from cross-sectional data [0.8192907805418583]
Inferring dynamical models from data continues to be a significant challenge in computational biology.
We show that probability flow inference (PFI) disentangles force from intrinsic stochasticity while retaining the algorithmic ease of ODE inference.
In practical applications, we show that PFI enables accurate parameter and force estimation in high-dimensional reaction networks, and that it allows inference of cell differentiation dynamics with molecular noise.
arXiv Detail & Related papers (2024-10-10T00:33:25Z)
- Data-Driven Abstractions via Binary-Tree Gaussian Processes for Formal Verification [0.22499166814992438]
Abstraction-based solutions built on Gaussian process (GP) regression have become popular for their ability to learn a representation of the latent system from data with a quantified error.
We show that the binary-tree Gaussian process (BTGP) allows us to construct an interval Markov chain model of the unknown system.
We provide a delocalized error quantification via a unified formula even when the true dynamics do not live in the function space of the BTGP.
arXiv Detail & Related papers (2024-07-15T11:49:44Z)
- PhyloGFN: Phylogenetic inference with generative flow networks [57.104166650526416]
We introduce the framework of generative flow networks (GFlowNets) to tackle two core problems in phylogenetics: parsimony-based and Bayesian phylogenetic inference.
Because GFlowNets are well-suited for sampling complex structures, they are a natural choice for exploring and sampling from the multimodal posterior distribution over tree topologies.
We demonstrate that our amortized posterior sampler, PhyloGFN, produces diverse and high-quality evolutionary hypotheses on real benchmark datasets.
arXiv Detail & Related papers (2023-10-12T23:46:08Z)
- DynGFN: Towards Bayesian Inference of Gene Regulatory Networks with GFlowNets [81.75973217676986]
Gene regulatory networks (GRN) describe interactions between genes and their products that control gene expression and cellular function.
Existing methods focus either on identifying cyclic structure from dynamics or on learning complex Bayesian posteriors over DAGs, but not both.
In this paper we leverage the fact that it is possible to estimate the "velocity" of gene expression with RNA velocity techniques to develop an approach that addresses both challenges.
arXiv Detail & Related papers (2023-02-08T16:36:40Z)
- An unfolding method based on conditional Invertible Neural Networks (cINN) using iterative training [0.0]
Generative networks like invertible neural networks (INNs) enable probabilistic unfolding.
We introduce the iterative conditional INN (IcINN) for unfolding, which adjusts for deviations between simulated training samples and data.
arXiv Detail & Related papers (2022-12-16T19:00:05Z)
- Isoform Function Prediction Using a Deep Neural Network [9.507435239304591]
Studies have shown that more than 95% of human multi-exon genes have undergone alternative splicing.
Alternative splicing plays a significant role in human health and disease.
This project uses all available data and valuable information such as mRNA sequences, expression profiles, and gene graphs.
arXiv Detail & Related papers (2022-08-05T09:31:25Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
However, they are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution that seeks out regions of feature space where the model is unjustifiably overconfident and conditionally raises the entropy of those predictions towards that of the prior distribution of the labels (a toy sketch of this entropy-raising step appears after this list).
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
- Probabilistic Numeric Convolutional Neural Networks [80.42120128330411]
Continuous input signals like images and time series that are irregularly sampled or have missing values are challenging for existing deep learning methods.
We propose Probabilistic Numeric Convolutional Neural Networks, which represent features as Gaussian processes (GPs).
We then define a convolutional layer as the evolution of a PDE defined on this GP, followed by a nonlinearity.
In experiments we show that our approach yields a $3\times$ reduction of error from the previous state of the art on the SuperPixel-MNIST dataset and competitive performance on the medical time series dataset PhysioNet2012.
arXiv Detail & Related papers (2020-10-21T10:08:21Z)
- Bootstrapping Neural Processes [114.97111530885093]
Neural Processes (NPs) implicitly define a broad class of processes with neural networks.
NPs still rely on the assumption that uncertainty in the underlying process is modeled by a single latent variable.
We propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap.
arXiv Detail & Related papers (2020-08-07T02:23:34Z)
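As a side note on the calibration entry above (Improving Uncertainty Calibration via Prior Augmented Data), the entropy-raising step its summary describes can be pictured with a toy sketch: predictions flagged as overconfident are interpolated toward the label prior, which raises their entropy. The flagging criterion, the function name `raise_entropy`, and the interpolation weight are assumptions of this illustration, not the paper's actual procedure.

```python
import numpy as np


def raise_entropy(probs: np.ndarray, prior: np.ndarray,
                  flagged: np.ndarray, lam: float = 0.5) -> np.ndarray:
    """Move flagged predictions toward the label prior, raising their entropy.

    probs   -- (n, c) predicted class probabilities
    prior   -- (c,) prior distribution of the labels
    flagged -- (n,) boolean mask of predictions judged unjustifiably confident
    lam     -- interpolation strength (illustrative choice)
    """
    out = probs.copy()
    out[flagged] = (1 - lam) * probs[flagged] + lam * prior
    return out  # rows still sum to 1, since both inputs are distributions


probs = np.array([[0.98, 0.01, 0.01],
                  [0.40, 0.35, 0.25]])
prior = np.full(3, 1 / 3)
flagged = probs.max(axis=1) > 0.95  # toy criterion for "overconfident"
print(raise_entropy(probs, prior, flagged))
```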