Inferring Signaling Pathways with Probabilistic Programming
- URL: http://arxiv.org/abs/2005.14062v2
- Date: Fri, 17 Jul 2020 22:15:36 GMT
- Title: Inferring Signaling Pathways with Probabilistic Programming
- Authors: David Merrell, Anthony Gitter
- Abstract summary: We implement our method, named Sparse Signaling Pathway Sampling, in Julia using the Gen probabilistic programming language.
We evaluate our algorithm on simulated data and the HPN-DREAM pathway reconstruction challenge.
Our results demonstrate the vast potential for probabilistic programming, and Gen specifically, for biological network inference.
- Score: 1.8275108630751837
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Cells regulate themselves via dizzyingly complex biochemical processes called
signaling pathways. These are usually depicted as a network, where nodes
represent proteins and edges indicate their influence on each other. In order
to understand diseases and therapies at the cellular level, it is crucial to
have an accurate understanding of the signaling pathways at work. Since
signaling pathways can be modified by disease, the ability to infer signaling
pathways from condition- or patient-specific data is highly valuable. A variety
of techniques exist for inferring signaling pathways. We build on past works
that formulate signaling pathway inference as a Dynamic Bayesian Network
structure estimation problem on phosphoproteomic time course data. We take a
Bayesian approach, using Markov Chain Monte Carlo to estimate a posterior
distribution over possible Dynamic Bayesian Network structures. Our primary
contributions are (i) a novel proposal distribution that efficiently samples
sparse graphs and (ii) the relaxation of common restrictive modeling
assumptions. We implement our method, named Sparse Signaling Pathway Sampling,
in Julia using the Gen probabilistic programming language. Probabilistic
programming is a powerful methodology for building statistical models. The
resulting code is modular, extensible, and legible. The Gen language, in
particular, allows us to customize our inference procedure for biological
graphs and ensure efficient sampling. We evaluate our algorithm on simulated
data and the HPN-DREAM pathway reconstruction challenge, comparing our
performance against a variety of baseline methods. Our results demonstrate the
vast potential for probabilistic programming, and Gen specifically, for
biological network inference. Find the full codebase at
https://github.com/gitter-lab/ssps
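To make the sampling idea concrete, here is a minimal, hypothetical Python sketch of Metropolis-Hastings over sparse edge sets. This is not the authors' Julia/Gen implementation: the exponential sparsity penalty `lam`, the single-edge toggle proposal, and the placeholder `log_likelihood` are illustrative assumptions, not the paper's proposal distribution or model.

```python
import math
import random

def log_prior(edges, lam=1.0):
    # Sparsity-inducing prior: each edge pays a fixed log-penalty,
    # so graphs with fewer edges are a priori more probable.
    return -lam * len(edges)

def propose(edges, n):
    # Toggle one (parent, child) pair chosen uniformly at random.
    # The move is symmetric, so the Hastings correction is 1.
    i, j = random.randrange(n), random.randrange(n)
    return frozenset(edges ^ {(i, j)})

def mh_sample(log_likelihood, n, steps=2000, lam=1.0, seed=0):
    # Metropolis-Hastings over edge sets of an n-node time-lagged graph.
    # Self-edges are allowed, as in a Dynamic Bayesian Network where a
    # node's parents live in the previous time slice.
    random.seed(seed)
    edges = frozenset()
    log_p = log_prior(edges, lam) + log_likelihood(edges)
    samples = []
    for _ in range(steps):
        cand = propose(edges, n)
        cand_log_p = log_prior(cand, lam) + log_likelihood(cand)
        if math.log(random.random()) < cand_log_p - log_p:
            edges, log_p = cand, cand_log_p
        samples.append(edges)
    return samples
```

The posterior probability of an individual edge can then be estimated as the fraction of samples that include it.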
Related papers
- Learning to solve Bayesian inverse problems: An amortized variational inference approach using Gaussian and Flow guides [0.0]
We develop a methodology that enables real-time inference by learning the Bayesian inverse map, i.e., the map from data to posteriors.
Our approach provides the posterior distribution for a given observation just at the cost of a forward pass of the neural network.
arXiv Detail & Related papers (2023-05-31T16:25:07Z)
- DynGFN: Towards Bayesian Inference of Gene Regulatory Networks with GFlowNets [81.75973217676986]
Gene regulatory networks (GRNs) describe interactions between genes and their products that control gene expression and cellular function.
Existing methods focus either on (1) identifying cyclic structure from dynamics or on (2) learning complex Bayesian posteriors over DAGs, but not both.
In this paper we leverage the fact that it is possible to estimate the "velocity" of gene expression with RNA velocity techniques to develop an approach that addresses both challenges.
arXiv Detail & Related papers (2023-02-08T16:36:40Z)
- Inferring probabilistic Boolean networks from steady-state gene data samples [0.6882042556551611]
We present a method for inferring PBNs directly from real gene expression data measurements taken when the system was at a steady state.
The proposed approach does not rely on reconstructing the state evolution of the network.
We demonstrate the method on samples of real gene expression profiling data from a well-known study on metastatic melanoma.
arXiv Detail & Related papers (2022-11-11T00:39:00Z)
- Diffusion Models for Causal Discovery via Topological Ordering [20.875222263955045]
Topological ordering approaches reduce the optimisation space of causal discovery by searching over permutations rather than the graph space.
For additive noise models (ANMs), the Hessian of the data log-likelihood can be used to find leaf nodes in a causal graph, allowing its topological ordering.
We introduce theory for updating the learned Hessian without re-training the neural network, and we show that computing with a subset of samples gives an accurate approximation of the ordering.
arXiv Detail & Related papers (2022-10-12T13:36:29Z)
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
- Probabilistic Numeric Convolutional Neural Networks [80.42120128330411]
Continuous input signals like images and time series that are irregularly sampled or have missing values are challenging for existing deep learning methods.
We propose Probabilistic Numeric Convolutional Neural Networks, which represent features as Gaussian processes (GPs).
We then define a convolutional layer as the evolution of a PDE defined on this GP, followed by a nonlinearity.
In experiments we show that our approach yields a $3\times$ reduction of error from the previous state of the art on the SuperPixel-MNIST dataset and competitive performance on the medical time series dataset PhysioNet 2012.
arXiv Detail & Related papers (2020-10-21T10:08:21Z)
- Activation Relaxation: A Local Dynamical Approximation to Backpropagation in the Brain [62.997667081978825]
Activation Relaxation (AR) is motivated by constructing the backpropagation gradient as the equilibrium point of a dynamical system.
Our algorithm converges rapidly and robustly to the correct backpropagation gradients, requires only a single type of computational unit, and can operate on arbitrary computation graphs.
arXiv Detail & Related papers (2020-09-11T11:56:34Z)
- Coverage probability in wireless networks with determinantal scheduling [1.4502611532302039]
We propose a new class of algorithms for randomly scheduling network transmissions.
We show that, similarly to Aloha, they are also subject to elegant analysis of the coverage probabilities and transmission attempts.
arXiv Detail & Related papers (2020-06-09T04:05:50Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
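The sum-product scheme referenced here can be illustrated in its plain, hand-coded form on a chain-structured factor graph. This is a generic sketch for intuition, not the learned factor graphs of the paper, and the potential tables below are made-up examples.

```python
def chain_marginals(unary, pairwise):
    # Exact sum-product (belief propagation) on a chain factor graph.
    # unary[i]    : list of K potentials for variable i
    # pairwise[i] : K x K potential table between variables i and i+1
    n, K = len(unary), len(unary[0])
    fwd = [[1.0] * K for _ in range(n)]  # left-to-right messages
    bwd = [[1.0] * K for _ in range(n)]  # right-to-left messages
    for i in range(1, n):
        fwd[i] = [sum(pairwise[i - 1][a][b] * fwd[i - 1][a] * unary[i - 1][a]
                      for a in range(K)) for b in range(K)]
    for i in range(n - 2, -1, -1):
        bwd[i] = [sum(pairwise[i][a][b] * bwd[i + 1][b] * unary[i + 1][b]
                      for b in range(K)) for a in range(K)]
    marginals = []
    for i in range(n):
        # Belief at node i is the product of its local potential and
        # both incoming messages, normalized to sum to one.
        belief = [unary[i][k] * fwd[i][k] * bwd[i][k] for k in range(K)]
        z = sum(belief)
        marginals.append([v / z for v in belief])
    return marginals
```

Evidence at one end of the chain propagates to the others: biasing the first variable's unary potential shifts the marginals of downstream variables when the pairwise coupling is strong.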
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Neural Enhanced Belief Propagation on Factor Graphs [85.61562052281688]
A graphical model is a structured representation of locally dependent random variables.
We first extend graph neural networks to factor graphs (FG-GNN)
We then propose a new hybrid model that runs conjointly a FG-GNN with belief propagation.
arXiv Detail & Related papers (2020-03-04T11:03:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.