Neural Design for Genetic Perturbation Experiments
- URL: http://arxiv.org/abs/2207.12805v1
- Date: Tue, 26 Jul 2022 10:59:52 GMT
- Title: Neural Design for Genetic Perturbation Experiments
- Authors: Aldo Pacchiano, Drausin Wulsin, Robert A. Barton, Luis Voloch
- Abstract summary: We introduce the Optimistic Arm Elimination principle to find an almost optimal arm under different functional relationships between the queries (arms) and the outputs (rewards)
OAE also outperforms the benchmark algorithms in 3 of 4 datasets in the GeneDisco experimental planning challenge.
- Score: 16.95249173404529
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The problem of how to genetically modify cells in order to maximize a certain
cellular phenotype has taken center stage in drug development over the last few
years (with, for example, genetically edited CAR-T, CAR-NK, and CAR-NKT cells
entering cancer clinical trials). Exhausting the search space for all possible
genetic edits (perturbations) or combinations thereof is infeasible due to cost
and experimental limitations. This work provides a theoretically sound
framework for iteratively exploring the space of perturbations in pooled
batches in order to maximize a target phenotype under an experimental budget.
Inspired by this application domain, we study the problem of batch query bandit
optimization and introduce the Optimistic Arm Elimination ($\mathrm{OAE}$)
principle designed to find an almost optimal arm under different functional
relationships between the queries (arms) and the outputs (rewards). We analyze
the convergence properties of $\mathrm{OAE}$ by relating it to the Eluder
dimension of the algorithm's function class and validate that $\mathrm{OAE}$
outperforms other strategies in finding optimal actions in experiments on
simulated problems, public datasets well-studied in bandit contexts, and in
genetic perturbation datasets when the regression model is a deep neural
network. OAE also outperforms the benchmark algorithms in 3 of 4 datasets in
the GeneDisco experimental planning challenge.
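The batch-elimination loop described in the abstract can be sketched as follows. This is a minimal illustrative sketch using simple per-arm confidence intervals in place of the paper's general function classes and Eluder-dimension machinery; the function names and the `pull` interface are assumptions, not the authors' code.

```python
import math

def oae(arms, pull, batch_size=4, rounds=10, delta=0.05):
    """Optimistic Arm Elimination (sketch): keep a survivor set, query the
    most optimistic survivors in pooled batches, and drop any arm whose
    upper confidence bound falls below the best lower bound."""
    stats = {a: [0.0, 0] for a in arms}          # arm -> [reward sum, pull count]
    survivors = list(arms)

    def ucb(a):                                   # optimistic estimate
        s, n = stats[a]
        return float("inf") if n == 0 else s / n + math.sqrt(math.log(2 / delta) / (2 * n))

    def lcb(a):                                   # pessimistic estimate
        s, n = stats[a]
        return -float("inf") if n == 0 else s / n - math.sqrt(math.log(2 / delta) / (2 * n))

    for _ in range(rounds):
        if len(survivors) <= 1:
            break
        batch = sorted(survivors, key=ucb, reverse=True)[:batch_size]
        for a in batch:                           # one pooled batch query
            r = pull(a)
            stats[a][0] += r
            stats[a][1] += 1
        best_lcb = max(lcb(a) for a in survivors)
        survivors = [a for a in survivors if ucb(a) >= best_lcb]
    # Return the surviving arm with the best empirical mean.
    return max(survivors, key=lambda a: stats[a][0] / max(stats[a][1], 1))
```

Under a deterministic reward model this converges to the best arm quickly; the batch structure mirrors the pooled-perturbation setting, where several arms are queried per experimental round.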
Related papers
- Semantically Rich Local Dataset Generation for Explainable AI in Genomics [0.716879432974126]
Black box deep learning models trained on genomic sequences excel at predicting the outcomes of different gene regulatory mechanisms.
We propose using Genetic Programming to generate datasets by evolving perturbations in sequences that contribute to their semantic diversity.
arXiv Detail & Related papers (2024-07-03T10:31:30Z)
- Predicting loss-of-function impact of genetic mutations: a machine learning approach [0.0]
This paper aims to train machine learning models on the attributes of a genetic mutation to predict LoFtool scores.
These attributes included, but were not limited to, the position of a mutation on a chromosome, changes in amino acids, and changes in codons caused by the mutation.
Models were evaluated using five-fold cross-validated averages of r-squared, mean squared error, root mean squared error, mean absolute error, and explained variance.
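The five-fold evaluation protocol above can be sketched in plain Python. The metric formulas are standard; the helper names and the `fit` interface are illustrative assumptions, not the paper's code.

```python
import statistics

def regression_metrics(y_true, y_pred):
    """Compute the five metrics named above for one held-out fold."""
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errors) / n
    mae = sum(abs(e) for e in errors) / n
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1 - sum(e * e for e in errors) / ss_tot
    ev = 1 - statistics.pvariance(errors) / statistics.pvariance(y_true)
    return {"r2": r2, "mse": mse, "rmse": mse ** 0.5, "mae": mae, "ev": ev}

def five_fold_averages(X, y, fit, k=5):
    """Cross-validated averages: train on k-1 folds, score on the
    held-out fold, then average each metric over the k folds."""
    n = len(X)
    folds = [list(range(i, n, k)) for i in range(k)]
    totals = {}
    for held_out in folds:
        train = [i for i in range(n) if i not in held_out]
        model = fit([X[i] for i in train], [y[i] for i in train])
        metrics = regression_metrics([y[i] for i in held_out],
                                     [model(X[i]) for i in held_out])
        for name, value in metrics.items():
            totals[name] = totals.get(name, 0.0) + value / k
    return totals
```

A perfect predictor yields r-squared and explained variance of 1 and zero error on every fold, which is a useful sanity check for the harness.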
arXiv Detail & Related papers (2024-01-26T19:27:38Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- Adaptive LASSO estimation for functional hidden dynamic geostatistical model [69.10717733870575]
We propose a novel model selection algorithm based on a penalized maximum likelihood estimator (PMLE) for functional hidden dynamic geostatistical models (f-HD).
The algorithm is based on iterative optimisation and uses an adaptive least absolute shrinkage and selection operator (GMSOLAS) penalty function, wherein the weights are obtained from the unpenalised f-HD maximum-likelihood estimators.
arXiv Detail & Related papers (2022-08-10T19:17:45Z)
- Inference of Regulatory Networks Through Temporally Sparse Data [5.495223636885796]
A major goal in genomics is to properly capture the complex dynamical behaviors of gene regulatory networks (GRNs)
This paper develops a scalable and efficient topology inference for GRNs using Bayesian optimization and kernel-based methods.
arXiv Detail & Related papers (2022-07-21T22:48:12Z)
- Result Diversification by Multi-objective Evolutionary Algorithms with Theoretical Guarantees [94.72461292387146]
We propose to reformulate the result diversification problem as a bi-objective search problem, and solve it by a multi-objective evolutionary algorithm (EA)
We theoretically prove that the GSEMO can achieve the optimal polynomial-time approximation ratio of $1/2$.
When the objective function changes dynamically, the GSEMO can maintain this approximation ratio in polynomial running time, addressing the open question proposed by Borodin et al.
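As a rough illustration of the bi-objective reformulation, the following is a minimal GSEMO sketch for subset selection, assuming a generic monotone objective `f`; it is not the authors' implementation.

```python
import random

def gsemo(n, f, iters=3000, seed=1):
    """GSEMO sketch: treat subset selection as the bi-objective problem
    (maximize f(x), minimize |x|) and evolve a Pareto front of bit
    strings with standard bit-flip mutation."""
    rng = random.Random(seed)

    def objectives(x):
        return (f(x), -sum(x))                  # both components maximized

    empty = (0,) * n
    front = {empty: objectives(empty)}          # genotype -> objective vector
    for _ in range(iters):
        parent = rng.choice(list(front))
        # Flip each bit independently with probability 1/n.
        child = tuple(b ^ (rng.random() < 1.0 / n) for b in parent)
        c = objectives(child)
        dominated = any(q[0] >= c[0] and q[1] >= c[1] and q != c
                        for q in front.values())
        if not dominated:
            # Drop members the child weakly dominates, then insert it.
            front = {x: q for x, q in front.items()
                     if not (c[0] >= q[0] and c[1] >= q[1])}
            front[child] = c
    return front
```

For a diversification-style coverage objective, the returned front contains, for each subset size, the best coverage found at that size.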
arXiv Detail & Related papers (2021-10-18T14:00:22Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded by the 'complexity' of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- Brain Image Synthesis with Unsupervised Multivariate Canonical CSC$\ell_4$Net [122.8907826672382]
We propose to learn dedicated features that cross both inter- and intra-modal variations using a novel CSC$\ell_4$Net.
arXiv Detail & Related papers (2021-03-22T05:19:40Z)
- GeneCAI: Genetic Evolution for Acquiring Compact AI [36.04715576228068]
Deep Neural Networks (DNNs) are evolving towards more complex architectures to achieve higher inference accuracy.
Model compression techniques can be leveraged to efficiently deploy such compute-intensive architectures on resource-limited mobile devices.
This paper introduces GeneCAI, a novel optimization method that automatically learns how to tune per-layer compression hyperparameters.
arXiv Detail & Related papers (2020-04-08T20:56:37Z)
- Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning, including deep, convolutional and recurrent neural networks, reinforcement learning, normalizing flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.