Interpretable neural architecture search and transfer learning for
understanding CRISPR/Cas9 off-target enzymatic reactions
- URL: http://arxiv.org/abs/2305.11917v2
- Date: Fri, 29 Sep 2023 16:36:45 GMT
- Title: Interpretable neural architecture search and transfer learning for
understanding CRISPR/Cas9 off-target enzymatic reactions
- Authors: Zijun Zhang, Adam R. Lamson, Michael Shelley, Olga Troyanskaya
- Abstract summary: Elektrum is a deep learning framework for determining the kinetics of biochemical systems.
It employs a novel transfer learning step, where the KINNs are inserted as intermediary layers into deeper convolutional neural networks.
We apply Elektrum to predict CRISPR-Cas9 off-target editing probabilities and demonstrate that it achieves state-of-the-art performance.
- Score: 3.5994174958472307
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Finely-tuned enzymatic pathways control cellular processes, and their
dysregulation can lead to disease. Creating predictive and interpretable models
for these pathways is challenging because of the complexity of the pathways and
of the cellular and genomic contexts. Here we introduce Elektrum, a deep
learning framework which addresses these challenges with data-driven and
biophysically interpretable models for determining the kinetics of biochemical
systems. First, it uses in vitro kinetic assays to rapidly hypothesize an
ensemble of high-quality Kinetically Interpretable Neural Networks (KINNs) that
predict reaction rates. It then employs a novel transfer learning step, where
the KINNs are inserted as intermediary layers into deeper convolutional neural
networks, fine-tuning the predictions for reaction-dependent in vivo outcomes.
Elektrum makes effective use of the limited but clean in vitro data and the
complex yet plentiful in vivo data that capture cellular context. We apply
Elektrum to predict CRISPR-Cas9 off-target editing probabilities and
demonstrate that Elektrum achieves state-of-the-art performance, regularizes
neural network architectures, and maintains physical interpretability.
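The transfer-learning step described above, inserting a pretrained KINN as an intermediary layer of a deeper network, can be sketched roughly as follows. This is a minimal illustrative sketch, not Elektrum's actual implementation: the `Kinn` parameterization, the layer sizes, and all class and variable names are assumptions made for the example.

```python
import numpy as np

class Kinn:
    """Toy kinetically interpretable layer: maps features to a reaction
    rate via interpretable rate constants (illustrative, not Elektrum's KINN)."""
    def __init__(self, n_features, rng):
        self.log_k = rng.normal(size=n_features)  # log rate constants

    def forward(self, x):
        # rate = sum_i k_i * x_i, a linear mass-action-like surrogate
        return x @ np.exp(self.log_k)

class TransferModel:
    """Wrapper mimicking the transfer step: trainable feature extractor
    -> frozen, in-vitro-trained KINN -> trainable in vivo output head."""
    def __init__(self, n_in, n_features, kinn, rng):
        self.W = rng.normal(size=(n_in, n_features)) * 0.1  # trainable features
        self.kinn = kinn            # pretrained on in vitro data, kept frozen
        self.a, self.b = 1.0, 0.0   # trainable calibration for in vivo outcome

    def forward(self, x):
        feats = np.maximum(x @ self.W, 0.0)  # ReLU feature extraction
        rate = self.kinn.forward(feats)      # interpretable reaction rate
        # squash the rate into an editing probability
        return 1.0 / (1.0 + np.exp(-(self.a * rate + self.b)))

rng = np.random.default_rng(0)
kinn = Kinn(n_features=8, rng=rng)
model = TransferModel(n_in=20, n_features=8, kinn=kinn, rng=rng)
x = rng.normal(size=(5, 20))   # 5 encoded guide/target pairs (dummy data)
probs = model.forward(x)       # one off-target probability per input
```

In the actual framework, fine-tuning would update the feature extractor and output head on in vivo data while the KINN retains its physically interpretable kinetic parameters; this sketch only shows the forward pass of such a composition.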
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Learning with Chemical versus Electrical Synapses -- Does it Make a Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state of the art in AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate their performance under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z) - Astrocytes as a mechanism for meta-plasticity and contextually-guided network function [2.66269503676104]
Astrocytes are a ubiquitous and enigmatic type of non-neuronal cell.
Astrocytes may play a more direct and active role in brain function and neural computation.
arXiv Detail & Related papers (2023-11-06T20:31:01Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - Evolving spiking neuron cellular automata and networks to emulate in vitro neuronal activity [0.0]
We produce spiking neural systems that emulate the patterns of behavior of biological neurons in vitro.
Our models were able to produce network-wide synchrony.
The genomes of the top-performing models indicate the excitability and density of connections in the model play an important role in determining the complexity of the produced activity.
arXiv Detail & Related papers (2021-10-15T17:55:04Z) - Modelling Neuronal Behaviour with Time Series Regression: Recurrent Neural Networks on C. Elegans Data [0.0]
We show how the nervous system of C. Elegans can be modelled and simulated with data-driven models using different neural network architectures.
We show that GRU models with a hidden layer of 4 units can reproduce the system's response to very different stimuli with high accuracy.
arXiv Detail & Related papers (2021-07-01T10:39:30Z) - Towards an Automatic Analysis of CHO-K1 Suspension Growth in Microfluidic Single-cell Cultivation [63.94623495501023]
We propose a novel machine learning architecture, which allows us to infuse a deep neural network with human-powered abstraction on the level of data.
Specifically, we train a generative model simultaneously on natural and synthetic data, so that it learns a shared representation, from which a target variable, such as the cell count, can be reliably estimated.
arXiv Detail & Related papers (2020-10-20T08:36:51Z) - Autonomous Discovery of Unknown Reaction Pathways from Data by Chemical Reaction Neural Network [0.0]
Chemical reactions occur in energy, environmental, biological, and many other natural systems.
Here, we present a neural network approach that autonomously discovers reaction pathways from the time-resolved species concentration data.
The proposed Chemical Reaction Neural Network (CRNN), by design, satisfies the fundamental physics laws, including the Law of Mass Action and the Arrhenius Law.
arXiv Detail & Related papers (2020-02-20T23:36:46Z)
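The physical constraints named in the CRNN abstract, the Law of Mass Action and the Arrhenius Law, can be written out directly. The snippet below only states those two laws for a hypothetical single reaction with illustrative parameter values; it is not the CRNN architecture itself.

```python
import math

def arrhenius(A, Ea, T, R=8.314):
    """Arrhenius law: rate constant k = A * exp(-Ea / (R * T)),
    with pre-exponential factor A, activation energy Ea (J/mol),
    gas constant R (J/(mol K)), and temperature T (K)."""
    return A * math.exp(-Ea / (R * T))

def mass_action_rate(k, concentrations, orders):
    """Law of Mass Action: rate = k * prod(c_i ** nu_i)
    over reactant concentrations c_i and reaction orders nu_i."""
    rate = k
    for c, nu in zip(concentrations, orders):
        rate *= c ** nu
    return rate

# Hypothetical reaction A + 2B -> products at 300 K (illustrative values)
k = arrhenius(A=1e7, Ea=5.0e4, T=300.0)
rate = mass_action_rate(k, concentrations=[0.1, 0.2], orders=[1, 2])
```

A CRNN-style model would learn the parameters (A, Ea, and the stoichiometric orders) from time-resolved concentration data while keeping this functional form fixed, which is what makes the learned network physically interpretable.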
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.