Neural scaling laws for phenotypic drug discovery
- URL: http://arxiv.org/abs/2309.16773v1
- Date: Thu, 28 Sep 2023 18:10:43 GMT
- Title: Neural scaling laws for phenotypic drug discovery
- Authors: Drew Linsley, John Griffin, Jason Parker Brown, Adam N Roose, Michael
Frank, Peter Linsley, Steven Finkbeiner, Jeremy Linsley
- Abstract summary: We investigate if scale can have a similar impact for models designed to aid small molecule drug discovery.
We find that DNNs explicitly supervised to solve tasks in the Pheno-CA do not continuously improve as their data and model size are scaled up.
We introduce a novel precursor task, the Inverse Biological Process (IBP), which is designed to resemble the causal objective functions that have proven successful for NLP.
- Score: 3.076170146656896
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent breakthroughs by deep neural networks (DNNs) in natural language
processing (NLP) and computer vision have been driven by a scale-up of models
and data rather than the discovery of novel computing paradigms. Here, we
investigate if scale can have a similar impact for models designed to aid small
molecule drug discovery. We address this question through a large-scale and
systematic analysis of how DNN size, data diet, and learning routines interact
to impact accuracy on our Phenotypic Chemistry Arena (Pheno-CA) benchmark: a
diverse set of drug development tasks posed on image-based high content
screening data. Surprisingly, we find that DNNs explicitly supervised to solve
tasks in the Pheno-CA do not continuously improve as their data and model size
are scaled up. To address this issue, we introduce a novel precursor task, the
Inverse Biological Process (IBP), which is designed to resemble the causal
objective functions that have proven successful for NLP. We indeed find that
DNNs first trained with IBP then probed for performance on the Pheno-CA
significantly outperform task-supervised DNNs. More importantly, the
performance of these IBP-trained DNNs monotonically improves with data and
model scale. Our findings reveal that the DNN ingredients needed to accurately
solve small molecule drug development tasks are already in our hands, and
project how much more experimental data is needed to achieve any desired level
of improvement. We release our Pheno-CA benchmark and code to encourage further
study of neural scaling laws for small molecule drug discovery.
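The abstract's projection of how much more experimental data is needed follows standard neural-scaling-law practice: fit a power law to error as a function of dataset size, then extrapolate to a target error. A minimal sketch of that procedure (the dataset sizes, error values, and fitted exponent below are illustrative placeholders, not numbers from the paper):

```python
import numpy as np

# Illustrative measurements (not from the paper): validation error
# observed after training on progressively larger datasets.
sizes = np.array([1e3, 1e4, 1e5, 1e6])
errors = np.array([0.42, 0.30, 0.21, 0.15])

# Fit a power law, error = a * N**b, via least squares in log-log space.
# np.polyfit returns [slope, intercept]; the slope b should be negative.
b, log_a = np.polyfit(np.log(sizes), np.log(errors), 1)
a = np.exp(log_a)
print(f"error ~ {a:.3f} * N^({b:.3f})")

# Invert the fit to project the dataset size needed for a target error.
target = 0.10
n_needed = (target / a) ** (1.0 / b)
print(f"projected dataset size for error {target}: {n_needed:.2e}")
```

A monotonic log-log fit like this is only meaningful when performance actually improves with scale, which is exactly the property the paper reports for IBP-pretrained DNNs but not for task-supervised ones.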
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - Enabling energy-efficient object detection with surrogate gradient descent in spiking neural networks [0.40054215937601956]
Spiking Neural Networks (SNNs) are a biologically plausible neural network model with significant advantages in both event-driven processing and spatio-temporal information processing.
In this study, we introduce the Current Mean Decoding (CMD) method, which solves the regression problem to facilitate the training of deep SNNs for object detection tasks.
Based on the surrogate gradient and CMD, we propose the SNN-YOLOv3 model for object detection.
arXiv Detail & Related papers (2023-09-07T15:48:00Z) - SAfER: Layer-Level Sensitivity Assessment for Efficient and Robust Neural Network Inference [20.564198591600647]
Deep neural networks (DNNs) demonstrate outstanding performance across most computer vision tasks.
Some critical applications, such as autonomous driving or medical imaging, also require investigation into their behavior.
DNN attribution consists of studying the relationship between the predictions of a DNN and its inputs.
arXiv Detail & Related papers (2023-08-09T07:45:51Z) - Unsupervised Spiking Neural Network Model of Prefrontal Cortex to study Task Switching with Synaptic deficiency [0.0]
We build a computational model of the Prefrontal Cortex (PFC) using Spiking Neural Networks (SNNs).
In this study, we use SNNs with parameters close to biologically plausible values and train the model using the unsupervised Spike Timing Dependent Plasticity (STDP) learning rule.
arXiv Detail & Related papers (2023-05-23T05:59:54Z) - Transferability of coVariance Neural Networks and Application to Interpretable Brain Age Prediction using Anatomical Features [119.45320143101381]
Graph convolutional networks (GCN) leverage topology-driven graph convolutional operations to combine information across the graph for inference tasks.
We have studied GCNs with covariance matrices as graphs in the form of coVariance neural networks (VNNs).
VNNs inherit the scale-free data processing architecture from GCNs and here, we show that VNNs exhibit transferability of performance over datasets whose covariance matrices converge to a limit object.
arXiv Detail & Related papers (2023-05-02T22:15:54Z) - Neuromorphic Data Augmentation for Training Spiking Neural Networks [10.303676184878896]
We propose neuromorphic data augmentation (NDA) for event-based datasets.
NDA significantly stabilizes the SNN training and reduces the generalization gap between training and test performance.
For the first time, we demonstrate the feasibility of unsupervised contrastive learning for SNNs.
arXiv Detail & Related papers (2022-03-11T18:17:19Z) - Quantitative Evaluation of Explainable Graph Neural Networks for Molecular Property Prediction [2.8544822698499255]
Graph neural networks (GNNs) remain of limited acceptance in drug discovery due to their lack of interpretability.
In this work, we build three levels of benchmark datasets to quantitatively assess the interpretability of the state-of-the-art GNN models.
We implement recent XAI methods in combination with different GNN algorithms to highlight the benefits, limitations, and future opportunities for drug discovery.
arXiv Detail & Related papers (2021-07-01T04:49:29Z) - Ensemble Transfer Learning for the Prediction of Anti-Cancer Drug Response [49.86828302591469]
In this paper, we apply transfer learning to the prediction of anti-cancer drug response.
We apply the classic transfer learning framework that trains a prediction model on the source dataset and refines it on the target dataset.
The ensemble transfer learning pipeline is implemented using LightGBM and two deep neural network (DNN) models with different architectures.
arXiv Detail & Related papers (2020-05-13T20:29:48Z) - Neural Additive Models: Interpretable Machine Learning with Neural Nets [77.66871378302774]
Deep neural networks (DNNs) are powerful black-box predictors that have achieved impressive performance on a wide variety of tasks.
We propose Neural Additive Models (NAMs) which combine some of the expressivity of DNNs with the inherent intelligibility of generalized additive models.
NAMs learn a linear combination of neural networks that each attend to a single input feature.
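The additive structure described in this summary is easy to make concrete: each input feature gets its own small network, and the prediction is the sum of their outputs. A schematic sketch with made-up layer sizes and randomly initialized parameters (an illustration of the idea, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_net(x, w1, b1, w2, b2):
    # One small MLP per input feature: scalar in, scalar contribution out.
    h = np.maximum(0.0, np.outer(x, w1) + b1)  # (n, hidden) ReLU layer
    return h @ w2 + b2                          # (n,) per-feature output

n_features, hidden = 3, 16
# Hypothetical randomly initialized parameters, one set per feature.
params = [
    (rng.normal(size=hidden), rng.normal(size=hidden),
     rng.normal(size=hidden), rng.normal())
    for _ in range(n_features)
]

X = rng.normal(size=(5, n_features))  # toy batch of 5 examples

# The model's prediction is the sum of independent per-feature networks,
# which is what lets each feature's learned shape function be inspected
# (and plotted) on its own.
prediction = sum(
    feature_net(X[:, j], *params[j]) for j in range(n_features)
)
print(prediction.shape)
```

Because no term mixes two features, the contribution of any single feature can be read off directly, which is the source of the intelligibility claimed above.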
arXiv Detail & Related papers (2020-04-29T01:28:32Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z) - Assessing Graph-based Deep Learning Models for Predicting Flash Point [52.931492216239995]
Graph-based deep learning (GBDL) models were implemented in predicting flash point for the first time.
Average R² and Mean Absolute Error (MAE) scores of MPNN are, respectively, 2.3% lower and 2.0 K higher than those of previous comparable studies.
arXiv Detail & Related papers (2020-02-26T06:10:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.