Ten Quick Tips for Deep Learning in Biology
- URL: http://arxiv.org/abs/2105.14372v1
- Date: Sat, 29 May 2021 21:02:44 GMT
- Title: Ten Quick Tips for Deep Learning in Biology
- Authors: Benjamin D. Lee, Anthony Gitter, Casey S. Greene, Sebastian Raschka,
Finlay Maguire, Alexander J. Titus, Michael D. Kessler, Alexandra J. Lee,
Marc G. Chevrette, Paul Allen Stewart, Thiago Britto-Borges, Evan M. Cofer,
Kun-Hsing Yu, Juan Jose Carmona, Elana J. Fertig, Alexandr A. Kalinin, Beth
Signal, Benjamin J. Lengerich, Timothy J. Triche Jr, Simina M. Boca
- Abstract summary: Machine learning is concerned with the development and applications of algorithms that can recognize patterns in data and use them for predictive modeling.
Deep learning has become its own subfield of machine learning.
In the context of biological research, deep learning has been increasingly used to derive novel insights from high-dimensional biological data.
- Score: 116.78436313026478
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine learning is a modern approach to problem-solving and task automation.
In particular, machine learning is concerned with the development and
applications of algorithms that can recognize patterns in data and use them for
predictive modeling. Artificial neural networks are a particular class of
machine learning algorithms and models that evolved into what is now described
as deep learning. Given the computational advances made in the last decade,
deep learning can now be applied to massive data sets and in innumerable
contexts. Therefore, deep learning has become its own subfield of machine
learning. In the context of biological research, it has been increasingly used
to derive novel insights from high-dimensional biological data. To make the
biological applications of deep learning more accessible to scientists who have
some experience with machine learning, we solicited input from a community of
researchers with varied biological and deep learning interests. These
individuals collaboratively contributed to this manuscript's writing using the
GitHub version control platform and the Manubot manuscript generation toolset.
The goal was to articulate a practical, accessible, and concise set of
guidelines and suggestions to follow when using deep learning. In the course of
our discussions, several themes became clear: the importance of understanding
and applying machine learning fundamentals as a baseline for utilizing deep
learning, the necessity for extensive model comparisons with careful
evaluation, and the need for critical thought in interpreting results generated
by deep learning, among others.
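The abstract's first theme, applying machine learning fundamentals as a baseline before reaching for deep learning, can be illustrated with a minimal sketch. The data, split sizes, and task below are hypothetical stand-ins, not from the manuscript; the point is only that any deep model should be compared against a trivial reference predictor on a held-out split.

```python
import random

# Hypothetical sketch of the "fundamentals as a baseline" guideline:
# measure how well a trivial majority-class predictor does on a
# held-out split; a deep model must clearly beat this number.

random.seed(0)

# Toy binary labels standing in for a biological classification task
# (e.g. pathogenic vs. benign variants); 70% of samples are class 0.
labels = [0] * 70 + [1] * 30
random.shuffle(labels)

# Simple 80/20 train/validation split.
split = int(0.8 * len(labels))
train, valid = labels[:split], labels[split:]

# Majority-class baseline: always predict the most frequent training label.
majority = max(set(train), key=train.count)
baseline_acc = sum(1 for y in valid if y == majority) / len(valid)

print(f"majority-class baseline accuracy: {baseline_acc:.2f}")
```

On class-imbalanced biological data this baseline can look deceptively strong, which is exactly why it is worth computing before interpreting a deep model's accuracy.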
Related papers
- EndToEndML: An Open-Source End-to-End Pipeline for Machine Learning Applications [0.2826977330147589]
We propose a web-based end-to-end pipeline that is capable of preprocessing, training, evaluating, and visualizing machine learning models.
Our library assists in recognizing, classifying, clustering, and predicting a wide range of multi-modal, multi-sensor datasets.
arXiv Detail & Related papers (2024-03-27T02:24:38Z)
- Language Evolution with Deep Learning [49.879239655532324]
Computational modeling plays an essential role in the study of language emergence.
It aims to simulate the conditions and learning processes that could trigger the emergence of a structured language.
This chapter explores another class of computational models that have recently revolutionized the field of machine learning: deep learning models.
arXiv Detail & Related papers (2024-03-18T16:52:54Z)
- Breaking the Curse of Dimensionality in Deep Neural Networks by Learning Invariant Representations [1.9580473532948401]
This thesis explores the theoretical foundations of deep learning by studying the relationship between the architecture of these models and the inherent structures found within the data they process.
We ask: what drives the efficacy of deep learning algorithms and allows them to beat the so-called curse of dimensionality?
Our methodology takes an empirical approach to deep learning, combining experimental studies with physics-inspired toy models.
arXiv Detail & Related papers (2023-10-24T19:50:41Z)
- Deep Learning in Deterministic Computational Mechanics [0.0]
This review focuses on deep learning methods rather than applications for computational mechanics.
The primary audience is researchers on the verge of entering this field or those who want to gain an overview of deep learning in computational mechanics.
arXiv Detail & Related papers (2023-09-27T05:57:19Z)
- Brain-Inspired Computational Intelligence via Predictive Coding [89.6335791546526]
Predictive coding (PC) has shown promising performance in machine intelligence tasks.
PC can model information processing in different brain areas and can be used in cognitive control and robotics.
arXiv Detail & Related papers (2023-08-15T16:37:16Z)
- Deep Active Learning for Computer Vision: Past and Future [50.19394935978135]
Despite its indispensable role in developing AI models, research on active learning is not as intensive as other research directions.
By addressing data automation challenges and coping with automated machine learning systems, active learning will facilitate democratization of AI technologies.
arXiv Detail & Related papers (2022-11-27T13:07:14Z)
- Discussion of Ensemble Learning under the Era of Deep Learning [4.061135251278187]
Ensemble deep learning has shown significant performance in improving the generalization of learning systems.
The time and space overheads for training multiple base deep learners and testing with the ensemble deep learner are far greater than those of traditional ensemble learning.
An urgent problem to be solved is how to retain the significant advantages of ensemble deep learning while reducing the required time and space overheads.
arXiv Detail & Related papers (2021-01-21T01:33:23Z)
- Knowledge as Invariance -- History and Perspectives of Knowledge-augmented Machine Learning [69.99522650448213]
Research in machine learning is at a turning point.
Research interests are shifting away from increasing the performance of highly parameterized models on exceedingly specific tasks.
This white paper provides an introduction and discussion of this emerging field in machine learning research.
arXiv Detail & Related papers (2020-12-21T15:07:19Z)
- Memristors -- from In-memory computing, Deep Learning Acceleration, Spiking Neural Networks, to the Future of Neuromorphic and Bio-inspired Computing [25.16076541420544]
Machine learning, particularly in the form of deep learning, has driven most of the recent fundamental developments in artificial intelligence.
Deep learning has been successfully applied in areas such as object/pattern recognition, speech and natural language processing, self-driving vehicles, intelligent self-diagnostics tools, autonomous robots, knowledgeable personal assistants, and monitoring.
This paper reviews the case for a novel beyond CMOS hardware technology, memristors, as a potential solution for the implementation of power-efficient in-memory computing, deep learning accelerators, and spiking neural networks.
arXiv Detail & Related papers (2020-03-06T19:00:04Z)
- AutoML-Zero: Evolving Machine Learning Algorithms From Scratch [76.83052807776276]
We show that it is possible to automatically discover complete machine learning algorithms just using basic mathematical operations as building blocks.
We demonstrate this by introducing a novel framework that significantly reduces human bias through a generic search space.
We believe these preliminary successes in discovering machine learning algorithms from scratch indicate a promising new direction in the field.
arXiv Detail & Related papers (2020-03-06T19:00:04Z)
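The AutoML-Zero summary above describes discovering algorithms from basic mathematical operations via a generic search space. A toy sketch of that idea, entirely hypothetical and far simpler than the paper's actual framework, is to randomly compose unary operations and keep whichever program best reproduces a target function on sample data.

```python
import random

# Hypothetical, much-simplified illustration of searching over programs
# built from basic mathematical operations, in the spirit of AutoML-Zero.
# All operation names and the target function are invented for this sketch.

random.seed(1)

OPS = {
    "square": lambda x: x * x,
    "negate": lambda x: -x,
    "double": lambda x: 2 * x,
    "inc": lambda x: x + 1,
}

def run(program, x):
    """Apply each named operation in sequence to the input."""
    for name in program:
        x = OPS[name](x)
    return x

# Target behavior to rediscover: f(x) = 2 * x * x.
data = [(x, 2 * x * x) for x in range(-3, 4)]

def loss(program):
    """Sum of squared errors of the program against the target data."""
    return sum((run(program, x) - y) ** 2 for x, y in data)

# Random search: sample short programs and keep the best one found.
best, best_loss = None, float("inf")
for _ in range(2000):
    program = [random.choice(list(OPS)) for _ in range(random.randint(1, 3))]
    current = loss(program)
    if current < best_loss:
        best, best_loss = program, current

print(best, best_loss)
```

Here the program ["square", "double"] reproduces the target exactly, so a modest random search tends to find a zero-loss program; the paper's framework replaces this toy random search with evolutionary search over far richer spaces.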
This list is automatically generated from the titles and abstracts of the papers in this site.