Deep Learning in Science
- URL: http://arxiv.org/abs/2009.01575v2
- Date: Fri, 4 Sep 2020 12:17:03 GMT
- Title: Deep Learning in Science
- Authors: Stefano Bianchini, Moritz Müller and Pierre Pelletier
- Abstract summary: This paper provides insights on the diffusion and impact of Deep Learning in science.
We use a Natural Language Processing (NLP) approach on the arXiv.org publication corpus.
Our findings suggest that DL does not (yet?) work as an autopilot to navigate complex knowledge landscapes and overthrow their structure.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Much of the recent success of Artificial Intelligence (AI) has been spurred
on by impressive achievements within a broader family of machine learning
methods, commonly referred to as Deep Learning (DL). This paper provides
insights on the diffusion and impact of DL in science. Through a Natural
Language Processing (NLP) approach on the arXiv.org publication corpus, we
delineate the emerging DL technology and identify a list of relevant search
terms. These search terms allow us to retrieve DL-related publications from Web
of Science across all sciences. Based on that sample, we document the DL
diffusion process in the scientific system. We find i) an exponential growth in
the adoption of DL as a research tool across all sciences and all over the
world, ii) regional differentiation in DL application domains, and iii) a
transition from interdisciplinary DL applications to disciplinary research
within application domains. In a second step, we investigate how the adoption
of DL methods affects scientific development. Therefore, we empirically assess
how DL adoption relates to re-combinatorial novelty and scientific impact in
the health sciences. We find that DL adoption is negatively correlated with
re-combinatorial novelty, but positively correlated with expectation as well as
variance of citation performance. Our findings suggest that DL does not (yet?)
work as an autopilot to navigate complex knowledge landscapes and overthrow
their structure. However, the 'DL principle' qualifies for its versatility as
the nucleus of a general scientific method that advances science in a
measurable way.
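To make the retrieval and impact-analysis steps described above concrete, here is a minimal sketch (not the authors' actual pipeline): it flags DL-related records using an illustrative keyword list and summarizes the mean and variance of citation counts for adopters versus non-adopters. The term list and the column names "abstract" and "citations" are assumptions made for illustration only.

```python
# Minimal sketch (not the authors' pipeline): flag DL-related records with a
# keyword list and summarize citation statistics for adopters vs. non-adopters.
# The term list and column names ("abstract", "citations") are illustrative.
import re
import pandas as pd

DL_TERMS = [
    "deep learning",
    "deep neural network",
    "convolutional neural network",
    "recurrent neural network",
]
DL_PATTERN = re.compile("|".join(re.escape(t) for t in DL_TERMS), re.IGNORECASE)

def flag_dl(papers: pd.DataFrame) -> pd.DataFrame:
    """Mark each record whose abstract matches any DL search term."""
    papers = papers.copy()
    papers["uses_dl"] = papers["abstract"].fillna("").str.contains(DL_PATTERN)
    return papers

def citation_summary(papers: pd.DataFrame) -> pd.DataFrame:
    """Mean and variance of citation counts, split by DL adoption."""
    return papers.groupby("uses_dl")["citations"].agg(["count", "mean", "var"])

if __name__ == "__main__":
    sample = pd.DataFrame({
        "abstract": [
            "We train a convolutional neural network on chest X-rays.",
            "A randomized controlled trial of drug A versus placebo.",
        ],
        "citations": [42, 17],
    })
    print(citation_summary(flag_dl(sample)))
```

A full analysis of the kind reported in the paper would rely on the delineated search-term list, field-normalized citation measures, and regression controls rather than a raw group-by, but the sketch shows the basic shape of the keyword-retrieval and impact-comparison steps.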
Related papers
- Science-Informed Deep Learning (ScIDL) With Applications to Wireless Communications [11.472232944923558]
This article provides a tutorial on science-informed deep learning (ScIDL).
ScIDL aims to integrate existing scientific knowledge with DL techniques to develop more powerful algorithms.
We discuss both recent applications of ScIDL and potential future research directions in the field of wireless communications.
arXiv Detail & Related papers (2024-06-29T02:35:39Z)
- A Comprehensive Survey of Scientific Large Language Models and Their Applications in Scientific Discovery [68.48094108571432]
Large language models (LLMs) have revolutionized the way text and other modalities of data are handled.
We aim to provide a more holistic view of the research landscape by unveiling cross-field and cross-modal connections between scientific LLMs.
arXiv Detail & Related papers (2024-06-16T08:03:24Z)
- Mapping the Increasing Use of LLMs in Scientific Papers [99.67983375899719]
We conduct the first systematic, large-scale analysis across 950,965 papers published between January 2020 and February 2024 on the arXiv, bioRxiv, and Nature portfolio journals.
Our findings reveal a steady increase in LLM usage, with the largest and fastest growth observed in Computer Science papers.
arXiv Detail & Related papers (2024-04-01T17:45:15Z)
- Scientific Large Language Models: A Survey on Biological & Chemical Domains [47.97810890521825]
Large Language Models (LLMs) have emerged as a transformative power in enhancing natural language comprehension.
The application of LLMs extends beyond conventional linguistic boundaries, encompassing specialized linguistic systems developed within various scientific disciplines.
As a burgeoning area in the community of AI for Science, scientific LLMs warrant comprehensive exploration.
arXiv Detail & Related papers (2024-01-26T05:33:34Z)
- Deep Learning in Healthcare: An In-Depth Analysis [1.892561703051693]
We provide a review of Deep Learning models and their broad application in bioinformatics and healthcare.
We also discuss some of the key open challenges that can arise when conducting DL research.
arXiv Detail & Related papers (2023-02-12T20:55:34Z)
- Modeling Information Change in Science Communication with Semantically Matched Paraphrases [50.67030449927206]
SPICED is the first paraphrase dataset of scientific findings annotated for degree of information change.
SPICED contains 6,000 scientific finding pairs extracted from news stories, social media discussions, and full texts of original papers.
Models trained on SPICED improve downstream performance on evidence retrieval for fact checking of real-world scientific claims.
arXiv Detail & Related papers (2022-10-24T07:44:38Z)
- Machine Learning vs. Deep Learning in 5G Networks -- A Comparison of Scientific Impact [0.0]
Machine learning (ML) and deep learning (DL) techniques are used in 5G networks.
Our study aims to uncover the differences in scientific impact between these two techniques by means of statistical bibliometrics.
The Web of Science (WoS) database hosts 2,245 papers for ML and 1,407 papers for DL-related studies.
arXiv Detail & Related papers (2022-10-13T19:54:17Z)
- Zeroth-Order SciML: Non-intrusive Integration of Scientific Software with Deep Learning [46.924429562606086]
We propose to integrate the abundance of scientific knowledge sources (SKS) with the deep learning (DL) training process.
Existing knowledge-integration approaches are limited to differentiable knowledge sources in order to remain compatible with the first-order DL training paradigm.
We show that the proposed scheme effectively integrates scientific knowledge with DL training and outperforms purely data-driven models in data-limited scientific applications (see the sketch after this list).
arXiv Detail & Related papers (2022-06-04T17:52:42Z)
- An overview of deep learning in medical imaging [0.0]
Deep learning (DL) systems are cutting-edge ML systems spanning a broad range of disciplines.
Recent advances can bring tremendous improvement to the medical field.
An overview of recent developments and open problems in the use of DL for medical imaging is provided.
arXiv Detail & Related papers (2022-02-17T09:44:57Z)
- A Survey of Deep Active Learning [54.376820959917005]
Active learning (AL) attempts to maximize a model's performance gain while labeling as few samples as possible.
Deep learning (DL), by contrast, is data-hungry and requires large amounts of labeled data to optimize its massive number of parameters.
Combining the two, deep active learning (DAL) has emerged.
arXiv Detail & Related papers (2020-08-30T04:28:31Z)
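As a companion to the Zeroth-Order SciML entry above, here is a minimal sketch of the underlying idea under stated assumptions: a black-box penalty (`science_penalty`, a hypothetical stand-in for external scientific software) is combined with an ordinary data loss by estimating its gradient with a simple two-point zeroth-order scheme. The estimator, model, and hyperparameters are illustrative and are not the paper's actual algorithm.

```python
# Minimal sketch, under assumptions: `science_penalty` stands in for a
# non-differentiable scientific code, and its gradient is estimated with a
# simple two-point (SPSA-style) zeroth-order scheme. This illustrates the
# general idea only, not the paper's algorithm or hyperparameters.
import torch

def science_penalty(pred: torch.Tensor) -> float:
    # Hypothetical black-box constraint: penalize negative predictions.
    # In practice this would be a call into external scientific software.
    return float((pred.clamp(max=0.0) ** 2).sum())

def zo_grad(pred: torch.Tensor, eps: float = 1e-3, n_dirs: int = 8) -> torch.Tensor:
    """Two-point zeroth-order estimate of d science_penalty / d pred."""
    grad = torch.zeros_like(pred)
    for _ in range(n_dirs):
        u = torch.randn_like(pred)
        f_plus = science_penalty(pred + eps * u)
        f_minus = science_penalty(pred - eps * u)
        grad += (f_plus - f_minus) / (2.0 * eps) * u
    return grad / n_dirs

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(64, 4), torch.randn(64, 1)

for _ in range(200):
    optimizer.zero_grad()
    pred = model(x)
    data_loss = torch.nn.functional.mse_loss(pred, y)  # ordinary first-order loss
    g = zo_grad(pred.detach())                         # black-box gradient estimate
    surrogate = (pred * g).sum()                       # routes g back through the network
    (data_loss + surrogate).backward()
    optimizer.step()
```

The surrogate term `(pred * g).sum()` simply feeds the estimated gradient into autograd, so the scientific software itself never needs to be differentiable.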
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.