Off-the-shelf deep learning is not enough: parsimony, Bayes and
causality
- URL: http://arxiv.org/abs/2005.01557v1
- Date: Mon, 4 May 2020 15:16:30 GMT
- Title: Off-the-shelf deep learning is not enough: parsimony, Bayes and
causality
- Authors: Rama K. Vasudevan, Maxim Ziatdinov, Lukas Vlcek, Sergei V. Kalinin
- Abstract summary: We discuss opportunities and roadblocks to implementation of deep learning within materials science.
We argue that deep learning and AI are now well positioned to revolutionize fields where causal links are known.
- Score: 0.8602553195689513
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks ("deep learning") have emerged as a technology of choice
to tackle problems in natural language processing, computer vision, speech
recognition, and gameplay, and in just a few years have led to superhuman-level
performance and ushered in a new wave of "AI." Buoyed by these successes,
researchers in the physical sciences have made steady progress in incorporating
deep learning into their respective domains. However, such adoption brings
substantial challenges that need to be recognized and confronted. Here, we
discuss both opportunities and roadblocks to implementation of deep learning
within materials science, focusing on the relationship between the correlative
nature of machine learning and the causal, hypothesis-driven nature of the
physical sciences. We argue that deep learning and AI are now well positioned to
revolutionize fields where causal links are known, as is the case for
applications in theory. When confounding factors are frozen or change only
weakly, this leaves open the pathway for effective deep learning solutions in
experimental domains. Similarly, these methods offer a pathway towards
understanding the physics of real-world systems, either via deriving reduced
representations, deducing algorithmic complexity, or recovering generative
physical models. However, extending deep learning and "AI" to models with
unclear causal relationships can produce misleading and potentially incorrect
results. Here, we argue that the broad adoption of Bayesian methods incorporating
prior knowledge, development of DL solutions with incorporated physical
constraints, and ultimately the adoption of causal models, offer a path forward
for fundamental and applied research. Most notably, while these advances can
change the way science is carried out in ways we cannot imagine, machine
learning is not going to substitute for science any time soon.
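The Bayesian path the abstract advocates can be illustrated with a minimal sketch: conjugate Bayesian linear regression in which the prior mean encodes assumed prior physical knowledge of a model parameter. All variable names and numerical values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "experiment": noisy observations of y = k * x, with true k = 2.0.
true_k, noise_sd = 2.0, 0.5
x = np.linspace(0.0, 1.0, 20)
y = true_k * x + rng.normal(0.0, noise_sd, size=x.shape)

# Prior physical knowledge: theory suggests k is near 1.8 (illustrative value).
prior_mean, prior_var = 1.8, 0.25

# Conjugate Gaussian update for the slope with known noise variance:
# posterior precision = prior precision + sum(x_i^2) / sigma^2
post_prec = 1.0 / prior_var + (x @ x) / noise_sd**2
post_var = 1.0 / post_prec
post_mean = post_var * (prior_mean / prior_var + (x @ y) / noise_sd**2)

print(f"posterior k = {post_mean:.3f} +/- {np.sqrt(post_var):.3f}")
```

Unlike a point estimate, the posterior variance quantifies how strongly the data override the prior: with sparse or noisy data the estimate stays near the physically motivated prior instead of overfitting, which is one concrete sense in which prior knowledge constrains a data-driven model.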
Related papers
- A Survey on State-of-the-art Deep Learning Applications and Challenges [0.0]
Building a deep learning model is challenging due to the algorithm's complexity and the dynamic nature of real-world problems.
This study aims to comprehensively review the state-of-the-art deep learning models in computer vision, natural language processing, time series analysis and pervasive computing.
arXiv Detail & Related papers (2024-03-26T10:10:53Z)
- A Review of Neuroscience-Inspired Machine Learning [58.72729525961739]
Bio-plausible credit assignment is compatible with practically any learning condition and is energy-efficient.
In this paper, we survey several vital algorithms that model bio-plausible rules of credit assignment in artificial neural networks.
We conclude by discussing the future challenges that will need to be addressed in order to make such algorithms more useful in practical applications.
arXiv Detail & Related papers (2024-02-16T18:05:09Z)
- Brain-Inspired Computational Intelligence via Predictive Coding [89.6335791546526]
Predictive coding (PC) has shown promising performance in machine intelligence tasks.
PC can model information processing in different brain areas and can be used in cognitive control and robotics.
arXiv Detail & Related papers (2023-08-15T16:37:16Z)
- Artificial Intelligence for Science in Quantum, Atomistic, and Continuum Systems [268.585904751315]
AI for science (AI4Science) is a new area of research.
Areas aim at understanding the physical world from subatomic (wavefunctions and electron density), atomic (molecules, proteins, materials, and interactions), to macro (fluids, climate, and subsurface) scales.
Key common challenge is how to capture physics first principles, especially symmetries, in natural systems by deep learning methods.
arXiv Detail & Related papers (2023-07-17T12:14:14Z)
- The Future of Fundamental Science Led by Generative Closed-Loop Artificial Intelligence [67.70415658080121]
Recent advances in machine learning and AI are disrupting technological innovation, product development, and society as a whole.
AI has contributed less to fundamental science, in part because large, high-quality data sets for scientific practice and model discovery are more difficult to access.
Here we explore and investigate aspects of an AI-driven, automated, closed-loop approach to scientific discovery.
arXiv Detail & Related papers (2023-07-09T21:16:56Z)
- Deep Causal Learning: Representation, Discovery and Inference [2.696435860368848]
Causal learning reveals the essential relationships that underpin phenomena and delineates the mechanisms by which the world evolves.
Traditional causal learning methods face numerous challenges and limitations, including high-dimensional variables, unstructured variables, optimization problems, unobserved confounders, selection biases, and estimation inaccuracies.
Deep causal learning, which leverages deep neural networks, offers innovative insights and solutions for addressing these challenges.
arXiv Detail & Related papers (2022-11-07T09:00:33Z)
- Continual Learning with Deep Learning Methods in an Application-Oriented Context [0.0]
An important research area of Artificial Intelligence (AI) deals with the automatic derivation of knowledge from data.
Deep Neural Networks (DNNs) are one type of machine learning algorithm that can be categorized as "deep learning".
DNNs are affected by a problem, known as catastrophic forgetting, that prevents new knowledge from being added to an existing knowledge base.
arXiv Detail & Related papers (2022-07-12T10:13:33Z)
- Learning from learning machines: a new generation of AI technology to meet the needs of science [59.261050918992325]
We outline emerging opportunities and challenges to enhance the utility of AI for scientific discovery.
The distinct goals of AI for industry versus the goals of AI for science create tension between identifying patterns in data versus discovering patterns in the world from data.
arXiv Detail & Related papers (2021-11-27T00:55:21Z)
- Knowledge as Invariance -- History and Perspectives of Knowledge-augmented Machine Learning [69.99522650448213]
Research in machine learning is at a turning point.
Research interests are shifting away from increasing the performance of highly parameterized models on exceedingly specific tasks.
This white paper provides an introduction and discussion of this emerging field in machine learning research.
arXiv Detail & Related papers (2020-12-21T15:07:19Z)
- Shortcut Learning in Deep Neural Networks [29.088631285225237]
We seek to distill how many of deep learning's problems can be seen as different symptoms of the same underlying issue: shortcut learning.
Shortcuts are decision rules that perform well on standard benchmarks but fail to transfer to more challenging testing conditions, such as real-world scenarios.
We develop recommendations for model interpretation and benchmarking, highlighting recent advances in machine learning to improve robustness and transferability from the lab to real-world applications.
arXiv Detail & Related papers (2020-04-16T17:18:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.