Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI
- URL: http://arxiv.org/abs/2402.00809v5
- Date: Tue, 6 Aug 2024 16:32:38 GMT
- Title: Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI
- Authors: Theodore Papamarkou, Maria Skoularidou, Konstantina Palla, Laurence Aitchison, Julyan Arbel, David Dunson, Maurizio Filippone, Vincent Fortuin, Philipp Hennig, José Miguel Hernández-Lobato, Aliaksandr Hubin, Alexander Immer, Theofanis Karaletsos, Mohammad Emtiyaz Khan, Agustinus Kristiadi, Yingzhen Li, Stephan Mandt, Christopher Nemeth, Michael A. Osborne, Tim G. J. Rudner, David Rügamer, Yee Whye Teh, Max Welling, Andrew Gordon Wilson, Ruqi Zhang
- Abstract summary: This paper revisits the strengths of Bayesian deep learning (BDL) and acknowledges existing challenges.
It highlights some exciting research avenues aimed at addressing these obstacles.
Looking ahead, the discussion focuses on possible ways to combine large-scale foundation models with BDL to unlock their full potential.
- Score: 189.05642691423347
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In the current landscape of deep learning research, there is a predominant emphasis on achieving high predictive accuracy in supervised tasks involving large image and language datasets. However, a broader perspective reveals a multitude of overlooked metrics, tasks, and data types, such as uncertainty, active and continual learning, and scientific data, that demand attention. Bayesian deep learning (BDL) constitutes a promising avenue, offering advantages across these diverse settings. This paper posits that BDL can elevate the capabilities of deep learning. It revisits the strengths of BDL, acknowledges existing challenges, and highlights some exciting research avenues aimed at addressing these obstacles. Looking ahead, the discussion focuses on possible ways to combine large-scale foundation models with BDL to unlock their full potential.
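As a concrete, hedged illustration of the kind of uncertainty quantification the abstract refers to, the sketch below shows approximate Bayesian prediction with Monte Carlo dropout. It is not the position paper's method; the model, layer sizes, and sample count are illustrative assumptions.

```python
# Minimal sketch of BDL-style uncertainty estimation via Monte Carlo dropout.
# Illustrative only: the architecture and hyperparameters are assumptions,
# not the approach advocated in the position paper.
import torch
import torch.nn as nn


class MCDropoutClassifier(nn.Module):
    def __init__(self, in_dim: int, hidden: int, n_classes: int, p: float = 0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


@torch.no_grad()
def predictive_distribution(model: nn.Module, x: torch.Tensor, n_samples: int = 30):
    """Average softmax outputs over stochastic forward passes.

    Keeping dropout active at test time yields samples from an approximate
    posterior predictive; the spread across samples is a rough uncertainty signal.
    """
    model.train()  # keep dropout layers stochastic at prediction time
    probs = torch.stack(
        [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
    )                               # shape: (n_samples, batch, n_classes)
    mean = probs.mean(dim=0)        # approximate posterior predictive mean
    entropy = -(mean * mean.clamp_min(1e-12).log()).sum(dim=-1)  # predictive entropy
    return mean, entropy


# Example usage with random inputs (illustrative only).
model = MCDropoutClassifier(in_dim=16, hidden=64, n_classes=3)
x = torch.randn(8, 16)
mean, entropy = predictive_distribution(model, x)
```

Higher predictive entropy flags inputs on which the model should defer or request labels, the kind of signal the paper argues is under-served by purely point-estimate deep learning.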
Related papers
- A Comprehensive Survey on Evidential Deep Learning and Its Applications [64.83473301188138]
Evidential Deep Learning (EDL) provides reliable uncertainty estimation with minimal additional computation in a single forward pass.
We first delve into the theoretical foundation of EDL, the subjective logic theory, and discuss its distinctions from other uncertainty estimation frameworks.
We elaborate on its extensive applications across various machine learning paradigms and downstream tasks.
arXiv Detail & Related papers (2024-09-07T05:55:06Z)
- Deep Learning Meets OBIA: Tasks, Challenges, Strategies, and Perspectives [8.11184750121407]
Deep learning has gained significant attention in remote sensing, especially in pixel- or patch-level applications.
Despite initial attempts to integrate deep learning into object-based image analysis (OBIA), its full potential remains largely unexplored.
arXiv Detail & Related papers (2024-08-02T23:54:02Z)
- Unveiling Entity-Level Unlearning for Large Language Models: A Comprehensive Analysis [32.455702022397666]
Large language model unlearning has garnered increasing attention due to its potential to address security and privacy concerns.
Much of this research has concentrated on instance-level unlearning, specifically targeting the removal of predefined instances containing sensitive content.
We propose a novel task of entity-level unlearning, which aims to erase entity-related knowledge from the target model completely.
arXiv Detail & Related papers (2024-06-22T09:40:07Z)
- Evaluation and Enhancement of Semantic Grounding in Large Vision-Language Models [25.413601452403213]
Large Vision-Language Models (LVLMs) offer remarkable benefits for a variety of vision-language tasks.
However, their constrained semantic grounding ability hinders their application in real-world scenarios.
We propose a data-centric enhancement method aimed at improving LVLMs' semantic grounding ability.
arXiv Detail & Related papers (2023-09-07T22:59:56Z)
- A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning [76.47138162283714]
Forgetting refers to the loss or deterioration of previously acquired information or knowledge.
Beyond continual learning, forgetting is a prevalent phenomenon in various other research domains within deep learning.
The survey argues that forgetting is a double-edged sword and can be beneficial and even desirable in certain cases.
arXiv Detail & Related papers (2023-07-16T16:27:58Z)
- Deep Long-Tailed Learning: A Survey [163.16874896812885]
Deep long-tailed learning aims to train well-performing deep models from a large number of images that follow a long-tailed class distribution.
Long-tailed class imbalance is a common problem in practical visual recognition tasks.
This paper provides a comprehensive survey on recent advances in deep long-tailed learning.
arXiv Detail & Related papers (2021-10-09T15:25:22Z)
- Personalized Education in the AI Era: What to Expect Next? [76.37000521334585]
The objective of personalized learning is to design an effective knowledge acquisition track that matches the learner's strengths and bypasses her weaknesses to meet her desired goal.
In recent years, the rise of artificial intelligence (AI) and machine learning (ML) has opened up novel perspectives for enhancing personalized education.
arXiv Detail & Related papers (2021-01-19T12:23:32Z)
- Deep Learning for Road Traffic Forecasting: Does it Make a Difference? [6.220008946076208]
This paper critically analyzes the state of the art in the use of Deep Learning for this particular Intelligent Transportation Systems (ITS) research area.
A subsequent critical analysis formulates questions and triggers a necessary debate about the issues of Deep Learning for traffic forecasting.
arXiv Detail & Related papers (2020-12-02T15:56:11Z)
- A Survey of Deep Active Learning [54.376820959917005]
Active learning (AL) attempts to maximize the performance gain of a model while labeling the fewest possible samples.
Deep learning (DL) is data-hungry and requires large amounts of labeled data to optimize its massive number of parameters.
Combining the two, deep active learning (DAL) has emerged (see the illustrative sketch after this list).
arXiv Detail & Related papers (2020-08-30T04:28:31Z)
- An Overview of Deep Semi-Supervised Learning [8.894935073145252]
There is a rising research interest in semi-supervised learning and its applications to deep neural networks.
This paper provides a comprehensive overview of deep semi-supervised learning, starting with an introduction to the field.
arXiv Detail & Related papers (2020-06-09T14:08:03Z)
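As a companion to the deep active learning survey listed above, the following is a minimal sketch of one uncertainty-based acquisition round. The function names, the entropy criterion, and the batch size are illustrative assumptions, not the survey's prescribed procedure.

```python
# Minimal sketch of one deep active learning acquisition round:
# score an unlabeled pool by predictive entropy and query the top-k points.
# Illustrative only; the probabilities would come from any trained classifier.
import numpy as np


def entropy_scores(probs: np.ndarray) -> np.ndarray:
    """Predictive entropy per sample; probs has shape (n_pool, n_classes)."""
    p = np.clip(probs, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=1)


def select_queries(probs: np.ndarray, k: int = 32) -> np.ndarray:
    """Indices of the k most uncertain pool points to send for labeling."""
    return np.argsort(-entropy_scores(probs))[:k]


# Example with a random "pool" of predicted class probabilities (illustrative).
rng = np.random.default_rng(0)
pool_probs = rng.dirichlet(np.ones(5), size=1000)  # 1000 unlabeled points, 5 classes
query_idx = select_queries(pool_probs, k=32)
# The labeled set would be extended with these points and the model retrained.
```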
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented here and is not responsible for any consequences arising from its use.