Discussion of Ensemble Learning under the Era of Deep Learning
- URL: http://arxiv.org/abs/2101.08387v1
- Date: Thu, 21 Jan 2021 01:33:23 GMT
- Title: Discussion of Ensemble Learning under the Era of Deep Learning
- Authors: Yongquan Yang, Haijun Lv
- Abstract summary: Ensemble deep learning has shown significant performance in improving the generalization of learning systems.
The time and space overheads for training multiple base deep learners and testing with the ensemble deep learner are far greater than those of traditional ensemble learning.
An urgent problem to be solved is how to retain the significant advantages of ensemble deep learning while reducing the required time and space overheads.
- Score: 4.061135251278187
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Due to the dominant position of deep learning (mostly deep neural networks)
in various artificial intelligence applications, ensemble learning based on deep
neural networks (ensemble deep learning) has recently shown significant
performance in improving the generalization of learning systems. However, since
modern deep neural networks usually have millions to billions of parameters,
the time and space overheads for training multiple base deep learners and
testing with the ensemble deep learner are far greater than those of traditional
ensemble learning. Though several fast ensemble deep learning algorithms
have been proposed to promote the deployment of ensemble deep learning in some
applications, further advances are still needed for many applications in
specific fields, where development time and computing resources are usually
restricted or the data to be processed is of high dimensionality. An urgent
problem to be solved is how to retain the significant advantages of
ensemble deep learning while reducing the required time and space overheads, so
that many more applications in specific fields can benefit from it. To
alleviate this problem, it is necessary to understand how ensemble learning
has developed in the era of deep learning. Thus, in this article, we present a
discussion focusing on data analyses of published works, the methodology and
unattainability of traditional ensemble learning, and recent developments in
ensemble deep learning. We hope this article will help readers recognize the
technical challenges facing future developments of ensemble learning in the
era of deep learning.
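The overhead the abstract describes follows directly from the most common form of ensemble deep learning: every base learner must be trained and, at test time, evaluated, so cost grows linearly with ensemble size. The sketch below is a minimal, hypothetical illustration (not taken from the paper) in which tiny random linear models stand in for independently trained deep networks and predictions are combined by unweighted output averaging:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical base learners: tiny linear classifiers with random
# weights, standing in for independently trained deep neural networks.
n_features, n_classes, n_learners = 4, 3, 3
weights = [rng.normal(size=(n_features, n_classes)) for _ in range(n_learners)]

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict_single(w, x):
    # One base learner's class-probability predictions.
    return softmax(x @ w)

def predict_ensemble(ws, x):
    # Unweighted output averaging: every base learner is evaluated,
    # so test-time cost grows linearly with the ensemble size.
    return np.mean([predict_single(w, x) for w in ws], axis=0)

x = rng.normal(size=(2, n_features))   # two example inputs
probs = predict_ensemble(weights, x)   # averaged class probabilities
```

The fast ensemble deep learning methods the abstract mentions aim to cut exactly these costs, e.g. by sharing parameters across base learners or distilling the ensemble into a single model.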
Related papers
- Deep Internal Learning: Deep Learning from a Single Input [88.59966585422914]
In many cases there is value in training a network just from the input at hand.
This is particularly relevant in many signal and image processing problems where training data is scarce and diversity is large.
This survey paper covers deep internal-learning techniques that have been proposed in the past few years for these settings.
arXiv Detail & Related papers (2023-12-12T16:48:53Z) - When Meta-Learning Meets Online and Continual Learning: A Survey [39.53836535326121]
Meta-learning is a data-driven approach to optimizing the learning algorithm.
Continual learning and online learning both involve incrementally updating a model with streaming data.
This paper organizes various problem settings using consistent terminology and formal descriptions.
arXiv Detail & Related papers (2023-11-09T09:49:50Z) - Knowledge-augmented Deep Learning and Its Applications: A Survey [60.221292040710885]
Knowledge-augmented deep learning (KADL) aims to identify domain knowledge and integrate it into deep models for data-efficient, generalizable, and interpretable deep learning.
This survey subsumes existing works and offers a bird's-eye view of research in the general area of knowledge-augmented deep learning.
arXiv Detail & Related papers (2022-11-30T03:44:15Z) - Transferability in Deep Learning: A Survey [80.67296873915176]
The ability to acquire and reuse knowledge is known as transferability in deep learning.
We present this survey to connect different isolated areas in deep learning with their relation to transferability.
We implement a benchmark and an open-source library, enabling a fair evaluation of deep learning methods in terms of transferability.
arXiv Detail & Related papers (2022-01-15T15:03:17Z) - Ten Quick Tips for Deep Learning in Biology [116.78436313026478]
Machine learning is concerned with the development and applications of algorithms that can recognize patterns in data and use them for predictive modeling.
Deep learning has become its own subfield of machine learning.
In the context of biological research, deep learning has been increasingly used to derive novel insights from high-dimensional biological data.
arXiv Detail & Related papers (2021-05-29T21:02:44Z) - A Comprehensive Survey on Community Detection with Deep Learning [93.40332347374712]
A community reveals features and connections among its members that differ from those in other communities of a network.
This survey devises and proposes a new taxonomy covering different categories of the state-of-the-art methods.
The main category, i.e., deep neural networks, is further divided into convolutional networks, graph attention networks, generative adversarial networks and autoencoders.
arXiv Detail & Related papers (2021-05-26T14:37:07Z) - A neural anisotropic view of underspecification in deep learning [60.119023683371736]
We show that the way neural networks handle the underspecification of problems is highly dependent on the data representation.
Our results highlight that understanding the architectural inductive bias in deep learning is fundamental to address the fairness, robustness, and generalization of these systems.
arXiv Detail & Related papers (2021-04-29T14:31:09Z) - An Overview of Deep Semi-Supervised Learning [8.894935073145252]
There is a rising research interest in semi-supervised learning and its applications to deep neural networks.
This paper provides a comprehensive overview of deep semi-supervised learning, starting with an introduction to the field.
arXiv Detail & Related papers (2020-06-09T14:08:03Z) - Structure preserving deep learning [1.2263454117570958]
Deep learning has risen to the foreground as a topic of massive interest.
There are multiple challenging mathematical problems involved in applying deep learning.
There is a growing effort to mathematically understand the structure of existing deep learning methods.
arXiv Detail & Related papers (2020-06-05T10:59:09Z) - A Survey of Deep Learning for Scientific Discovery [13.372738220280317]
We have seen fundamental breakthroughs in core problems in machine learning, largely driven by advances in deep neural networks.
The amount of data collected in a wide array of scientific domains is dramatically increasing in both size and complexity.
This suggests many exciting opportunities for deep learning applications in scientific settings.
arXiv Detail & Related papers (2020-03-26T06:16:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.