Hierarchical Pruning of Deep Ensembles with Focal Diversity
- URL: http://arxiv.org/abs/2311.10293v1
- Date: Fri, 17 Nov 2023 02:48:20 GMT
- Title: Hierarchical Pruning of Deep Ensembles with Focal Diversity
- Authors: Yanzhao Wu, Ka-Ho Chow, Wenqi Wei, Ling Liu
- Abstract summary: Deep neural network ensembles combine the wisdom of multiple deep neural networks to improve the generalizability and robustness over individual networks.
Some mission-critical applications utilize a large number of deep neural networks to form deep ensembles to achieve desired accuracy and resilience.
This paper presents a novel deep ensemble pruning approach, which can efficiently identify smaller deep ensembles.
- Score: 17.127312781074245
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural network ensembles combine the wisdom of multiple deep neural
networks to improve the generalizability and robustness over individual
networks. Studying deep ensemble techniques has gained increasing popularity
in the deep learning community. Some mission-critical applications utilize a
large number of deep neural networks to form deep ensembles to achieve the
desired accuracy and resilience, which introduces high time and space costs
for ensemble execution. However, it remains a critical challenge to determine
whether a small subset of the entire deep ensemble can achieve the same or
better generalizability, and how to effectively identify such small deep
ensembles to improve the space and time efficiency of ensemble execution. This
paper
presents a novel deep ensemble pruning approach, which can efficiently identify
smaller deep ensembles and provide higher ensemble accuracy than the entire
deep ensemble of a large number of member networks. Our hierarchical ensemble
pruning approach (HQ) leverages three novel ensemble pruning techniques. First,
we show that the focal diversity metrics can accurately capture the
complementary capacity of the member networks of an ensemble, which can guide
ensemble pruning. Second, we design a focal diversity based hierarchical
pruning approach, which iteratively finds high-quality deep ensembles at low
cost and with high accuracy. Third, we develop a focal diversity consensus
method that integrates multiple focal diversity metrics to refine ensemble
pruning results, so that smaller deep ensembles offering high accuracy, high
robustness and high efficiency can be effectively identified. Evaluated using
popular
benchmark datasets, we demonstrate that the proposed hierarchical ensemble
pruning approach can effectively identify high quality deep ensembles with
better generalizability while being more time and space efficient in ensemble
decision making.
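To make the pruning recipe above concrete, here is a minimal sketch (an illustrative reading, not the authors' released implementation): each member takes a turn as the focal network, a generalized-diversity-style focal metric scores how independently the remaining members fail on the focal network's errors, and the rankings produced by several such metrics are merged into a consensus choice. The names `focal_diversity` and `consensus_prune` and the brute-force enumeration of candidate sub-ensembles are assumptions made for clarity; the paper's hierarchical procedure instead narrows the candidate space iteratively.

```python
import numpy as np
from itertools import combinations

def focal_diversity(correct, focal, members):
    """Generalized-diversity-style score conditioned on the focal model's errors.

    correct: bool array [n_models, n_samples], True where a model predicts
    correctly. Returns a value in [0, 1]; higher means the remaining members
    fail more independently on the samples the focal model gets wrong.
    """
    others = [m for m in members if m != focal]
    focal_errs = ~correct[focal]                       # samples the focal model misses
    if focal_errs.sum() == 0:
        return 1.0
    # how many of the other members also fail on each focal-error sample
    fail_counts = (~correct[others])[:, focal_errs].sum(axis=0)
    M = len(others)
    p = np.bincount(fail_counts, minlength=M + 1) / focal_errs.sum()
    i = np.arange(M + 1)
    p1 = np.sum(i / M * p)                             # expected fraction failing
    p2 = np.sum(i * (i - 1) * p) / (M * (M - 1)) if M > 1 else 0.0
    return 1.0 - (p2 / p1 if p1 > 0 else 0.0)

def consensus_prune(correct, target_size, metrics):
    """Pick the sub-ensemble whose mean rank across all diversity metrics is best."""
    candidates = list(combinations(range(correct.shape[0]), target_size))
    rank_sum = np.zeros(len(candidates))
    for metric in metrics:
        scores = [np.mean([metric(correct, f, S) for f in S]) for S in candidates]
        rank_sum += np.argsort(np.argsort(scores))     # rank 0 = least diverse
    return candidates[int(np.argmax(rank_sum))]        # best consensus rank

# Toy usage: 10 member networks, 1000 validation samples, keep 3 members.
rng = np.random.default_rng(0)
correct = rng.random((10, 1000)) > 0.2
print(consensus_prune(correct, target_size=3, metrics=[focal_diversity]))
```

Note that this sketch ranks candidates by diversity alone; a faithful implementation would also account for member and ensemble accuracy, which the diversity-only ranking above omits for brevity.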
Related papers
- Diversified Ensemble of Independent Sub-Networks for Robust Self-Supervised Representation Learning [10.784911682565879]
Ensembling a neural network is a widely recognized approach to enhance model performance, estimate uncertainty, and improve robustness in deep supervised learning.
We present a novel self-supervised training regime that leverages an ensemble of independent sub-networks.
Our method efficiently builds a sub-model ensemble with high diversity, leading to well-calibrated estimates of model uncertainty.
arXiv Detail & Related papers (2023-08-28T16:58:44Z)
- Layer Ensembles [95.42181254494287]
We introduce a method for uncertainty estimation that considers a set of independent categorical distributions for each layer of the network.
We show that the method can be further improved by ranking samples, resulting in models that require less memory and time to run.
arXiv Detail & Related papers (2022-10-10T17:52:47Z)
- Multi-scale Matching Networks for Semantic Correspondence [38.904735120815346]
The proposed method achieves state-of-the-art performance on three popular benchmarks with high computational efficiency.
Our multi-scale matching network can be trained end-to-end easily with few additional learnable parameters.
arXiv Detail & Related papers (2021-07-31T10:57:24Z)
- Multi-headed Neural Ensemble Search [68.10888689513583]
Ensembles of CNN models trained with different seeds (also known as Deep Ensembles) are known to achieve superior performance over a single copy of the CNN.
We extend NES to multi-headed ensembles, which consist of a shared backbone attached to multiple prediction heads (a minimal sketch of this layout follows at the end of this list).
arXiv Detail & Related papers (2021-07-09T11:20:48Z)
- Multi-Level Attentive Convolutional Neural Network for Crowd Counting [12.61997540961144]
We propose a multi-level attentive Convolutional Neural Network (MLAttnCNN) for crowd counting.
We extract high-level contextual information with multiple different scales applied in pooling.
We use multi-level attention modules to enrich the characteristics at different layers to achieve more efficient multi-scale feature fusion.
arXiv Detail & Related papers (2021-05-24T17:29:00Z)
- Deep Convolutional Neural Network Ensembles using ECOC [23.29970325359036]
We analyse the error correcting output coding (ECOC) framework as an ensemble technique for deep networks.
We propose different design strategies to address the accuracy-complexity trade-off.
arXiv Detail & Related papers (2020-09-07T09:20:24Z)
- Recursive Multi-model Complementary Deep Fusion for Robust Salient Object Detection via Parallel Sub Networks [62.26677215668959]
Fully convolutional networks have shown outstanding performance in the salient object detection (SOD) field.
This paper proposes a "wider" network architecture which consists of parallel sub-networks with totally different network architectures.
Experiments on several famous benchmarks clearly demonstrate the superior performance, good generalization, and powerful learning ability of the proposed wider framework.
arXiv Detail & Related papers (2020-08-07T10:39:11Z)
- ESPN: Extremely Sparse Pruned Networks [50.436905934791035]
We show that a simple iterative mask discovery method can achieve state-of-the-art compression of very deep networks.
Our algorithm represents a hybrid approach between single shot network pruning methods and Lottery-Ticket type approaches.
arXiv Detail & Related papers (2020-06-28T23:09:27Z)
- Neural Ensemble Search for Uncertainty Estimation and Dataset Shift [67.57720300323928]
Ensembles of neural networks achieve superior performance compared to stand-alone networks in terms of accuracy, uncertainty calibration and robustness to dataset shift.
We propose two methods for automatically constructing ensembles with varying architectures.
We show that the resulting ensembles outperform deep ensembles not only in terms of accuracy but also uncertainty calibration and robustness to dataset shift.
arXiv Detail & Related papers (2020-06-15T17:38:15Z)
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
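As a companion to the Multi-headed Neural Ensemble Search entry above, the shared-backbone-plus-heads layout it describes can be sketched in a few lines of PyTorch. The backbone, layer widths and head count below are placeholders chosen for illustration, not architectures from that paper's search space.

```python
import torch
import torch.nn as nn

class MultiHeadedEnsemble(nn.Module):
    """A shared feature backbone feeding K independent prediction heads."""

    def __init__(self, num_classes=10, num_heads=3, width=128):
        super().__init__()
        # placeholder backbone; NES would search this part
        self.backbone = nn.Sequential(
            nn.Conv2d(3, width, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.heads = nn.ModuleList(
            [nn.Linear(width, num_classes) for _ in range(num_heads)]
        )

    def forward(self, x):
        feats = self.backbone(x)                 # computed once, shared by all heads
        # ensemble decision: average the per-head class probabilities
        probs = torch.stack([h(feats).softmax(-1) for h in self.heads])
        return probs.mean(dim=0)

model = MultiHeadedEnsemble()
out = model(torch.randn(4, 3, 32, 32))           # shape [4, 10]
```

The appeal of this layout is that the expensive backbone forward pass is amortized across all heads, so ensemble-style predictions cost little more than a single network.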