Dynamic Ensemble Selection Using Fuzzy Hyperboxes
- URL: http://arxiv.org/abs/2205.10438v1
- Date: Fri, 20 May 2022 21:06:46 GMT
- Title: Dynamic Ensemble Selection Using Fuzzy Hyperboxes
- Authors: Reza Davtalab, Rafael M.O. Cruz and Robert Sabourin
- Abstract summary: This paper presents a new dynamic ensemble selection (DES) framework based on fuzzy hyperboxes called FH-DES.
Each hyperbox can represent a group of samples using only two data points (Min and Max corners).
For the first time, misclassified samples are used to estimate the competence of the classifiers, an approach not taken by previous fusion methods.
- Score: 10.269997499911668
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most dynamic ensemble selection (DES) methods utilize the K-Nearest Neighbors
(KNN) algorithm to estimate the competence of classifiers in a small region
surrounding the query sample. However, KNN is very sensitive to the local
distribution of the data. Moreover, it also has a high computational cost as it
requires storing the whole data in memory and performing multiple distance
calculations during inference. Hence, the dependency on the KNN algorithm ends
up limiting the use of DES techniques for large-scale problems. This paper
presents a new DES framework based on fuzzy hyperboxes called FH-DES. Each
hyperbox can represent a group of samples using only two data points (Min and
Max corners). Thus, the hyperbox-based system will have less computational
complexity than other dynamic selection methods. In addition, unlike
KNN-based approaches, the fuzzy hyperbox is not sensitive to the local data
distribution, so the local distribution of the samples does not affect the
system's performance. Furthermore, for the first time, misclassified samples
are used to estimate the competence of the classifiers, something previous
fusion approaches have not exploited. Experimental results demonstrate that
the proposed method achieves high classification accuracy with lower
complexity than state-of-the-art dynamic selection methods. The implemented
code is available at
https://github.com/redavtalab/FH-DES_IJCNN.git.
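As a rough sketch of the data structure behind FH-DES, the snippet below implements a minimal fuzzy hyperbox that stores only Min and Max corners and exposes a Simpson-style membership function. The class and parameter names, and the exact membership form, are illustrative assumptions, not the reference implementation from the repository above.

```python
import numpy as np

class FuzzyHyperbox:
    """Minimal fuzzy hyperbox: a region stored as two corner points.

    Illustrative sketch only, not the reference FH-DES code. The
    membership follows the common Simpson-style form (an assumption
    here); gamma controls how fast membership decays outside the box.
    """

    def __init__(self, point, gamma=1.0):
        self.min = np.asarray(point, dtype=float)  # Min corner
        self.max = np.asarray(point, dtype=float)  # Max corner
        self.gamma = gamma

    def expand(self, point):
        """Grow the box so it also covers `point` (still two vectors)."""
        point = np.asarray(point, dtype=float)
        self.min = np.minimum(self.min, point)
        self.max = np.maximum(self.max, point)

    def membership(self, x):
        """Fuzzy membership in [0, 1]: 1.0 inside, decaying outside."""
        x = np.asarray(x, dtype=float)
        below = np.maximum(0.0, self.min - x)  # shortfall under Min corner
        above = np.maximum(0.0, x - self.max)  # overshoot past Max corner
        per_dim = 1.0 - self.gamma * (below + above)
        return float(np.clip(per_dim, 0.0, 1.0).mean())

# Toy usage: one hyperbox summarizing three samples.
box = FuzzyHyperbox([0.2, 0.3])
box.expand([0.5, 0.6])
box.expand([0.4, 0.1])
print(box.membership([0.3, 0.4]))  # inside -> 1.0
print(box.membership([0.9, 0.9]))  # outside -> < 1.0
```

Because a box summarizes arbitrarily many samples with two vectors, evaluating a query costs O(d) per box instead of the O(n) distance computations KNN needs.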
Related papers
- Adaptive $k$-nearest neighbor classifier based on the local estimation of the shape operator [49.87315310656657]
We introduce a new adaptive $k$-nearest neighbours ($kK$-NN) algorithm that explores the local curvature at a sample to adaptively define the neighborhood size.
Results on many real-world datasets indicate that the new $kK$-NN algorithm yields superior balanced accuracy compared to the established $k$-NN method.
arXiv Detail & Related papers (2024-09-08T13:08:45Z)
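To make the entry above concrete, here is a toy adaptive k-NN in which the neighborhood size shrinks where local distances vary strongly. The paper instead estimates the shape operator (local curvature), so the adaptation rule below is purely an illustrative assumption.

```python
import numpy as np

def adaptive_knn_predict(X_train, y_train, x, k_min=3, k_max=15):
    """Toy adaptive k-NN: pick k per query from the local distance spread.

    Illustrative only: the cited paper adapts k via the local shape
    operator (curvature); a simple spread heuristic stands in here.
    """
    d = np.linalg.norm(X_train - x, axis=1)
    order = np.argsort(d)
    near = d[order[:k_max]]
    # High relative spread -> complex local geometry -> smaller k.
    spread = near.std() / (near.mean() + 1e-12)
    k = int(np.clip(round(k_max * (1.0 - spread)), k_min, k_max))
    votes = y_train[order[:k]]
    return np.bincount(votes).argmax()

# Toy usage with two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(adaptive_knn_predict(X, y, np.array([3.5, 3.5])))
```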
- MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence [97.93517982908007]
In cross-domain few-shot classification, the nearest centroid classifier (NCC) aims to learn representations to construct a metric space where few-shot classification can be performed.
In this paper, we find that there exist high similarities between NCC-learned representations of two samples from different classes.
We propose a bi-level optimization framework, maximizing optimized kernel dependence (MOKD), to learn a set of class-specific representations that match the cluster structures indicated by labeled data.
arXiv Detail & Related papers (2024-05-29T05:59:52Z)
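The dependence measure underlying this family of methods is HSIC; as a hedged sketch, the snippet below computes the standard biased HSIC estimate with Gaussian kernels. The bi-level optimization of MOKD itself is not reproduced, and the kernel choice is an assumption.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel matrix."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased HSIC estimate: trace(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    K, L = gaussian_kernel(X, sigma), gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2

# Dependent variables give a larger HSIC than independent ones.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
print(hsic(X, X + 0.1 * rng.normal(size=(100, 2))))  # high
print(hsic(X, rng.normal(size=(100, 2))))            # near zero
```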
- GAN Based Boundary Aware Classifier for Detecting Out-of-distribution Samples [24.572516991009323]
We propose a GAN based boundary aware classifier (GBAC) for generating a closed hyperspace that contains most of the in-distribution (ID) data.
Our method is based on the fact that a traditional neural network separates the feature space into several unclosed regions, which are not suitable for out-of-distribution (OOD) detection.
With GBAC as an auxiliary module, OOD data falling outside the closed hyperspace are assigned much lower scores, allowing more effective OOD detection.
arXiv Detail & Related papers (2021-12-22T03:35:54Z)
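Setting the GAN machinery aside, the "closed hyperspace" scoring idea above can be illustrated with a toy closed region: a ball fit to ID data, with scores decaying outside it. Everything below is a hypothetical stand-in, not GBAC itself.

```python
import numpy as np

def fit_closed_region(X_id):
    """Fit a closed ball around ID data (toy stand-in for GBAC)."""
    center = X_id.mean(axis=0)
    radius = np.percentile(np.linalg.norm(X_id - center, axis=1), 95)
    return center, radius

def ood_score(x, center, radius):
    """Score in (0, 1]: ~1 inside the closed region, small far outside."""
    return float(np.exp(-max(0.0, np.linalg.norm(x - center) - radius)))

rng = np.random.default_rng(0)
X_id = rng.normal(0, 1, (500, 2))
c, r = fit_closed_region(X_id)
print(ood_score(np.array([0.5, 0.5]), c, r))  # ID-like -> close to 1
print(ood_score(np.array([8.0, 8.0]), c, r))  # OOD -> near 0
```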
- Cluster Representatives Selection in Non-Metric Spaces for Nearest Prototype Classification [4.176752121302988]
In this paper, we present CRS, a novel method for selecting a small yet representative subset of objects as a cluster prototype.
Memory and computationally efficient selection of representatives is enabled by leveraging the similarity graph representation of each cluster created by the NN-Descent algorithm.
CRS can be used in an arbitrary metric or non-metric space because of the graph-based approach, which requires only a pairwise similarity measure.
arXiv Detail & Related papers (2021-07-03T04:51:07Z)
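As a toy version of the entry above, the sketch below builds a k-NN similarity graph for a cluster (standing in for NN-Descent) and greedily selects representatives until every object is adjacent to one. The greedy cover rule is an assumption; note that only a pairwise similarity matrix is required.

```python
import numpy as np

def select_representatives(S, k=5):
    """Greedy cover on a similarity graph (toy stand-in for CRS).

    S: pairwise similarity matrix for one cluster. Each node links to
    its k most similar neighbors; representatives are chosen so every
    node is covered by (adjacent to) at least one of them.
    """
    n = S.shape[0]
    nbrs = [set(np.argsort(-S[i])[1:k + 1]) | {i} for i in range(n)]
    uncovered, reps = set(range(n)), []
    while uncovered:
        # Pick the node covering the most still-uncovered objects.
        best = max(uncovered, key=lambda i: len(nbrs[i] & uncovered))
        reps.append(best)
        uncovered -= nbrs[best]
    return reps

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
S = -np.linalg.norm(X[:, None] - X[None, :], axis=2)  # similarity = -distance
print(select_representatives(S))
```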
- Hyperdimensional Computing for Efficient Distributed Classification with Randomized Neural Networks [5.942847925681103]
We study distributed classification, which can be employed in situations where data cannot be stored at a central location nor shared.
We propose a more efficient solution for distributed classification by making use of a lossy compression approach applied when sharing the local classifiers with other agents.
arXiv Detail & Related papers (2021-06-02T01:33:56Z)
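A hedged sketch of the lossy-sharing idea above: each agent fits a local linear readout, sign-binarizes its weights before transmission (one possible lossy compression, assumed here for illustration), and the receiver averages the decompressed models.

```python
import numpy as np

def train_local(X, y, lam=1e-2):
    """Ridge-regression readout, a common choice with randomized networks."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def compress(w):
    """Lossy compression: keep only the sign pattern and one scale."""
    return np.sign(w), float(np.abs(w).mean())

def decompress(sign, scale):
    return sign * scale

# Two agents with private data; only compressed classifiers are shared.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(100, 8)), rng.normal(size=(100, 8))
w_true = rng.normal(size=8)
y1, y2 = X1 @ w_true, X2 @ w_true
shared = [compress(train_local(X, y)) for X, y in [(X1, y1), (X2, y2)]]
w_merged = np.mean([decompress(s, a) for s, a in shared], axis=0)
print(np.corrcoef(w_merged, w_true)[0, 1])  # signs preserve the direction
```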
- Coded Stochastic ADMM for Decentralized Consensus Optimization with Edge Computing [113.52575069030192]
Big data, including applications with high security requirements, are often collected and stored on multiple heterogeneous devices, such as mobile devices, drones and vehicles.
Due to the limitations of communication costs and security requirements, it is of paramount importance to extract information in a decentralized manner instead of aggregating data to a fusion center.
We consider the problem of learning model parameters in a multi-agent system with data locally processed via distributed edge nodes.
A class of mini-batch alternating direction method of multipliers (ADMM) algorithms is explored to develop the distributed learning model.
arXiv Detail & Related papers (2020-10-02T10:41:59Z)
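For flavor, here is a textbook global-consensus ADMM for distributed least squares; the coded and stochastic elements that are the paper's actual contribution are not included.

```python
import numpy as np

def consensus_admm(As, bs, rho=1.0, iters=50):
    """Global-consensus ADMM: min sum_i 0.5*||A_i x - b_i||^2, x shared.

    Each agent i keeps a local x_i and dual u_i; only (x_i + u_i) is
    aggregated, mirroring decentralized information extraction.
    """
    d = As[0].shape[1]
    xs = [np.zeros(d) for _ in As]
    us = [np.zeros(d) for _ in As]
    z = np.zeros(d)
    for _ in range(iters):
        for i, (A, b) in enumerate(zip(As, bs)):
            # x_i-update: a local ridge-like solve.
            xs[i] = np.linalg.solve(A.T @ A + rho * np.eye(d),
                                    A.T @ b + rho * (z - us[i]))
        z = np.mean([x + u for x, u in zip(xs, us)], axis=0)  # z-update
        for i in range(len(As)):
            us[i] += xs[i] - z  # dual update
    return z

rng = np.random.default_rng(0)
w = rng.normal(size=5)
As = [rng.normal(size=(40, 5)) for _ in range(3)]
bs = [A @ w for A in As]
print(np.linalg.norm(consensus_admm(As, bs) - w))  # small residual
```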
- Distributionally Robust Weighted $k$-Nearest Neighbors [21.537952410507483]
Learning a robust classifier from a few samples remains a key challenge in machine learning.
In this paper, we study a minimax distributionally robust formulation of weighted $k$-nearest neighbors.
We develop an algorithm, Dr.k-NN, that efficiently solves this functional optimization problem.
arXiv Detail & Related papers (2020-06-07T00:34:33Z)
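The baseline being robustified in the entry above is weighted k-NN; a minimal inverse-distance-weighted version is sketched below, without the minimax reweighting that defines Dr.k-NN.

```python
import numpy as np

def weighted_knn(X_train, y_train, x, k=5):
    """Inverse-distance-weighted k-NN vote (plain baseline, not Dr.k-NN)."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-12)  # closer neighbors vote more
    classes = np.unique(y_train)
    scores = [w[y_train[idx] == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
print(weighted_knn(X, y, np.array([2.5, 2.5])))
```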
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
- Computationally efficient sparse clustering [67.95910835079825]
We provide a finite sample analysis of a new clustering algorithm based on PCA.
We show that it achieves the minimax optimal misclustering rate in the regime $\|\theta\| \to \infty$.
arXiv Detail & Related papers (2020-05-21T17:51:30Z)
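In the spirit of the entry above, here is a toy PCA-based clustering: project onto the top principal component and split by sign. The analyzed algorithm's details and its minimax guarantee are not reproduced.

```python
import numpy as np

def pca_cluster(X, n_components=1):
    """Cluster by the sign of the top principal component score (toy sketch)."""
    Xc = X - X.mean(axis=0)
    # Rows of Vt are the principal directions of the centered data.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    return (scores[:, 0] > 0).astype(int)  # two clusters from the 1st PC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 3)), rng.normal(2, 1, (50, 3))])
labels = pca_cluster(X)
print(labels[:5], labels[-5:])  # the two blobs receive different labels
```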
- OSLNet: Deep Small-Sample Classification with an Orthogonal Softmax Layer [77.90012156266324]
This paper aims to find a subspace of neural networks that can facilitate a large decision margin.
We propose the Orthogonal Softmax Layer (OSL), which makes the weight vectors in the classification layer remain orthogonal during both the training and test processes.
Experimental results demonstrate that the proposed OSL has better performance than the methods used for comparison on four small-sample benchmark datasets.
arXiv Detail & Related papers (2020-04-20T02:41:01Z)
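A hedged sketch of the orthogonality idea above: keep the classification-layer weight vectors orthonormal by re-orthogonalizing with QR after each update. The paper's actual mechanism differs; this is only one way to maintain the constraint.

```python
import numpy as np

def orthogonalize_rows(W):
    """Project the weight rows back onto orthonormal rows via QR."""
    Q, _ = np.linalg.qr(W.T)  # columns of Q are orthonormal
    return Q.T[: W.shape[0]]

# Toy training step for a linear classification layer (4 classes, 16 dims).
rng = np.random.default_rng(0)
W = orthogonalize_rows(rng.normal(size=(4, 16)))
grad = rng.normal(size=(4, 16))         # stand-in gradient
W = orthogonalize_rows(W - 0.1 * grad)  # step, then restore orthogonality
print(np.round(W @ W.T, 6))             # ~ identity: rows stay orthonormal
```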
- A new hashing based nearest neighbors selection technique for big datasets [14.962398031252063]
This paper proposes a new technique that enables the selection of nearest neighbors directly in the neighborhood of a given observation.
The proposed approach divides the data space into the subcells of a virtual grid built on top of it.
Our algorithm outperforms the original KNN in time efficiency with a prediction quality as good as that of KNN.
arXiv Detail & Related papers (2020-04-05T19:36:00Z)
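A minimal sketch of the virtual-grid idea above: hash each point into a grid cell, then answer neighbor queries from the query's cell and its adjacent cells only. The cell size and the 3^d adjacency scheme are illustrative assumptions.

```python
import numpy as np
from collections import defaultdict
from itertools import product

def build_grid(X, cell=1.0):
    """Hash points into virtual grid cells keyed by integer coordinates."""
    grid = defaultdict(list)
    for i, x in enumerate(X):
        grid[tuple((x // cell).astype(int))].append(i)
    return grid

def grid_neighbors(grid, X, q, cell=1.0, k=5):
    """k-NN among points in the query's cell and its adjacent cells only."""
    key = tuple((q // cell).astype(int))
    cand = []
    for off in product((-1, 0, 1), repeat=len(key)):  # 3^d adjacent cells
        cand += grid.get(tuple(np.add(key, off)), [])
    cand = np.array(cand)
    order = np.argsort(np.linalg.norm(X[cand] - q, axis=1))
    return cand[order[:k]]

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (1000, 2))
grid = build_grid(X)
print(grid_neighbors(grid, X, np.array([5.0, 5.0])))
```

Distance computations are restricted to a handful of cells, which is what buys the time efficiency over a full KNN scan.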
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.