Flexible K Nearest Neighbors Classifier: Derivation and Application for
Ion-mobility Spectrometry-based Indoor Localization
- URL: http://arxiv.org/abs/2304.10151v3
- Date: Wed, 13 Mar 2024 08:14:02 GMT
- Title: Flexible K Nearest Neighbors Classifier: Derivation and Application for
Ion-mobility Spectrometry-based Indoor Localization
- Authors: Philipp Müller
- Abstract summary: The K Nearest Neighbors (KNN) is widely used in many fields such as fingerprint-based localization or medicine.
In this paper a KNN-variant is discussed which ensures that the K nearest neighbors are indeed close to the unlabelled sample.
It achieves a higher classification accuracy than the KNN in the tests, while having the same computational demand.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The K Nearest Neighbors (KNN) classifier is widely used in many fields such
as fingerprint-based localization or medicine. It determines the class
membership of an unlabelled sample based on the class memberships of the K
labelled samples, the so-called nearest neighbors, that are closest to the
unlabelled sample. The choice of K has been the topic of various studies and
proposed KNN-variants, yet none has been proven to outperform all others. In
this paper a KNN-variant is discussed which ensures that the K nearest
neighbors are indeed close to the unlabelled sample and finds K along
the way. The algorithm is tested and compared to the standard KNN in
theoretical scenarios and for indoor localization based on ion-mobility
spectrometry fingerprints. It achieves a higher classification accuracy than
the KNN in the tests, while having the same computational demand.
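The abstract leaves the selection rule unspecified; the sketch below is one plausible reading, in which only training samples within a distance threshold of the query are accepted as neighbors, so that K emerges from the data rather than being fixed in advance. The function name, the radius parameter, and the fallback to the single nearest neighbor are illustrative assumptions, not the paper's exact method.

```python
import numpy as np
from collections import Counter

def flexible_knn_predict(X_train, y_train, x_query, radius, k_max=None):
    """Classify x_query from the labelled samples that are genuinely close.

    Instead of fixing K, accept every training sample within `radius` of
    the query (optionally capped at k_max) and take a majority vote over
    their labels. Falls back to the single nearest neighbor if no sample
    lies within the radius. Illustrative sketch, not the paper's exact rule.
    """
    dists = np.linalg.norm(X_train - x_query, axis=1)
    order = np.argsort(dists)
    close = order[dists[order] <= radius]   # neighbors that are "indeed close"
    if k_max is not None:
        close = close[:k_max]
    if close.size == 0:                     # no neighbor within the radius:
        close = order[:1]                   # fall back to the nearest one
    votes = Counter(y_train[i] for i in close)
    return votes.most_common(1)[0][0]

# Usage: two 2-D classes; K is whatever falls inside the radius.
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [3.0, 3.0], [3.1, 2.9]])
y = np.array([0, 0, 0, 1, 1])
print(flexible_knn_predict(X, y, np.array([0.05, 0.1]), radius=0.5))  # -> 0
```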
Related papers
- Adaptive $k$-nearest neighbor classifier based on the local estimation of the shape operator
We introduce a new adaptive $k$-nearest neighbours ($kK$-NN) algorithm that explores the local curvature at a sample to adaptively define the neighborhood size.
Results on many real-world datasets indicate that the new $kK$-NN algorithm yields superior balanced accuracy compared to the established $k$-NN method.
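The summary gives only the high-level idea; as a loose illustration of curvature-adaptive neighborhood sizing, the sketch below uses the residual variance of a local PCA plane as a crude curvature proxy and shrinks k where the data is more curved. This proxy and the mapping to k are guesses, not the paper's shape-operator estimate.

```python
import numpy as np

def curvature_adaptive_k(X_train, x, k_min=3, k_max=15, m=10):
    """Map a crude local-curvature estimate at x to a neighbourhood size:
    flatter regions get a larger k, curved regions a smaller k. The
    curvature proxy (residual variance of a local PCA plane) and the
    linear mapping are illustrative, not the paper's shape operator.
    """
    d = np.linalg.norm(X_train - x, axis=1)
    local = X_train[np.argsort(d)[:m]]          # m nearest points around x
    centered = local - local.mean(axis=0)
    svals = np.linalg.svd(centered, compute_uv=False)
    residual = svals[-1] ** 2 / max(np.sum(svals ** 2), 1e-12)
    # High residual => high curvature => shrink the neighbourhood.
    return int(round(k_max - (k_max - k_min) * min(residual * 10, 1.0)))
```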
arXiv Detail & Related papers (2024-09-08T13:08:45Z) - A Novel Pseudo Nearest Neighbor Classification Method Using Local Harmonic Mean Distance
This article introduces a novel KNN-based classification method called LMPHNN.
LMPHNN improves classification performance by combining local mean pseudo nearest neighbor (LMPNN) rules with the harmonic mean distance (HMD).
It achieves an average precision of 97%, surpassing other methods by 14%.
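The summary names the ingredients without defining them; the sketch below illustrates how a harmonic mean distance can score each class, assigning the query to the class whose k nearest members have the smallest HMD. The function names and the per-class scoring scheme are assumptions, not LMPHNN itself.

```python
import numpy as np

def hmd_class_score(X_class, x_query, k=5):
    """Harmonic mean of the distances from x_query to its k nearest
    neighbors within one class. The harmonic mean is dominated by the
    smallest distances, so a single very close neighbor pulls the
    score down strongly. Illustrative sketch, not LMPHNN itself.
    """
    d = np.sort(np.linalg.norm(X_class - x_query, axis=1))[:k]
    d = np.maximum(d, 1e-12)            # guard against zero distances
    return len(d) / np.sum(1.0 / d)

def hmd_classify(X_train, y_train, x_query, k=5):
    # Assign the class whose k nearest members have the smallest HMD.
    labels = np.unique(y_train)
    scores = [hmd_class_score(X_train[y_train == c], x_query, k) for c in labels]
    return labels[int(np.argmin(scores))]
```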
arXiv Detail & Related papers (2024-05-10T04:13:07Z) - Information Modified K-Nearest Neighbor
K-Nearest Neighbors (KNN) classifies a sample by a majority vote among its nearest neighbors.
Many KNN methodologies introduce complex algorithms that do not significantly outperform the traditional KNN.
We propose an information-modified KNN (IMKNN) to improve the performance of the KNN algorithm.
We conduct experiments on 12 widely-used datasets, achieving gains of 11.05%, 12.42%, and 12.07% in accuracy, precision, and recall, respectively.
arXiv Detail & Related papers (2023-12-04T16:10:34Z) - Rethinking k-means from manifold learning perspective
We present a new clustering algorithm which directly detects clusters of data without mean estimation.
Specifically, we construct the distance matrix between data points using a Butterworth filter.
To well exploit the complementary information embedded in different views, we leverage the tensor Schatten p-norm regularization.
arXiv Detail & Related papers (2023-05-12T03:01:41Z) - Parametric Classification for Generalized Category Discovery: A Baseline
Study
Generalized Category Discovery (GCD) aims to discover novel categories in unlabelled datasets using knowledge learned from labelled samples.
We investigate the failure of parametric classifiers, verify the effectiveness of previous design choices when high-quality supervision is available, and identify unreliable pseudo-labels as a key problem.
We propose a simple yet effective parametric classification method that benefits from entropy regularisation, achieves state-of-the-art performance on multiple GCD benchmarks and shows strong robustness to unknown class numbers.
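As a rough illustration of entropy regularisation in this setting, the sketch below adds a term that maximizes the entropy of the batch-mean prediction, discouraging the classifier from collapsing onto a few (labelled) classes. The weight `lam` and the exact form of the regulariser are assumptions; the paper's loss may differ.

```python
import numpy as np

def entropy_regularised_loss(probs, targets, lam=1.0):
    """Cross-entropy on labelled samples plus a mean-entropy regulariser.

    probs:   (n, c) predicted class probabilities for a batch
    targets: (n,)   integer labels for the labelled part of the batch
    The regulariser maximizes the entropy of the batch-mean prediction,
    discouraging collapse onto a few classes. Illustrative sketch only.
    """
    eps = 1e-12
    ce = -np.mean(np.log(probs[np.arange(len(targets)), targets] + eps))
    mean_pred = probs.mean(axis=0)                       # average over batch
    mean_entropy = -np.sum(mean_pred * np.log(mean_pred + eps))
    return ce - lam * mean_entropy                       # maximize entropy
```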
arXiv Detail & Related papers (2022-11-21T18:47:11Z) - Local Sample-weighted Multiple Kernel Clustering with Consensus
Discriminative Graph
Multiple kernel clustering (MKC) is committed to achieving optimal information fusion from a set of base kernels.
This paper proposes a novel local sample-weighted multiple kernel clustering model.
Experimental results demonstrate that our LSWMKC possesses better local manifold representation and outperforms existing kernel or graph-based clustering algorithms.
arXiv Detail & Related papers (2022-07-05T05:00:38Z) - A k nearest neighbours classifiers ensemble based on extended
neighbourhood rule and features subsets
kNN based ensemble methods minimise the effect of outliers by identifying a set of data points in the given feature space that are nearest to an unseen observation.
This paper proposes a k nearest neighbour ensemble where the neighbours are determined in k steps.
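One plausible reading of determining neighbours "in k steps", sketched below with illustrative names, is a chain: the first neighbour is the point nearest to the query, the second is the point nearest to the first, and so on for k steps.

```python
import numpy as np

def extended_neighbourhood(X_train, x_query, k=5):
    """Pick k neighbours in a chain: the first is nearest to the query,
    each subsequent one is nearest to the previously chosen point.
    Returns the indices of the chosen training samples. Sketch of the
    stepwise rule as described in the abstract, not the exact ensemble.
    """
    chosen = []
    anchor = x_query
    remaining = list(range(len(X_train)))
    for _ in range(k):
        d = np.linalg.norm(X_train[remaining] - anchor, axis=1)
        idx = remaining.pop(int(np.argmin(d)))
        chosen.append(idx)
        anchor = X_train[idx]          # next step starts from this point
    return chosen
```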
arXiv Detail & Related papers (2022-05-30T13:57:32Z) - Dynamic Ensemble Selection Using Fuzzy Hyperboxes
This paper presents a new dynamic ensemble selection (DES) framework based on fuzzy hyperboxes called FH-DES.
Each hyperbox can represent a group of samples using only two data points (its Min and Max corners).
Misclassified samples are used to estimate the competence of the classifiers, an approach not observed in previous fusion frameworks.
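As a rough illustration of the hyperbox idea, a box stored as its Min and Max corners can test containment with two vector comparisons; the fuzzy membership below, which decays with the distance to the box, is a generic choice and not necessarily the form used in FH-DES.

```python
import numpy as np

class Hyperbox:
    """A hyperbox stored as its Min and Max corners; membership decays
    with distance to the box. Generic sketch, not FH-DES's exact form."""
    def __init__(self, points):
        self.vmin = points.min(axis=0)   # Min corner of the samples
        self.vmax = points.max(axis=0)   # Max corner of the samples

    def membership(self, x, gamma=4.0):
        # Distance from x to the box (zero if x lies inside it).
        outside = np.maximum(self.vmin - x, 0) + np.maximum(x - self.vmax, 0)
        return np.exp(-gamma * np.linalg.norm(outside))

box = Hyperbox(np.array([[0.0, 0.0], [1.0, 2.0]]))
print(box.membership(np.array([0.5, 1.0])))  # inside -> 1.0
print(box.membership(np.array([2.0, 3.0])))  # outside -> < 1.0
```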
arXiv Detail & Related papers (2022-05-20T21:06:46Z) - Rethinking Nearest Neighbors for Visual Classification
k-NN is a lazy learning method that aggregates the distances between the test image and its top-k neighbors in the training set.
We adopt k-NN with pre-trained visual representations produced by either supervised or self-supervised methods in two steps.
Via extensive experiments on a wide range of classification tasks, our study reveals the generality and flexibility of k-NN integration.
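A minimal version of this recipe, with the feature extractor left abstract, classifies an image by a cosine-similarity k-NN vote over pre-computed training embeddings; the helper names below are assumptions, not the paper's pipeline.

```python
import numpy as np
from collections import Counter

def knn_on_embeddings(train_feats, train_labels, query_feat, k=10):
    """k-NN vote over pre-trained feature vectors using cosine similarity.

    train_feats: (n, d) embeddings from any supervised or self-supervised
    backbone (computed elsewhere); query_feat: (d,) embedding of the test
    image. Sketch of the general recipe, not the paper's exact pipeline.
    """
    tf = train_feats / np.linalg.norm(train_feats, axis=1, keepdims=True)
    q = query_feat / np.linalg.norm(query_feat)
    sims = tf @ q                          # cosine similarity to all samples
    top = np.argsort(-sims)[:k]            # indices of the k most similar
    return Counter(train_labels[i] for i in top).most_common(1)[0][0]
```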
arXiv Detail & Related papers (2021-12-15T20:15:01Z) - Adaptive Nearest Neighbor Machine Translation
kNN-MT combines pre-trained neural machine translation with token-level k-nearest-neighbor retrieval.
The traditional kNN algorithm simply retrieves the same number of nearest neighbors for each target token.
We propose Adaptive kNN-MT to dynamically determine the value of k for each target token.
arXiv Detail & Related papers (2021-05-27T09:27:42Z) - KNN Classification with One-step Computation
A one-step computation is proposed to replace the lazy part of KNN classification.
The proposed approach is evaluated experimentally, and the results demonstrate that one-step KNN classification is efficient and promising.
arXiv Detail & Related papers (2020-12-09T13:34:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.