k-NNN: Nearest Neighbors of Neighbors for Anomaly Detection
- URL: http://arxiv.org/abs/2305.17695v1
- Date: Sun, 28 May 2023 11:39:51 GMT
- Title: k-NNN: Nearest Neighbors of Neighbors for Anomaly Detection
- Authors: Ori Nizan, Ayellet Tal
- Abstract summary: Anomaly detection aims at identifying images that deviate significantly from the norm.
We propose a new operator that takes into account the varying structure and importance of the features in the embedding space.
We show that by simply replacing the nearest-neighbor component in existing algorithms with our k-NNN operator, while leaving the rest of each algorithm untouched, each algorithm's own results are improved.
- Score: 20.204147875108976
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Anomaly detection aims at identifying images that deviate significantly from
the norm. We focus on algorithms that embed the normal training examples in
space and, when given a test image, detect anomalies based on the features'
distance to the k nearest training neighbors. We propose a new operator that
takes into account the varying structure and importance of the features in the
embedding space. Interestingly, this is done by taking into account not only
the nearest neighbors, but also the neighbors of these neighbors (k-NNN). We
show that by simply replacing the nearest-neighbor component in existing
algorithms with our k-NNN operator, while leaving the rest of the algorithms
untouched, each algorithm's own results are improved. This is the case both for
common homogeneous datasets, such as flowers or nuts of a specific type, as
well as for more diverse datasets.
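The mechanism the abstract describes — scoring a test point by its nearest training neighbors and by those neighbors' own neighbors — can be sketched in plain Python. This is a minimal illustration only: the paper's actual operator also weights features by the local structure of the embedding space, which is omitted here, and the `knn_score` baseline and point data are illustrative.

```python
from math import dist

def knn_score(test_point, train, k=3):
    """Baseline: mean distance to the k nearest training neighbors."""
    dists = sorted(dist(test_point, p) for p in train)
    return sum(dists[:k]) / k

def knnn_score(test_point, train, k=3):
    """Sketch of the k-NNN idea: expand the neighbor set with each
    neighbor's own k nearest training neighbors before scoring.
    (The paper's operator additionally weights directions in the
    embedding space; this plain union is a simplification.)"""
    order = sorted(range(len(train)), key=lambda i: dist(test_point, train[i]))
    neighborhood = set(order[:k])
    for i in order[:k]:
        # the k nearest training neighbors of neighbor i (excluding itself)
        nn = sorted((j for j in range(len(train)) if j != i),
                    key=lambda j: dist(train[i], train[j]))[:k]
        neighborhood.update(nn)
    d = [dist(test_point, train[j]) for j in neighborhood]
    return sum(d) / len(d)

# Toy embedding: a point inside the normal cluster vs. a far-away one.
train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5)]
normal_score = knnn_score((0.5, 0.4), train, k=2)
anomaly_score = knnn_score((5.0, 5.0), train, k=2)
```

A higher score indicates a stronger anomaly, so the far-away point should score well above the in-cluster point under both the baseline and the k-NNN variant.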
Related papers
- Adaptive $k$-nearest neighbor classifier based on the local estimation of the shape operator [49.87315310656657]
We introduce a new adaptive $k$-nearest neighbours ($kK$-NN) algorithm that explores the local curvature at a sample to adaptively define the neighborhood size.
Results on many real-world datasets indicate that the new $kK$-NN algorithm yields superior balanced accuracy compared to the established $k$-NN method.
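The adaptive-neighborhood idea above can be sketched as follows. Note the hedge: the $kK$-NN paper derives the neighborhood size from the local curvature via the shape operator; the distance-ratio rule below is a simple stand-in for that criterion, and the thresholds are arbitrary illustrative choices.

```python
from math import dist

def adaptive_k(query, train, k_min=2, k_max=8, ratio=1.5):
    """Stand-in for an adaptive neighborhood size: grow k until the next
    neighbor is much farther than the average distance so far.  (The
    kK-NN paper instead derives the size from local curvature.)"""
    dists = sorted(dist(query, p) for p in train)
    k = k_min
    while k < min(k_max, len(dists)):
        avg = sum(dists[:k]) / k
        if dists[k] > ratio * avg:
            break  # large gap: stop growing the neighborhood
        k += 1
    return k
```

On a tight cluster of three points followed by distant outliers, the rule stops at k = 3, since the fourth neighbor lies across a large distance gap.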
arXiv Detail & Related papers (2024-09-08T13:08:45Z) - Improving Novelty Detection using the Reconstructions of Nearest Neighbours [0.0]
We show that using nearest neighbours in the latent space of autoencoders (AE) significantly improves the performance of semi-supervised novelty detection.
Our method harnesses a combination of the reconstructions of the nearest neighbours and the latent-neighbour distances of a given input's latent representation.
arXiv Detail & Related papers (2021-11-11T11:09:44Z) - Adaptive Nearest Neighbor Machine Translation [60.97183408140499]
kNN-MT combines pre-trained neural machine translation with token-level k-nearest-neighbor retrieval.
The traditional kNN algorithm retrieves the same number of nearest neighbors for each target token.
We propose Adaptive kNN-MT to dynamically determine the value of k for each target token.
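The per-token dynamic choice of k can be sketched as below. This is an illustrative stand-in, not the paper's method: Adaptive kNN-MT learns the choice of k with a small network over retrieval features, whereas the sketch truncates the neighbor list at the first large jump in distance, and the datastore format (embedding, token) is assumed for illustration.

```python
from math import dist

def adaptive_k_neighbors(query, datastore, k_max=8, gap=2.0):
    """Stand-in for a dynamic k: retrieve up to k_max neighbors, then
    truncate the ranked list at the first large jump in distance.
    datastore items are (embedding, token) pairs (assumed format)."""
    ranked = sorted(datastore, key=lambda item: dist(query, item[0]))[:k_max]
    kept = [ranked[0]]
    for prev, cur in zip(ranked, ranked[1:]):
        if dist(query, cur[0]) > gap * dist(query, prev[0]):
            break  # distance jump: later neighbors are unlikely to help
        kept.append(cur)
    return kept

# Two close entries and two far ones: only the close tokens are kept.
store = [((0.1, 0.0), 'a'), ((0.0, 0.1), 'b'),
         ((5.0, 5.0), 'x'), ((6.0, 6.0), 'y')]
retrieved = adaptive_k_neighbors((0.0, 0.0), store)
```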
arXiv Detail & Related papers (2021-05-27T09:27:42Z) - Nearest Neighbor Search Under Uncertainty [19.225091554227948]
Nearest Neighbor Search (NNS) is a central task in knowledge representation, learning, and reasoning.
This paper studies NNS under Uncertainty (NNSU).
arXiv Detail & Related papers (2021-03-08T20:20:01Z) - Cross-Domain Generalization Through Memorization: A Study of Nearest Neighbors in Neural Duplicate Question Detection [72.01292864036087]
Duplicate question detection (DQD) is important for increasing the efficiency of community and automatic question answering systems.
We leverage neural representations and study nearest neighbors for cross-domain generalization in DQD.
We observe robust performance of this method in different cross-domain scenarios of StackExchange, Spring and Quora datasets.
arXiv Detail & Related papers (2020-11-22T19:19:33Z) - Adversarial Examples for $k$-Nearest Neighbor Classifiers Based on Higher-Order Voronoi Diagrams [69.4411417775822]
Adversarial examples are a widely studied phenomenon in machine learning models.
We propose an algorithm for evaluating the adversarial robustness of $k$-nearest neighbor classification.
arXiv Detail & Related papers (2020-11-19T08:49:10Z) - Differentially Private Clustering: Tight Approximation Ratios [57.89473217052714]
We give efficient differentially private algorithms for basic clustering problems.
Our results imply an improved algorithm for the Sample and Aggregate privacy framework.
One of the tools used in our 1-Cluster algorithm can be employed to get a faster quantum algorithm for ClosestPair in a moderate number of dimensions.
arXiv Detail & Related papers (2020-08-18T16:22:06Z) - A Weighted Mutual k-Nearest Neighbour for Classification Mining [4.538870924201896]
kNN is a very effective instance-based learning method that is easy to implement.
In this paper, we propose a new learning algorithm which performs the task of anomaly detection and removal of pseudo neighbours from the dataset.
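The mutual-neighbor idea that underlies this kind of anomaly removal can be sketched as follows. This is a simplified illustration, not the paper's algorithm: the paper adds a weighting scheme on top of mutual kNN, which is omitted here. A point is flagged when none of its k nearest neighbors list it back among their own k nearest neighbors.

```python
from math import dist

def mutual_knn_filter(points, k=2):
    """Sketch of mutual kNN: flag points that do not appear in the k-NN
    list of any of their own k nearest neighbors.  Such points lack a
    mutual neighbor and are treated as anomalies / pseudo-neighbors."""
    def knn(i):
        return sorted((j for j in range(len(points)) if j != i),
                      key=lambda j: dist(points[i], points[j]))[:k]
    lists = [knn(i) for i in range(len(points))]
    return [i for i in range(len(points))
            if not any(i in lists[j] for j in lists[i])]

# Four clustered points and one outlier: only the outlier is flagged.
flagged = mutual_knn_filter([(0.0, 0.0), (0.0, 1.0), (1.0, 0.0),
                             (1.0, 1.0), (10.0, 10.0)], k=2)
```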
arXiv Detail & Related papers (2020-05-14T18:11:30Z) - Neighborhood Matching Network for Entity Alignment [71.24217694278616]
Neighborhood Matching Network (NMN) is a novel entity alignment framework.
NMN estimates the similarities between entities to capture both the topological structure and the neighborhood difference.
It first uses a novel graph sampling method to distill a discriminative neighborhood for each entity.
It then adopts a cross-graph neighborhood matching module to jointly encode the neighborhood difference for a given entity pair.
arXiv Detail & Related papers (2020-05-12T08:26:15Z) - A new hashing based nearest neighbors selection technique for big datasets [14.962398031252063]
This paper proposes a new technique that enables the selection of nearest neighbors directly in the neighborhood of a given observation.
The proposed approach consists of dividing the data space into subcells of a virtual grid built on top of the data space.
Our algorithm outperforms the original KNN in time efficiency with a prediction quality as good as that of KNN.
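The virtual-grid idea can be sketched in two dimensions as below. This is an assumed, minimal rendering of the technique, not the paper's exact scheme: points are hashed into grid cells, and a query searches only its own cell and the adjacent cells rather than scanning the whole dataset. The cell size and the 2-D restriction are illustrative choices.

```python
from math import dist, floor

def cell_of(point, size):
    """Hash a 2-D point to its virtual grid cell."""
    return tuple(floor(c / size) for c in point)

def grid_neighbors(query, points, size=1.0, k=3):
    """Sketch of grid-hashed neighbor selection: bucket points by cell,
    then rank only the candidates from the query's cell and the eight
    adjacent cells (assumes neighbors fall within that 3x3 block)."""
    buckets = {}
    for p in points:
        buckets.setdefault(cell_of(p, size), []).append(p)
    cx, cy = cell_of(query, size)
    candidates = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            candidates.extend(buckets.get((cx + dx, cy + dy), []))
    return sorted(candidates, key=lambda p: dist(query, p))[:k]

# The far-away point (5, 5) is never even considered as a candidate.
near = grid_neighbors((0.3, 0.3),
                      [(0.2, 0.2), (0.4, 0.3), (1.2, 0.1), (5.0, 5.0)],
                      size=1.0, k=2)
```

The speedup comes from pruning: distance computations are spent only on candidates from nearby cells, which matches the paper's claim of KNN-quality predictions at lower cost when the grid is well sized.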
arXiv Detail & Related papers (2020-04-05T19:36:00Z) - Neighborhood and Graph Constructions using Non-Negative Kernel Regression [42.16401154367232]
We present an alternative view of neighborhood selection, where we show that neighborhood construction is equivalent to a sparse signal approximation problem.
We also propose an algorithm, non-negative kernel regression (NNK), for obtaining neighborhoods that lead to better sparse representation.
arXiv Detail & Related papers (2019-10-21T13:58:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.