Leveraging Reinforcement Learning for evaluating Robustness of KNN
Search Algorithms
- URL: http://arxiv.org/abs/2102.06525v1
- Date: Wed, 10 Feb 2021 16:10:58 GMT
- Title: Leveraging Reinforcement Learning for evaluating Robustness of KNN
Search Algorithms
- Authors: Pramod Vadiraja, Christoph Peter Balada
- Abstract summary: The problem of finding the K nearest neighbors of a given query point in a dataset has been studied for many years.
In this paper, we survey several novel K-Nearest Neighbor Search approaches that tackle the problem from a computational perspective.
To evaluate the robustness of a KNNS approach against adversarial points, we propose a generic Reinforcement Learning based framework.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The problem of finding the K nearest neighbors of a given query
point in a dataset has been studied for many years. In very high dimensional
spaces, K-nearest neighbor search (KNNS) suffers from the cost of computing
high dimensional distances. Owing to the curse of dimensionality, it becomes
difficult to rely on the results of the many approximate nearest neighbor
search approaches. In this paper, we survey several novel K-Nearest Neighbor
Search approaches that tackle the problem from the perspectives of
computational cost, the accuracy of approximated results, and leveraging
parallelism to speed up computations. We attempt to derive a relationship
between the true positive and false positive points for a given KNNS approach.
Finally, to evaluate the robustness of a KNNS approach against adversarial
points, we propose a generic Reinforcement Learning based framework.
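As context for the survey, the exact brute-force KNNS baseline (whose O(n·d) per-query cost in high dimensions motivates the approximate approaches discussed above) can be sketched as follows; the function name is illustrative, not from the paper.

```python
import math

def knn_search(dataset, query, k):
    """Exact brute-force K-nearest-neighbor search.

    Computes the Euclidean distance from the query to every point in
    the dataset, then returns the indices of the k closest points.
    Every query scans all n points in d dimensions, which is the cost
    that approximate KNNS methods try to avoid.
    """
    distances = [
        (i, math.dist(point, query)) for i, point in enumerate(dataset)
    ]
    distances.sort(key=lambda pair: pair[1])
    return [i for i, _ in distances[:k]]
```

For example, `knn_search([(0, 0), (1, 1), (5, 5)], (0.9, 1.0), 2)` returns `[1, 0]`: the point `(1, 1)` is closest to the query, followed by `(0, 0)`.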
Related papers
- Nearest Neighbour with Bandit Feedback [4.9094025705644695]
Our algorithm handles the fully adversarial setting in which no assumptions at all are made about the data-generation process.
We give generic regret bounds for our algorithm and further analyse them when applied to the bandit problem in Euclidean space.
arXiv Detail & Related papers (2023-06-23T20:09:01Z)
- Rethinking k-means from manifold learning perspective [122.38667613245151]
We present a new clustering algorithm which directly detects clusters of data without mean estimation.
Specifically, we construct a distance matrix between data points using a Butterworth filter.
To well exploit the complementary information embedded in different views, we leverage the tensor Schatten p-norm regularization.
arXiv Detail & Related papers (2023-05-12T03:01:41Z)
- Efficient k-NN Search with Cross-Encoders using Adaptive Multi-Round CUR Decomposition [77.4863142882136]
Cross-encoder models are prohibitively expensive for direct k-nearest neighbor (k-NN) search.
We propose ADACUR, a method that adaptively, iteratively, and efficiently minimizes the approximation error for the practically important top-k neighbors.
arXiv Detail & Related papers (2023-05-04T17:01:17Z)
- A Theoretical Analysis Of Nearest Neighbor Search On Approximate Near Neighbor Graph [51.880164098926166]
Graph-based algorithms have demonstrated state-of-the-art performance in the nearest neighbor search (NN-Search) problem.
There exists a practice-to-theory gap in the graph-based NN-Search algorithms.
We present theoretical guarantees of solving NN-Search via greedy search on ANN-Graph for low dimensional and dense vectors.
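The greedy search procedure analyzed in that paper can be sketched roughly as a best-first walk on a neighbor graph; the version below is a generic simplification (fixed entry point, single candidate per step), not the paper's exact procedure.

```python
import math

def greedy_graph_search(points, graph, query, entry):
    """Greedy nearest-neighbor search on a proximity graph.

    Starting from an entry vertex, repeatedly move to the neighbor
    closest to the query; stop at a local minimum of the distance.
    Graph-based ANN methods depend on the graph being built so that
    this local minimum is the (near-)exact nearest neighbor.
    """
    current = entry
    while True:
        # Closest neighbor of the current vertex to the query.
        best = min(graph[current], key=lambda v: math.dist(points[v], query))
        if math.dist(points[best], query) < math.dist(points[current], query):
            current = best  # Progress: keep walking toward the query.
        else:
            return current  # Local minimum reached.
```

On a chain graph over collinear points, the walk simply slides along the chain until the distance to the query stops decreasing.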
arXiv Detail & Related papers (2023-03-10T21:18:34Z)
- Optimal Extended Neighbourhood Rule $k$ Nearest Neighbours Ensemble [1.8843687952462742]
A new optimal extended neighborhood rule based ensemble method is proposed in this paper.
The ensemble is compared with state-of-the-art methods on 17 benchmark datasets using accuracy, Cohen's kappa, and Brier score (BS).
arXiv Detail & Related papers (2022-11-21T09:13:54Z)
- k-Means Maximum Entropy Exploration [55.81894038654918]
Exploration in continuous spaces with sparse rewards is an open problem in reinforcement learning.
We introduce an artificial curiosity algorithm based on lower bounding an approximation to the entropy of the state visitation distribution.
We show that our approach is both computationally efficient and competitive on benchmarks for exploration in high-dimensional, continuous spaces.
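The flavor of such curiosity bonuses can be illustrated very loosely as follows; this is my sketch of a distance-to-centroid intrinsic reward, not the paper's algorithm, and `curiosity_reward` is a hypothetical name.

```python
import math

def curiosity_reward(visited_centroids, state):
    """Intrinsic exploration bonus for a newly reached state.

    Reward is the distance from the state to the nearest centroid
    summarizing previously visited states: states in rarely visited
    regions (far from every centroid) earn a larger bonus, loosely
    encouraging a more spread-out (higher-entropy) visitation.
    """
    return min(math.dist(c, state) for c in visited_centroids)
```

A state one unit from the nearest centroid earns a bonus of 1.0, while a state at a centroid earns 0.0, so the agent is pushed toward unexplored regions.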
arXiv Detail & Related papers (2022-05-31T09:05:58Z)
- BR-NS: an Archive-less Approach to Novelty Search [70.13948372218849]
We discuss an alternative approach to novelty estimation, dubbed Behavior Recognition based Novelty Search (BR-NS).
BR-NS does not require an archive, makes no assumption on the metrics that can be defined in the behavior space and does not rely on nearest neighbours search.
We conduct experiments to gain insight into its feasibility and dynamics as well as potential advantages over archive-based NS in terms of time complexity.
arXiv Detail & Related papers (2021-04-08T17:31:34Z)
- Nearest Neighbor Search Under Uncertainty [19.225091554227948]
Nearest Neighbor Search (NNS) is a central task in knowledge representation, learning, and reasoning.
This paper studies NNS under Uncertainty (NNSU).
arXiv Detail & Related papers (2021-03-08T20:20:01Z)
- Query Complexity of k-NN based Mode Estimation [4.821253146561989]
We study the problem of identifying the point with the minimum k-th nearest neighbor distance for a given dataset of n points.
We design a sequential learning algorithm, based on the idea of confidence intervals, which adaptively decides which queries to send to the oracle and is able to correctly solve the problem with high probability.
arXiv Detail & Related papers (2020-10-26T11:32:08Z)
- Non-Local Spatial Propagation Network for Depth Completion [82.60915972250706]
We propose a robust and efficient end-to-end non-local spatial propagation network for depth completion.
The proposed network takes RGB and sparse depth images as inputs and estimates non-local neighbors and their affinities of each pixel.
We show that the proposed algorithm is superior to conventional algorithms in terms of depth completion accuracy and robustness to the mixed-depth problem.
arXiv Detail & Related papers (2020-07-20T12:26:51Z)
- A Weighted Mutual k-Nearest Neighbour for Classification Mining [4.538870924201896]
kNN is a very effective instance-based learning method, and it is easy to implement.
In this paper, we propose a new learning algorithm which performs the task of anomaly detection and removal of pseudo neighbours from the dataset.
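The mutual-neighbor idea underlying that paper can be sketched as a plain mutual k-NN computation; the weighting and the anomaly-removal steps of the proposed algorithm are not reproduced here, and the function name is illustrative.

```python
import math

def mutual_knn(points, k):
    """Return, for each point, the set of its mutual k-nearest neighbors.

    j is a mutual neighbor of i only if j is among i's k nearest points
    AND i is among j's k nearest points. Points left with few or no
    mutual neighbors are natural candidates for the pseudo-neighbors
    (anomalies) that mutual-kNN classifiers try to discard.
    """
    def knn(i):
        others = [j for j in range(len(points)) if j != i]
        others.sort(key=lambda j: math.dist(points[i], points[j]))
        return set(others[:k])

    neighbors = [knn(i) for i in range(len(points))]
    return [
        {j for j in neighbors[i] if i in neighbors[j]}
        for i in range(len(points))
    ]
```

For `[(0, 0), (1, 0), (10, 0)]` with `k=1`, the first two points are mutual neighbors of each other, while the outlier `(10, 0)` ends up with an empty mutual-neighbor set.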
arXiv Detail & Related papers (2020-05-14T18:11:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.