Adversarial Examples for $k$-Nearest Neighbor Classifiers Based on
Higher-Order Voronoi Diagrams
- URL: http://arxiv.org/abs/2011.09719v2
- Date: Mon, 1 Nov 2021 17:49:08 GMT
- Title: Adversarial Examples for $k$-Nearest Neighbor Classifiers Based on
Higher-Order Voronoi Diagrams
- Authors: Chawin Sitawarin, Evgenios M. Kornaropoulos, Dawn Song, David Wagner
- Abstract summary: Adversarial examples are a widely studied phenomenon in machine learning models.
We propose an algorithm for evaluating the adversarial robustness of $k$-nearest neighbor classification.
- Score: 69.4411417775822
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Adversarial examples are a widely studied phenomenon in machine learning
models. While most of the attention has been focused on neural networks, other
practical models also suffer from this issue. In this work, we propose an
algorithm for evaluating the adversarial robustness of $k$-nearest neighbor
classification, i.e., finding a minimum-norm adversarial example. Diverging
from previous proposals, we take a geometric approach by performing a search
that expands outwards from a given input point. At a high level, the search
radius grows to cover nearby Voronoi cells until we find a cell whose label
differs from that of the input point. To scale the algorithm to a large $k$, we
introduce approximation steps that find smaller-norm perturbations than the
baselines across a variety of datasets. Furthermore, we analyze the structural
properties of datasets on which our approach outperforms the competition.
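To make the high-level search concrete, here is a minimal, hedged sketch of the "expand outwards until the label flips" idea: it brute-forces a small-norm adversarial example by binary-searching along segments from the input toward oppositely labeled training points. This is a naive stand-in for intuition only, not the authors' higher-order-Voronoi algorithm; all function names and parameters are illustrative.

```python
import numpy as np
from collections import Counter

def knn_predict(X, y, q, k):
    """Plain k-NN label for a single query point q."""
    idx = np.argsort(np.linalg.norm(X - q, axis=1))[:k]
    return Counter(y[idx]).most_common(1)[0][0]

def min_norm_adversarial(X, y, x, k, iters=30):
    """Line-search toward each differently labeled training point and keep
    the smallest perturbation found that flips the k-NN label."""
    c = knn_predict(X, y, x, k)
    best = None
    for z in X[y != c]:
        if knn_predict(X, y, z, k) == c:
            continue  # even reaching z does not flip the label; skip it
        lo, hi = 0.0, 1.0  # invariant: lo keeps label c, hi flips it
        for _ in range(iters):
            mid = (lo + hi) / 2.0
            if knn_predict(X, y, x + mid * (z - x), k) == c:
                lo = mid
            else:
                hi = mid
        delta = hi * (z - x)
        if best is None or np.linalg.norm(delta) < np.linalg.norm(best):
            best = delta
    return best  # x + best flips the k-NN label (None if no flip was found)
```

Because it only searches along segments toward training points, the returned perturbation flips the label but is merely an upper bound on the true minimum norm.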
Related papers
- Adaptive $k$-nearest neighbor classifier based on the local estimation of the shape operator [49.87315310656657]
We introduce a new adaptive $k$-nearest neighbours ($kK$-NN) algorithm that explores the local curvature at a sample to adaptively define the neighborhood size.
Results on many real-world datasets indicate that the new $kK$-NN algorithm yields superior balanced accuracy compared to the established $k$-NN method.
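As a loose illustration of curvature-adaptive neighborhood sizes, the sketch below shrinks $k$ where a local PCA fit leaves large residual variance. The curvature proxy and the linear schedule from `k_max` to `k_min` are assumptions for illustration, not the paper's shape-operator estimate.

```python
import numpy as np
from collections import Counter

def curvature_proxy(X, i, m=10):
    """Share of local variance not explained by the top principal direction
    of the m points nearest to X[i] (near 0 on a locally flat patch)."""
    d = np.linalg.norm(X - X[i], axis=1)
    nbrs = X[np.argsort(d)[1:m + 1]] - X[i]
    var = np.linalg.svd(nbrs, compute_uv=False) ** 2
    return 1.0 - var[0] / (var.sum() + 1e-12)

def adaptive_knn_predict(X, y, q, k_min=3, k_max=15, m=10):
    order = np.argsort(np.linalg.norm(X - q, axis=1))
    kappa = curvature_proxy(X, order[0], m)  # curvature near the closest sample
    k = int(round(k_max - kappa * (k_max - k_min)))  # more curvature -> smaller k
    return Counter(y[order[:k]]).most_common(1)[0][0]
```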
arXiv Detail & Related papers (2024-09-08T13:08:45Z)
- A k nearest neighbours classifiers ensemble based on extended neighbourhood rule and features subsets [0.4709844746265484]
kNN based ensemble methods minimise the effect of outliers by identifying a set of data points in the given feature space that are nearest to an unseen observation.
This paper proposes a k nearest neighbour ensemble where the neighbours are determined in k steps.
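A minimal sketch of the step-wise neighbour selection summarized here: each of the k steps takes the nearest unused training point to the previously selected one, forming a chain that starts at the query. The ensemble over feature subsets is omitted; names and parameters are illustrative.

```python
import numpy as np
from collections import Counter

def extended_neighbourhood_predict(X, y, q, k=5):
    """Neighbours chosen in k steps: each step picks the nearest unused
    training point to the previously selected one."""
    used = np.zeros(len(X), dtype=bool)
    anchor, votes = q, []
    for _ in range(k):
        d = np.linalg.norm(X - anchor, axis=1)
        d[used] = np.inf
        j = int(np.argmin(d))
        used[j] = True
        votes.append(y[j])
        anchor = X[j]  # the next neighbour is nearest to this one
    return Counter(votes).most_common(1)[0][0]
```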
arXiv Detail & Related papers (2022-05-30T13:57:32Z)
- Correlation Clustering Reconstruction in Semi-Adversarial Models [70.11015369368272]
Correlation Clustering is an important clustering problem with many applications.
We study the reconstruction version of this problem in which one is seeking to reconstruct a latent clustering corrupted by random noise and adversarial modifications.
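For context, the objective being reconstructed can be stated in a few lines: a clustering is scored by the number of similar pairs it splits plus the number of dissimilar pairs it merges. The snippet below is a plain statement of that standard disagreement count, not the paper's reconstruction algorithm.

```python
def disagreements(pairs, clustering):
    """pairs: dict mapping (i, j) -> +1 (similar) or -1 (dissimilar);
    clustering: dict mapping node -> cluster id."""
    cost = 0
    for (i, j), s in pairs.items():
        same = clustering[i] == clustering[j]
        cost += int((s == 1 and not same) or (s == -1 and same))
    return cost

# Inconsistent pair labels: every clustering pays at least one disagreement.
pairs = {(0, 1): 1, (1, 2): 1, (0, 2): -1}
print(disagreements(pairs, {0: 0, 1: 0, 2: 0}))  # -> 1
```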
arXiv Detail & Related papers (2021-08-10T14:46:17Z)
- Instance-Level Relative Saliency Ranking with Graph Reasoning [126.09138829920627]
We present a novel unified model to segment salient instances and infer relative saliency rank order.
A novel loss function is also proposed to effectively train the saliency ranking branch.
Experimental results demonstrate that our proposed model is more effective than previous methods.
arXiv Detail & Related papers (2021-07-08T13:10:42Z)
- Hyperdimensional Computing for Efficient Distributed Classification with Randomized Neural Networks [5.942847925681103]
We study distributed classification, which can be employed in situations where data cannot be stored at a central location or shared.
We propose a more efficient solution for distributed classification by making use of a lossy compression approach applied when sharing the local classifiers with other agents.
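A hedged sketch of the "compress before sharing" idea: each agent fits a local ridge readout on shared random features (a stand-in for a randomized neural network), uniformly quantizes the weights as the lossy step, and the receiver averages what it collects. The paper's hyperdimensional encoding is not reproduced; the 20-dimensional input, the 4-bit quantizer, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
W_proj = rng.normal(size=(20, 256))  # random features shared by all agents

def local_readout(X, Y_onehot, lam=1e-2):
    """Each agent fits a ridge-regression readout on its private data."""
    H = np.tanh(X @ W_proj)
    return np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ Y_onehot)

def quantize(W, bits=4):
    """Uniform quantization -- the lossy step applied before sharing."""
    scale = np.abs(W).max() / (2 ** (bits - 1) - 1) + 1e-12
    return np.round(W / scale) * scale

def aggregate(shared):
    """Combine the (compressed) readouts received from the other agents."""
    return np.mean(shared, axis=0)
```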
arXiv Detail & Related papers (2021-06-02T01:33:56Z)
- Distributed Adaptive Nearest Neighbor Classifier: Algorithm and Theory [6.696267547013535]
We propose a novel distributed adaptive NN classifier for which the number of nearest neighbors is a tuning parameter chosen by a data-driven criterion.
An early stopping rule is proposed when searching for the optimal tuning parameter, which improves the finite-sample performance.
In particular, we show that when the sub-sample sizes are sufficiently large, the proposed classifier achieves the nearly optimal convergence rate.
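A rough sketch of this setting, with the data-driven criterion replaced by plain validation accuracy and a patience-based early stop over the grid of k values (both stand-ins for the paper's actual criterion and stopping rule):

```python
import numpy as np
from collections import Counter

def local_vote(Xs, ys, q, k):
    """k-NN vote computed on one machine's local shard."""
    idx = np.argsort(np.linalg.norm(Xs - q, axis=1))[:k]
    return Counter(ys[idx]).most_common(1)[0][0]

def distributed_predict(shards, q, k):
    """Majority vote over the machines' local k-NN predictions."""
    votes = [local_vote(Xs, ys, q, k) for Xs, ys in shards]
    return Counter(votes).most_common(1)[0][0]

def tune_k(shards, X_val, y_val, k_grid, patience=2):
    """Pick k on held-out data; stop once accuracy stops improving."""
    best_k, best_acc, stale = k_grid[0], -1.0, 0
    for k in k_grid:
        acc = np.mean([distributed_predict(shards, q, k) == t
                       for q, t in zip(X_val, y_val)])
        if acc > best_acc:
            best_k, best_acc, stale = k, acc, 0
        else:
            stale += 1
            if stale >= patience:
                break
    return best_k
```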
arXiv Detail & Related papers (2021-05-20T14:38:41Z)
- Finding Geometric Models by Clustering in the Consensus Space [61.65661010039768]
We propose a new algorithm for finding an unknown number of geometric models, e.g., homographies.
We present a number of applications where the use of multiple geometric models improves accuracy.
These include pose estimation from multiple generalized homographies and trajectory estimation of fast-moving objects.
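To illustrate what clustering in a consensus (preference) space can look like, the sketch below uses 2-D lines instead of homographies: sample many line hypotheses from point pairs and record which points are inliers to which hypotheses. This is a generic preference-analysis sketch under those assumptions, not the paper's algorithm.

```python
import numpy as np

def line_residual(p, a, b):
    """Distance from point p to the line through points a and b."""
    d = b - a
    n = np.array([-d[1], d[0]]) / (np.linalg.norm(d) + 1e-12)
    return abs((p - a) @ n)

def preference_matrix(P, n_hyp=200, tau=0.05, seed=0):
    """Rows: points; columns: sampled line hypotheses; 1 = inlier."""
    rng = np.random.default_rng(seed)
    M = np.zeros((len(P), n_hyp))
    for h in range(n_hyp):
        i, j = rng.choice(len(P), size=2, replace=False)
        M[:, h] = [line_residual(p, P[i], P[j]) < tau for p in P]
    return M
```

Points lying on the same model end up with similar preference vectors, so any off-the-shelf clustering of the rows (e.g., agglomerative clustering under Jaccard distance) can then separate the model instances.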
arXiv Detail & Related papers (2021-03-25T14:35:07Z)
- Distributionally Robust Weighted $k$-Nearest Neighbors [21.537952410507483]
Learning a robust classifier from a few samples remains a key challenge in machine learning.
In this paper, we study a minimax distributionally robust formulation of weighted $k$-nearest neighbors.
We develop an algorithm, Dr.k-NN, that efficiently solves this functional optimization problem.
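To make the object being optimized concrete, here is a plain weighted k-NN skeleton in which neighbors vote with weights rather than uniformly. In Dr.k-NN the weights come out of the minimax problem; the inverse-distance weights below are only a placeholder.

```python
import numpy as np

def weighted_knn_predict(X, y, q, k=5):
    """k-NN where the k neighbors vote with weights w instead of uniformly."""
    d = np.linalg.norm(X - q, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-12)  # placeholder weights, not Dr.k-NN's
    classes = np.unique(y)
    scores = [w[y[idx] == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]
```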
arXiv Detail & Related papers (2020-06-07T00:34:33Z)
- GeoDA: a geometric framework for black-box adversarial attacks [79.52980486689287]
We propose a framework to generate adversarial examples in one of the most challenging black-box settings.
Our framework is based on the observation that the decision boundary of deep networks usually has a small mean curvature in the vicinity of data samples.
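A hedged sketch of that geometric observation: if the boundary is nearly flat near x, its normal can be estimated from label queries alone by averaging random probes signed by whether they flip the prediction, then stepping along the estimate. GeoDA's iterative, query-efficient estimator is not reproduced; `predict` stands for any black-box classifier returning a label.

```python
import numpy as np

def estimate_normal(predict, x, label, n_queries=500, sigma=0.5, seed=0):
    """Average random probes, signed by whether they flip the label."""
    rng = np.random.default_rng(seed)
    normal = np.zeros_like(x)
    for _ in range(n_queries):
        u = rng.normal(size=x.shape)
        sign = 1.0 if predict(x + sigma * u) != label else -1.0
        normal += sign * u  # probes that flip the label pull the estimate
    return normal / (np.linalg.norm(normal) + 1e-12)

def attack(predict, x, step=0.1, max_steps=50, **kw):
    """March along the estimated normal until the prediction changes."""
    label = predict(x)
    n = estimate_normal(predict, x, label, **kw)
    for t in range(1, max_steps + 1):
        x_adv = x + t * step * n
        if predict(x_adv) != label:
            return x_adv
    return None
```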
arXiv Detail & Related papers (2020-03-13T20:03:01Z)