An Efficient Method for Sample Adversarial Perturbations against
Nonlinear Support Vector Machines
- URL: http://arxiv.org/abs/2206.05664v1
- Date: Sun, 12 Jun 2022 05:21:51 GMT
- Title: An Efficient Method for Sample Adversarial Perturbations against
Nonlinear Support Vector Machines
- Authors: Wen Su, Qingna Li
- Abstract summary: We investigate the sample adversarial perturbations for nonlinear support vector machines (SVMs).
Due to the implicit form of the nonlinear functions mapping data to the feature space, it is difficult to obtain the explicit form of the adversarial perturbations.
By exploring the special property of nonlinear SVMs, we transform the optimization problem of attacking nonlinear SVMs into a nonlinear KKT system.
- Score: 8.000799046379749
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Adversarial perturbations have drawn great attention in various machine
learning models. In this paper, we investigate the sample adversarial
perturbations for nonlinear support vector machines (SVMs). Due to the implicit
form of the nonlinear functions mapping data to the feature space, it is
difficult to obtain the explicit form of the adversarial perturbations. By
exploring the special property of nonlinear SVMs, we transform the optimization
problem of attacking nonlinear SVMs into a nonlinear KKT system. Such a system
can be solved by various numerical methods. Numerical results show that our
method is efficient in computing adversarial perturbations.
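The attack problem the abstract describes — find the smallest perturbation that flips a nonlinear SVM's decision — can be sketched numerically. The toy below is not the paper's KKT-system method: it uses a simple quadratic-penalty gradient descent on a hand-crafted RBF decision function standing in for a trained SVM, and all support vectors, coefficients, and constants are illustrative.

```python
import numpy as np

# Hand-crafted RBF-SVM decision function (illustrative, not a trained model):
# f(x) = sum_i (alpha_i * y_i) * exp(-gamma * ||x - sv_i||^2) + b
SV = np.array([[1.0, 0.0], [-1.0, 0.0]])   # support vectors
DUAL = np.array([1.0, -1.0])               # alpha_i * y_i
GAMMA, B = 1.0, 0.0

def decision(x):
    return DUAL @ np.exp(-GAMMA * np.sum((SV - x) ** 2, axis=1)) + B

def decision_grad(x):
    # Each kernel term contributes -2*gamma*(x - sv_i) times its kernel value.
    k = np.exp(-GAMMA * np.sum((SV - x) ** 2, axis=1))
    return (DUAL * k) @ (-2.0 * GAMMA * (x - SV))

def attack(x0, c=10.0, margin=0.05, lr=0.05, steps=500):
    """Minimize ||delta||^2/2 + c*max(0, f(x0+delta)+margin)^2 by gradient descent."""
    delta = np.zeros_like(x0)
    for _ in range(steps):
        viol = max(0.0, decision(x0 + delta) + margin)
        delta -= lr * (delta + 2.0 * c * viol * decision_grad(x0 + delta))
    # Safety: enlarge slightly until the predicted label actually flips.
    for _ in range(200):
        if decision(x0 + delta) < 0.0:
            break
        delta *= 1.02
    return delta

x0 = np.array([0.5, 0.0])   # classified positive by this toy decision function
delta = attack(x0)
```

The penalty formulation is a standard substitute for the constrained problem; the paper instead rewrites the constrained attack as a nonlinear KKT system and hands it to off-the-shelf nonlinear-equation solvers.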
Related papers
- Koopman-based Deep Learning for Nonlinear System Estimation [1.3791394805787949]
We present a novel data-driven linear estimator that uses Koopman operator theory to extract finite-dimensional representations of complex nonlinear systems.
The extracted model is used together with a deep reinforcement learning network that learns the optimal stepwise actions to predict future states of the original nonlinear system.
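The Koopman idea of trading a nonlinear system for a linear one on lifted observables can be illustrated without the paper's deep-RL component. The following EDMD-style sketch (system, dictionary, and all names are illustrative) uses a textbook system whose lift (x1, x2, x1^2) is exactly linear, so a least-squares Koopman matrix predicts it exactly.

```python
import numpy as np

A_TRUE, B_TRUE = 0.9, 0.5  # illustrative system parameters

def step(x):
    # Nonlinear map whose Koopman lift (x1, x2, x1^2) evolves exactly linearly.
    x1, x2 = x
    return np.array([A_TRUE * x1, B_TRUE * x2 + (A_TRUE**2 - B_TRUE) * x1**2])

def lift(x):
    return np.array([x[0], x[1], x[0] ** 2])

# Collect snapshot pairs from a few short trajectories.
rng = np.random.default_rng(0)
Z, Zn = [], []
for _ in range(5):
    x = rng.uniform(-1, 1, size=2)
    for _ in range(10):
        xn = step(x)
        Z.append(lift(x)); Zn.append(lift(xn))
        x = xn
Z, Zn = np.array(Z), np.array(Zn)

# EDMD: least-squares fit of a linear operator K on the lifted snapshots.
K, *_ = np.linalg.lstsq(Z, Zn, rcond=None)

# Multi-step prediction is now pure linear algebra in the lifted space.
x = np.array([0.7, -0.3])
z = lift(x)
for _ in range(8):
    z = z @ K
    x = step(x)       # ground truth for comparison
pred = z[:2]
```

For general systems the lift is only approximately linear and the dictionary choice matters; here the exactness is a property of the chosen toy system.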
arXiv Detail & Related papers (2024-05-01T16:49:54Z)
- Beyond PCA: A Probabilistic Gram-Schmidt Approach to Feature Extraction [8.287206589886878]
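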
Linear feature extraction at the presence of nonlinear dependencies among the data is a fundamental challenge in unsupervised learning.
We propose using a probabilistic Gram-Schmidt type orthogonalization process in order to detect and map out redundant dimensions.
We provide simulation results on synthetic and real-world datasets which show improved performance over PCA and state-of-the-art linear feature extraction algorithms.
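The core orthogonalization step can be shown with a plain, deterministic Gram-Schmidt sweep (not the paper's probabilistic variant; the tolerance, data, and function name are illustrative): a column whose residual is negligible after projecting onto the columns already kept is flagged redundant.

```python
import numpy as np

def gram_schmidt_select(X, tol=1e-8):
    """Keep feature columns with a nonnegligible residual after projecting
    onto the span of the previously kept (orthonormalized) columns."""
    kept, basis = [], []
    for j in range(X.shape[1]):
        v = X[:, j].astype(float).copy()
        for q in basis:
            v -= (q @ X[:, j]) * q       # remove component along basis vector q
        norm = np.linalg.norm(v)
        if norm > tol * max(1.0, np.linalg.norm(X[:, j])):
            basis.append(v / norm)
            kept.append(j)
    return kept  # indices of non-redundant feature columns

rng = np.random.default_rng(1)
F = rng.normal(size=(100, 3))                  # three independent features
# Column 2 is an exact linear combination of columns 0 and 1, hence redundant.
X = np.column_stack([F[:, 0], F[:, 1], F[:, 0] + F[:, 1], F[:, 2]])
kept = gram_schmidt_select(X)
```

The paper's contribution is a probabilistic version of this sweep that copes with noisy, nonlinear dependencies; the deterministic sketch only catches exact linear redundancy.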
arXiv Detail & Related papers (2023-11-15T21:29:57Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- Training very large scale nonlinear SVMs using Alternating Direction Method of Multipliers coupled with the Hierarchically Semi-Separable kernel approximations [0.0]
Nonlinear Support Vector Machines (SVMs) produce significantly higher classification quality when compared to linear ones. However, their computational complexity is prohibitive for large-scale datasets.
arXiv Detail & Related papers (2021-08-09T16:52:04Z)
- Nonlinear Least Squares for Large-Scale Machine Learning using Stochastic Jacobian Estimates [0.0]
We exploit the property that the number of model parameters typically exceeds the data in one batch to compute search directions.
We develop two algorithms that estimate Jacobian matrices and perform well when compared to state-of-the-art methods.
arXiv Detail & Related papers (2021-07-12T17:29:08Z)
- Hessian Eigenspectra of More Realistic Nonlinear Models [73.31363313577941]
We give a precise characterization of the Hessian eigenspectra for a broad family of nonlinear models.
Our analysis takes a step forward to identify the origin of many striking features observed in more complex machine learning models.
arXiv Detail & Related papers (2021-03-02T06:59:52Z)
- Sparse PCA via $l_{2,p}$-Norm Regularization for Unsupervised Feature Selection [138.97647716793333]
We propose a simple and efficient unsupervised feature selection method, by combining reconstruction error with $l_{2,p}$-norm regularization.
We present an efficient optimization algorithm to solve the proposed unsupervised model, and analyse the convergence and computational complexity of the algorithm theoretically.
arXiv Detail & Related papers (2020-12-29T04:08:38Z)
- Linear embedding of nonlinear dynamical systems and prospects for efficient quantum algorithms [74.17312533172291]
We describe a method for mapping any finite nonlinear dynamical system to an infinite linear dynamical system (embedding).
We then explore an approach for approximating the resulting infinite linear system with finite linear systems (truncation).
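The embedding-then-truncation idea has a classical (non-quantum) textbook instance, Carleman linearization, sketched below for the scalar ODE xdot = -x^2; the truncation order, step size, and variable names are illustrative, and none of this reflects the paper's quantum-algorithm construction. The monomials y_n = x^n obey ydot_n = -n * y_{n+1}, an infinite linear system; truncating at order N and integrating gives an approximation of the exact solution x(t) = x0 / (1 + x0 * t).

```python
import numpy as np

N = 6                      # truncation order (illustrative)
X0, T, DT = 0.5, 1.0, 1e-3

# For xdot = -x^2, the lifted variables y_n = x^n satisfy ydot_n = -n * y_{n+1}.
# Truncation closes the system by setting y_{N+1} = 0.
A = np.zeros((N, N))
for n in range(1, N):
    A[n - 1, n] = -n

y = np.array([X0 ** n for n in range(1, N + 1)])  # lifted initial state
for _ in range(int(T / DT)):
    y = y + DT * (A @ y)                           # explicit Euler on the linear system

x_lin = y[0]                       # linear-embedding approximation of x(T)
x_true = X0 / (1.0 + X0 * T)       # exact solution of the nonlinear ODE
```

Because |x| stays below 1, the neglected tail y_{N+1} = x^{N+1} is tiny and the truncated linear system tracks the nonlinear solution closely over this horizon.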
arXiv Detail & Related papers (2020-12-12T00:01:10Z)
- Sparse Quantized Spectral Clustering [85.77233010209368]
We exploit tools from random matrix theory to make precise statements about how the eigenspectrum of a matrix changes under such nonlinear transformations.
We show that very little change occurs in the informative eigenstructure even under drastic sparsification/quantization.
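A toy numerical check in the spirit of that claim (not the paper's random-matrix analysis; data, threshold, and names are illustrative): binarize a Gaussian affinity matrix for two well-separated clusters and verify that the cluster-indicator direction still lies almost entirely in the top eigenspace.

```python
import numpy as np

rng = np.random.default_rng(2)
# Two well-separated point clusters in the plane, 30 points each.
pts = np.vstack([rng.normal([0.0, 0.0], 0.3, (30, 2)),
                 rng.normal([4.0, 0.0], 0.3, (30, 2))])

# Dense Gaussian affinity matrix, then a drastic {0,1} quantization of it.
d2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
W = np.exp(-d2)
Wq = (W > 0.5).astype(float)

# Cluster-indicator direction the informative eigenstructure should retain.
u = np.concatenate([np.ones(30), -np.ones(30)]) / np.sqrt(60)

def indicator_energy(M, u):
    """Norm of u's projection onto the span of M's top-2 eigenvectors."""
    _, vecs = np.linalg.eigh(M)        # eigenvalues in ascending order
    return float(np.linalg.norm(vecs[:, -2:].T @ u))

e_full = indicator_energy(W, u)
e_quant = indicator_energy(Wq, u)
```

Both projections come out near 1, i.e., even the crude binarization keeps the eigenspace that spectral clustering relies on, which is the qualitative phenomenon the paper makes precise.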
arXiv Detail & Related papers (2020-10-03T15:58:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences arising from its use.