Sparse Quantized Spectral Clustering
- URL: http://arxiv.org/abs/2010.01376v1
- Date: Sat, 3 Oct 2020 15:58:07 GMT
- Title: Sparse Quantized Spectral Clustering
- Authors: Zhenyu Liao, Romain Couillet, Michael W. Mahoney
- Abstract summary: We exploit tools from random matrix theory to make precise statements about how the eigenspectrum of a matrix changes under such nonlinear transformations.
We show that very little change occurs in the informative eigenstructure even under drastic sparsification/quantization.
- Score: 85.77233010209368
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Given a large data matrix, sparsifying, quantizing, and/or performing other
entry-wise nonlinear operations can have numerous benefits, ranging from
speeding up iterative algorithms for core numerical linear algebra problems to
providing nonlinear filters to design state-of-the-art neural network models.
Here, we exploit tools from random matrix theory to make precise statements
about how the eigenspectrum of a matrix changes under such nonlinear
transformations. In particular, we show that very little change occurs in the
informative eigenstructure even under drastic sparsification/quantization, and
consequently that very little downstream performance loss occurs with very
aggressively sparsified or quantized spectral clustering. We illustrate how
these results depend on the nonlinearity, we characterize a phase transition
beyond which spectral clustering becomes possible, and we show when such
nonlinear transformations can introduce spurious non-informative eigenvectors.
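The abstract's central claim, that the informative eigenstructure survives drastic entry-wise quantization, can be illustrated with a minimal numerical sketch. The code below is not the authors' implementation; the two-class Gaussian mixture, the signal strength, and the 1-bit sign quantizer are arbitrary illustrative choices:

```python
import numpy as np

# Minimal sketch: spectral clustering after drastic 1-bit quantization.
# Two-class Gaussian mixture; all sizes and strengths are illustrative.
rng = np.random.default_rng(0)
n, p = 200, 400
y = np.repeat([0, 1], n // 2)                  # ground-truth classes
mu = np.full(p, 4.0 / np.sqrt(p))              # class-mean vector (signal)
X = rng.standard_normal((n, p)) + np.where(y[:, None] == 0, -mu, mu)

K = X @ X.T / p                                # dense Gram (similarity) matrix
K_quant = np.sign(K)                           # entry-wise 1-bit quantization

# The top eigenvector of the quantized matrix still carries class structure.
_, vecs = np.linalg.eigh(K_quant)              # eigenvalues in ascending order
v = vecs[:, -1]
labels = (v > 0).astype(int)

# The eigenvector sign is arbitrary, so score both labelings.
acc = max(np.mean(labels == y), np.mean(labels != y))
print(f"clustering accuracy after 1-bit quantization: {acc:.2f}")
```

Even though every off-diagonal similarity is reduced to a single bit, the leading eigenvector still separates the two classes, consistent with the phase-transition picture described above.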
Related papers
- An Efficient Method for Sample Adversarial Perturbations against Nonlinear Support Vector Machines [8.000799046379749]
We investigate sample adversarial perturbations for nonlinear support vector machines (SVMs).
Due to the implicit form of the nonlinear functions mapping data to the feature space, it is difficult to obtain the explicit form of the adversarial perturbations.
By exploring the special property of nonlinear SVMs, we transform the optimization problem of attacking nonlinear SVMs into a nonlinear KKT system.
arXiv Detail & Related papers (2022-06-12T05:21:51Z)
- Exploring Linear Feature Disentanglement For Neural Networks [63.20827189693117]
Non-linear activation functions, e.g., Sigmoid, ReLU, and Tanh, have achieved great success in neural networks (NNs).
Due to the complex non-linear characteristics of samples, the objective of those activation functions is to project samples from their original feature space to a linearly separable feature space.
This phenomenon ignites our interest in exploring whether all features need to be transformed by all non-linear functions in current typical NNs.
arXiv Detail & Related papers (2022-03-22T13:09:17Z)
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- Learning Nonlinear Waves in Plasmon-induced Transparency [0.0]
We consider a recurrent neural network (RNN) approach to predict the complex propagation of nonlinear solitons in plasmon-induced transparency metamaterial systems.
We demonstrate close agreement between simulation results and predictions from long short-term memory (LSTM) artificial neural networks.
arXiv Detail & Related papers (2021-07-31T21:21:44Z)
- Designing Kerr Interactions for Quantum Information Processing via Counterrotating Terms of Asymmetric Josephson-Junction Loops [68.8204255655161]
Static cavity nonlinearities typically limit the performance of bosonic quantum error-correcting codes.
Treating the nonlinearity as a perturbation, we derive effective Hamiltonians using the Schrieffer-Wolff transformation.
Results show that a cubic interaction makes it possible to increase the effective rates of both linear and nonlinear operations.
arXiv Detail & Related papers (2021-07-14T15:11:05Z)
- Hessian Eigenspectra of More Realistic Nonlinear Models [73.31363313577941]
We give a precise characterization of the Hessian eigenspectra for a broad family of nonlinear models.
Our analysis takes a step forward to identify the origin of many striking features observed in more complex machine learning models.
arXiv Detail & Related papers (2021-03-02T06:59:52Z)
- Learning Fast Approximations of Sparse Nonlinear Regression [50.00693981886832]
In this work, we bridge the gap by introducing the Nonlinear Learned Iterative Shrinkage-Thresholding Algorithm (NLISTA).
Experiments on synthetic data corroborate our theoretical results and show our method outperforms state-of-the-art methods.
arXiv Detail & Related papers (2020-10-26T11:31:08Z)
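For context on the entry above: (N)LISTA-style networks learn to accelerate the classical iterative shrinkage-thresholding loop. A plain ISTA baseline on a synthetic sparse-recovery problem can be sketched as follows (the dictionary, regularization weight, and iteration count are illustrative choices, not the paper's learned parameters):

```python
import numpy as np

def soft_threshold(z, t):
    """Entry-wise soft-thresholding, the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(1)
m, n, k = 80, 160, 5                     # measurements, dimension, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true                           # noiseless observations

lam = 0.01                               # l1 weight (illustrative)
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):                     # ISTA: gradient step, then shrinkage
    x = soft_threshold(x + A.T @ (b - A @ x) / L, lam / L)

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {err:.3f}")
```

Learned variants replace the fixed step size and threshold with trained, per-iteration parameters, which is what gives LISTA-family methods their speedup over this fixed-point iteration.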
This list is automatically generated from the titles and abstracts of the papers in this site.