Nonlinear SVD with Asymmetric Kernels: feature learning and asymmetric Nystr\"om method
- URL: http://arxiv.org/abs/2306.07040v1
- Date: Mon, 12 Jun 2023 11:39:34 GMT
- Title: Nonlinear SVD with Asymmetric Kernels: feature learning and asymmetric Nystr\"om method
- Authors: Qinghua Tao, Francesco Tonin, Panagiotis Patrinos, Johan A. K. Suykens
- Abstract summary: Asymmetric data naturally exist in real life, such as directed graphs.
This paper tackles the asymmetric kernel-based learning problem.
Experiments show that asymmetric KSVD learns features outperforming Mercer-kernel based methods.
- Score: 14.470859959783995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Asymmetric data naturally exist in real life, such as directed graphs.
Different from the common kernel methods requiring Mercer kernels, this paper
tackles the asymmetric kernel-based learning problem. We describe a nonlinear
extension of the matrix Singular Value Decomposition through asymmetric
kernels, namely KSVD. First, we construct two nonlinear feature mappings w.r.t.
rows and columns of the given data matrix. The proposed optimization problem
maximizes the variance of each mapping projected onto the subspace spanned by
the other, subject to a mutual orthogonality constraint. Through Lagrangian
duality, we show that it can be solved by the left and right singular vectors
in the feature space induced by the asymmetric kernel. Moreover, starting from
the integral equations with a pair of adjoint eigenfunctions corresponding to
the singular vectors of an asymmetric kernel, we extend the Nystr\"om method
to asymmetric cases through a finite sample approximation, which can be
applied to speed up the training in KSVD. Experiments show that asymmetric KSVD
learns features outperforming Mercer-kernel based methods that resort to
symmetrization, and also verify the effectiveness of the asymmetric Nystr\"om
method.
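The abstract's two ingredients, an SVD of the asymmetric kernel matrix and a Nystr\"om-style speedup, can be sketched in a few lines of NumPy. The `asym_rbf` kernel, the uniform landmark sampling, and the extension formulas below are illustrative assumptions for a minimal demo, not the paper's exact estimator.

```python
import numpy as np

def asym_rbf(X, Z, sx=1.0, sz=2.0):
    # Hypothetical asymmetric kernel: an RBF whose two arguments are
    # scaled differently, so K(x, z) != K(z, x) in general.
    D = X[:, None, :] / sx - Z[None, :, :] / sz
    return np.exp(-(D ** 2).sum(axis=-1))

def ksvd(G, k):
    # The coupled variance-maximization problem is solved by the left and
    # right singular vectors of the kernel matrix, so a plain truncated
    # SVD of G recovers the KSVD solution.
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k].T

def asym_nystrom(G, m, k, seed=0):
    # Nystrom-style acceleration (CUR-flavored sketch): SVD an m x m
    # landmark block, then extend both singular-vector families to all
    # rows/columns via the corresponding kernel slices.
    rng = np.random.default_rng(seed)
    r = rng.choice(G.shape[0], size=m, replace=False)
    c = rng.choice(G.shape[1], size=m, replace=False)
    Um, sm, Vmt = np.linalg.svd(G[np.ix_(r, c)])
    Um, sm, Vm = Um[:, :k], sm[:k], Vmt[:k].T
    U_ext = G[:, c] @ Vm / sm    # approximate left singular vectors
    V_ext = G[r, :].T @ Um / sm  # approximate right singular vectors
    return U_ext, V_ext
```

The point of the speedup: the full SVD costs cubic time in the data size, while the landmark SVD is only O(m^3); the paper derives its asymmetric extension from the pair of adjoint eigenfunctions in the integral equations rather than this plain subsampling heuristic.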
Related papers
- Learning in Feature Spaces via Coupled Covariances: Asymmetric Kernel SVD and Nyström method [21.16129116282759]
We introduce a new asymmetric learning paradigm based on coupled covariance eigenproblem (CCE)
We formalize the asymmetric Nystr\"om method through a finite sample approximation to speed up training.
arXiv Detail & Related papers (2024-06-13T02:12:18Z)
- SNEkhorn: Dimension Reduction with Symmetric Entropic Affinities [14.919246099820548]
Entropic affinities (EAs) are used in the popular Dimensionality Reduction (DR) algorithm t-SNE.
EAs are inherently asymmetric and row-wise stochastic, but they are used in DR approaches only after undergoing symmetrization.
In this work, we uncover a novel characterization of EA as an optimal transport problem, allowing a natural symmetrization that can be computed efficiently.
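As a rough illustration of the optimal-transport angle (a generic symmetric Sinkhorn scaling, not SNEkhorn's actual algorithm): projecting an affinity kernel onto symmetric doubly stochastic matrices gives a symmetrization that, unlike plain (P + P^T)/2 averaging, preserves the row normalization.

```python
import numpy as np

def sinkhorn_symmetrize(C, eps=1.0, iters=1000):
    # Symmetric Sinkhorn scaling (an assumption-level sketch of OT-style
    # symmetrization): find u > 0 such that P = diag(u) K diag(u) is
    # symmetric with unit row sums, for K = exp(-C / eps).
    K = np.exp(-C / eps)
    u = np.ones(len(K))
    for _ in range(iters):
        # damped fixed-point step for the condition u_i * (K u)_i = 1
        u = np.sqrt(u / (K @ u))
    return u[:, None] * K * u[None, :]
```

Because a single scaling vector multiplies both sides, P is symmetric by construction, and at the fixed point every row (hence every column) sums to one.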
arXiv Detail & Related papers (2023-05-23T08:08:10Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Learning with Asymmetric Kernels: Least Squares and Feature Interpretation [28.82444091193872]
Asymmetric kernels naturally exist in real life, e.g., for conditional probability and directed graphs.
This paper addresses the asymmetric kernel-based learning in the framework of the least squares support vector machine named AsK-LS.
We will show that AsK-LS can learn with asymmetric features, namely source and target features, while the kernel trick remains applicable.
arXiv Detail & Related papers (2022-02-03T04:16:20Z)
- When Random Tensors meet Random Matrices [50.568841545067144]
This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to that of an equivalent spiked symmetric block-wise random matrix.
arXiv Detail & Related papers (2021-12-23T04:05:01Z)
- Symmetric and antisymmetric kernels for machine learning problems in quantum physics and chemistry [0.3441021278275805]
We derive symmetric and antisymmetric kernels by symmetrizing and antisymmetrizing conventional kernels.
We show that by exploiting symmetries or antisymmetries the size of the training data set can be significantly reduced.
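A minimal two-coordinate instance of that construction (assumed setup: one exchangeable "particle" per coordinate, so the permutation group is just the swap; the helper names are ours, not the paper's):

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Conventional (unsymmetrized) Gaussian kernel.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def sym_kernel(k, x, y):
    # Symmetrized kernel: average k over the permutations of x's two
    # coordinates, making it invariant under particle exchange.
    return 0.5 * (k(x, y) + k(x[::-1], y))

def antisym_kernel(k, x, y):
    # Antisymmetrized counterpart: the signed combination flips sign
    # when the two coordinates of x are swapped.
    return 0.5 * (k(x, y) - k(x[::-1], y))
```

Exchange symmetry is the mechanism behind the reduced training-set size: every permuted copy of a training point is covered by the kernel itself rather than by extra data.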
arXiv Detail & Related papers (2021-03-31T17:32:27Z)
- A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z)
- Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z)
- Semiparametric Nonlinear Bipartite Graph Representation Learning with Provable Guarantees [106.91654068632882]
We consider the bipartite graph and formalize its representation learning problem as a statistical estimation problem of parameters in a semiparametric exponential family distribution.
We show that the proposed objective is strongly convex in a neighborhood around the ground truth, so that a gradient descent-based method achieves linear convergence rate.
Our estimator is robust to any model misspecification within the exponential family, which is validated in extensive experiments.
arXiv Detail & Related papers (2020-03-02T16:40:36Z)
- Inverse Learning of Symmetries [71.62109774068064]
We learn the symmetry transformation with a model consisting of two latent subspaces.
Our approach is based on the deep information bottleneck in combination with a continuous mutual information regulariser.
Our model outperforms state-of-the-art methods on artificial and molecular datasets.
arXiv Detail & Related papers (2020-02-07T13:48:52Z)
- The quantum marginal problem for symmetric states: applications to variational optimization, nonlocality and self-testing [0.0]
We present a method to solve the quantum marginal problem for symmetric $d$-level systems.
We illustrate the applicability of the method in central quantum information problems with several exemplary case studies.
arXiv Detail & Related papers (2020-01-13T18:20:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.