Learning with Asymmetric Kernels: Least Squares and Feature
Interpretation
- URL: http://arxiv.org/abs/2202.01397v1
- Date: Thu, 3 Feb 2022 04:16:20 GMT
- Title: Learning with Asymmetric Kernels: Least Squares and Feature
Interpretation
- Authors: Mingzhen He, Fan He, Lei Shi, Xiaolin Huang and Johan A.K. Suykens
- Abstract summary: Asymmetric kernels naturally exist in real life, e.g., for conditional probability and directed graphs.
This paper addresses asymmetric kernel-based learning in the framework of the least squares support vector machine, yielding a method named AsK-LS.
We will show that AsK-LS can learn with asymmetric features, namely source and target features, while the kernel trick remains applicable.
- Score: 28.82444091193872
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Asymmetric kernels naturally exist in real life, e.g., for conditional
probability and directed graphs. However, most of the existing kernel-based
learning methods require kernels to be symmetric, which prevents the use of
asymmetric kernels. This paper addresses asymmetric kernel-based learning in
the framework of the least squares support vector machine, named AsK-LS,
resulting in the first classification method that can use asymmetric
kernels directly. We will show that AsK-LS can learn with asymmetric features,
namely source and target features, while the kernel trick remains applicable,
i.e., the source and target features exist but are not necessarily known.
Moreover, the computational cost of AsK-LS is no higher than that of dealing
with symmetric kernels. Experimental results on the Corel database, directed
graphs, and the UCI database will show that when asymmetric information is
crucial, the proposed AsK-LS can learn with asymmetric kernels and performs
much better than existing kernel methods, which must symmetrize asymmetric
kernels to accommodate them.
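As a rough illustration of the mechanics involved: the classical LS-SVM dual reduces training to a single linear system, and the sketch below simply plugs an asymmetric Gram matrix into that system. The kernel form and the formulation are hypothetical simplifications; the paper's actual AsK-LS derivation, with distinct source and target features, differs in detail.

```python
import numpy as np

def asym_kernel(X, Z, W):
    # Hypothetical asymmetric kernel for illustration:
    # k(x, z) = exp(-||x - W z||^2), so in general k(x, z) != k(z, x).
    diff = X[:, None, :] - (Z @ W.T)[None, :, :]
    return np.exp(-np.sum(diff ** 2, axis=-1))

def fit_ls_svm(K, y, gamma=1.0):
    # Classical LS-SVM dual: solve the linear system
    #   [ 0    y^T             ] [ b     ]   [ 0 ]
    #   [ y    Omega + I/gamma ] [ alpha ] = [ 1 ]
    # with Omega_ij = y_i y_j K_ij; K is used as-is, symmetric or not.
    n = len(y)
    omega = np.outer(y, y) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def predict(K_test, alpha, y, b):
    # f(x) = sign( sum_i alpha_i y_i k(x, x_i) + b )
    return np.sign(K_test @ (alpha * y) + b)
```

Because training is one dense solve, the cost matches the symmetric LS-SVM case, consistent with the abstract's claim about computational burden.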
Related papers
- Geometric Learning with Positively Decomposable Kernels [6.5497574505866885]
We propose the use of reproducing kernel Krein space (RKKS) based methods, which require only kernels that admit a positive decomposition.
We show that one does not need to access this decomposition in order to learn in RKKS.
arXiv Detail & Related papers (2023-10-20T21:18:04Z)
- Nonlinear SVD with Asymmetric Kernels: feature learning and asymmetric Nyström method [14.470859959783995]
Asymmetric data naturally exist in real life, such as directed graphs.
This paper tackles the asymmetric kernel-based learning problem.
Experiments show that asymmetric KSVD learns features that outperform those of Mercer kernels.
arXiv Detail & Related papers (2023-06-12T11:39:34Z)
- Random Fourier Features for Asymmetric Kernels [24.20121243104385]
We introduce a complex measure with the real and imaginary parts corresponding to four finite positive measures, which expands the application scope of the Bochner theorem.
This framework allows for handling classical symmetric, PD kernels via one positive measure; symmetric, non-positive definite kernels via signed measures; and asymmetric kernels via complex measures.
Our AsK-RFFs method is empirically validated on several typical large-scale datasets and achieves promising kernel approximation performance.
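The classical PD special case mentioned above (one positive measure) is the standard Bochner construction; a minimal sketch for the Gaussian kernel follows, with all parameter choices being illustrative assumptions rather than the paper's setup.

```python
import numpy as np

def rff_features(X, n_features, sigma=1.0, seed=0):
    # Classical random Fourier features for the Gaussian (RBF) kernel
    # k(x, z) = exp(-||x - z||^2 / (2 sigma^2)): by Bochner's theorem,
    # sample frequencies from the kernel's Gaussian spectral measure.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    omega = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    bias = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    # Inner products of these features approximate the kernel in expectation.
    return np.sqrt(2.0 / n_features) * np.cos(X @ omega + bias)
```

The asymmetric extension (AsK-RFFs) replaces the single positive spectral measure with a complex measure, but the sampling-and-cosine skeleton above is the common starting point.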
arXiv Detail & Related papers (2022-09-18T03:39:18Z)
- Local Sample-weighted Multiple Kernel Clustering with Consensus Discriminative Graph [73.68184322526338]
Multiple kernel clustering (MKC) is committed to achieving optimal information fusion from a set of base kernels.
This paper proposes a novel local sample-weighted multiple kernel clustering model.
Experimental results demonstrate that our LSWMKC possesses better local manifold representation and outperforms existing kernel- or graph-based clustering algorithms.
arXiv Detail & Related papers (2022-07-05T05:00:38Z)
- SOCKS: A Stochastic Optimal Control and Reachability Toolbox Using Kernel Methods [0.0]
SOCKS is a data-driven stochastic optimal control toolbox based on kernel methods.
We present the main features of SOCKS and demonstrate its capabilities on several benchmarks.
arXiv Detail & Related papers (2022-03-12T00:09:08Z)
- Meta-Learning Hypothesis Spaces for Sequential Decision-making [79.73213540203389]
We propose to meta-learn a kernel from offline data (Meta-KeL).
Under mild conditions, we guarantee that our estimated RKHS yields valid confidence sets.
We also empirically evaluate the effectiveness of our approach on a Bayesian optimization task.
arXiv Detail & Related papers (2022-02-01T17:46:51Z) - A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures as infinite-dimensional elements of a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
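The empirical version of a mean embedding, and the RKHS distance (MMD) between two embeddings, can be sketched as follows; the Gaussian kernel and parameters are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def gaussian_gram(X, Z, sigma=1.0):
    # Gram matrix of the Gaussian kernel between two samples.
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def mmd_squared(X, Z, sigma=1.0):
    # Squared RKHS distance between the empirical mean embeddings
    # of two samples (maximum mean discrepancy):
    # ||mu_X - mu_Z||^2 = E k(x,x') - 2 E k(x,z) + E k(z,z').
    return (gaussian_gram(X, X, sigma).mean()
            - 2.0 * gaussian_gram(X, Z, sigma).mean()
            + gaussian_gram(Z, Z, sigma).mean())
```

When the kernel is characteristic (as the Gaussian kernel is), this distance is zero only if the two distributions coincide, which is what makes such embeddings usable as optimization objectives.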
arXiv Detail & Related papers (2021-06-18T08:33:45Z) - Taming Nonconvexity in Kernel Feature Selection---Favorable Properties
of the Laplace Kernel [77.73399781313893]
A key challenge is to design the objective function for kernel-based feature selection.
The gradient-based algorithms available for this nonconvex optimization can only guarantee convergence to local minima.
arXiv Detail & Related papers (2021-06-17T11:05:48Z) - Symmetric and antisymmetric kernels for machine learning problems in
quantum physics and chemistry [0.3441021278275805]
We derive symmetric and antisymmetric kernels by symmetrizing and antisymmetrizing conventional kernels.
We show that by exploiting symmetries or antisymmetries the size of the training data set can be significantly reduced.
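For a hypothetical two-particle system, symmetrizing and antisymmetrizing a conventional kernel over the particle-swap permutation can be sketched as below; the base kernel and all names are illustrative assumptions.

```python
import numpy as np

def base_kernel(x, y, sigma=1.0):
    # Conventional Gaussian kernel on concatenated two-particle coordinates.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def swap(x):
    # Exchange the two particles' coordinate blocks.
    d = len(x) // 2
    return np.concatenate([x[d:], x[:d]])

def sym_kernel(x, y):
    # Symmetrized over the 2-element permutation group:
    # k_s(x, y) = (1/2) [k(x, y) + k(swap(x), y)]
    return 0.5 * (base_kernel(x, y) + base_kernel(swap(x), y))

def antisym_kernel(x, y):
    # Antisymmetrized: k_a(x, y) = (1/2) [k(x, y) - k(swap(x), y)]
    return 0.5 * (base_kernel(x, y) - base_kernel(swap(x), y))
```

By construction, `sym_kernel` is invariant under swapping the particles in its first argument and `antisym_kernel` changes sign, so models built on them respect the bosonic or fermionic exchange symmetry without needing training data for every permutation.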
arXiv Detail & Related papers (2021-03-31T17:32:27Z) - Flow-based Kernel Prior with Application to Blind Super-Resolution [143.21527713002354]
Kernel estimation is generally one of the key problems for blind image super-resolution (SR).
This paper proposes a normalizing flow-based kernel prior (FKP) for kernel modeling.
Experiments on synthetic and real-world images demonstrate that the proposed FKP can significantly improve the kernel estimation accuracy.
arXiv Detail & Related papers (2021-03-29T22:37:06Z) - Isolation Distributional Kernel: A New Tool for Point & Group Anomaly
Detection [76.1522587605852]
Isolation Distributional Kernel (IDK) is a new way to measure the similarity between two distributions.
We demonstrate IDK's efficacy and efficiency as a new tool for kernel-based anomaly detection of both point and group anomalies.
arXiv Detail & Related papers (2020-09-24T12:25:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.