Manifold Adaptive Multiple Kernel K-Means for Clustering
- URL: http://arxiv.org/abs/2009.14389v1
- Date: Wed, 30 Sep 2020 02:07:53 GMT
- Title: Manifold Adaptive Multiple Kernel K-Means for Clustering
- Authors: Liang Du, Haiying Zhang, Xin Ren, Xiaolin Lv
- Abstract summary: We adopt the manifold adaptive kernel, instead of the original kernel, to integrate the local manifold structure of kernels.
It has been verified that the proposed method outperforms several state-of-the-art baseline methods on a variety of data sets.
- Score: 5.8671688602326215
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multiple kernel methods based on k-means aim to integrate a group of kernels
to improve the performance of kernel k-means clustering. However, we observe
that most existing multiple kernel k-means methods exploit the nonlinear
relationship within kernels, whereas the local manifold structure among
multiple kernels is not sufficiently considered. In this paper, we adopt
the manifold adaptive kernel, instead of the original kernel, to integrate the
local manifold structure of kernels. Thus, the induced multiple manifold
adaptive kernels not only reflect the nonlinear relationship but also the local
manifold structure. We then perform multiple kernel clustering within the
multiple kernel k-means clustering framework. It has been verified that the
proposed method outperforms several state-of-the-art baseline methods on a
variety of data sets.
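As a rough illustration of the pipeline described above (not the authors' released code), each base kernel can be deformed by a graph-Laplacian term into a manifold adaptive kernel, the deformed kernels combined with weights, and kernel k-means run via its spectral relaxation. The RBF kernels, `gamma`/`mu` values, and uniform weights below are illustrative assumptions; the actual method learns the kernel weights.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) gram matrix from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2 * X @ X.T, 0.0)
    return np.exp(-gamma * d2)

def knn_laplacian(X, k=5):
    # Unnormalized graph Laplacian L = D - W of a symmetrized k-NN graph.
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    nn = np.argsort(d2, axis=1)[:, 1:k + 1]   # skip self (column 0)
    for i in range(n):
        W[i, nn[i]] = 1.0
    W = np.maximum(W, W.T)
    return np.diag(W.sum(axis=1)) - W

def manifold_adaptive_kernel(K, L, mu=0.1):
    # Deformed gram matrix K~ = K - K (I + mu L K)^{-1} mu L K, i.e. the
    # kernel of the RKHS warped toward the data manifold by the Laplacian.
    n = K.shape[0]
    M = mu * L
    Kt = K - K @ np.linalg.solve(np.eye(n) + M @ K, M @ K)
    return 0.5 * (Kt + Kt.T)   # symmetrize against rounding error

def kernel_kmeans_spectral(K, c):
    # Spectral relaxation of kernel k-means: Lloyd's algorithm on the
    # rows of the top-c eigenvectors of K (farthest-point initialization).
    _, vecs = np.linalg.eigh(K)
    H = vecs[:, -c:]
    centers = [H[0]]
    for _ in range(c - 1):
        dist = np.min([((H - ctr) ** 2).sum(1) for ctr in centers], axis=0)
        centers.append(H[int(np.argmax(dist))])
    centers = np.array(centers)
    for _ in range(100):
        labels = np.argmin(((H[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        new = np.array([H[labels == j].mean(0) if np.any(labels == j) else centers[j]
                        for j in range(c)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels

# Toy run: two well-separated blobs, two base kernels, uniform weights.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
L = knn_laplacian(X, k=5)
deformed = [manifold_adaptive_kernel(rbf_kernel(X, g), L) for g in (0.5, 2.0)]
K = sum(0.5 * Kp for Kp in deformed)   # uniform weights for the sketch
labels = kernel_kmeans_spectral(K, c=2)
```

Because the deformed kernels still reflect both the nonlinear relationship and the graph structure, the spectral step recovers the two blobs as clusters in this toy setting.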
Related papers
- Multiple Kernel Clustering via Local Regression Integration [4.856913393644719]
Most multiple kernel methods give little consideration to the intrinsic manifold structure of multiple kernel data.
This paper first presents a clustering method via kernelized local regression (CKLR).
We then extend it to perform clustering via multiple kernel local regression (CMKLR).
arXiv Detail & Related papers (2024-10-20T06:26:29Z)
- MOKD: Cross-domain Finetuning for Few-shot Classification via Maximizing Optimized Kernel Dependence [97.93517982908007]
In cross-domain few-shot classification, NCC aims to learn representations to construct a metric space where few-shot classification can be performed.
In this paper, we find that there exist high similarities between NCC-learned representations of two samples from different classes.
We propose a bi-level optimization framework, maximizing optimized kernel dependence (MOKD), to learn a set of class-specific representations that match the cluster structures indicated by labeled data.
arXiv Detail & Related papers (2024-05-29T05:59:52Z)
- Local Sample-weighted Multiple Kernel Clustering with Consensus Discriminative Graph [73.68184322526338]
Multiple kernel clustering (MKC) is committed to achieving optimal information fusion from a set of base kernels.
This paper proposes a novel local sample-weighted multiple kernel clustering model.
Experimental results demonstrate that our LSWMKC possesses better local manifold representation and outperforms existing kernel- or graph-based clustering algorithms.
arXiv Detail & Related papers (2022-07-05T05:00:38Z)
- Recovery Guarantees for Kernel-based Clustering under Non-parametric Mixture Models [26.847612684502998]
We study the statistical performance of kernel-based clustering algorithms under non-parametric mixture models.
We establish a key equivalence between kernel-based data-clustering and kernel density-based clustering.
arXiv Detail & Related papers (2021-10-18T17:23:54Z)
- Kernel Mean Estimation by Marginalized Corrupted Distributions [96.9272743070371]
Estimating the kernel mean in a reproducing kernel Hilbert space is a critical component in many kernel learning algorithms.
We present a new kernel mean estimator, called the marginalized kernel mean estimator, which estimates the kernel mean under the corrupted distribution.
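For context, the baseline such estimators refine is the standard empirical kernel mean embedding, mu_hat(z) = (1/n) * sum_i k(x_i, z). A minimal numpy sketch (the RBF kernel and names are illustrative assumptions; this is not the paper's marginalized estimator):

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    # k(x, z) = exp(-gamma ||x - z||^2), evaluated pairwise.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def empirical_kernel_mean(X, kernel):
    # Returns z -> (1/n) * sum_i k(x_i, z): the empirical kernel mean
    # embedding of the sample X, evaluated at query points z.
    return lambda Z: np.mean(kernel(X, Z), axis=0)

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (500, 1))      # sample from N(0, 1)
mu = empirical_kernel_mean(X, rbf)
vals = mu(np.array([[0.0], [3.0]]))
# Under an RBF kernel the embedding of N(0, 1) is largest near the mode,
# so the query at 0 yields a larger value than the query at 3.
```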
arXiv Detail & Related papers (2021-07-10T15:11:28Z)
- Flow-based Kernel Prior with Application to Blind Super-Resolution [143.21527713002354]
Kernel estimation is generally one of the key problems for blind image super-resolution (SR).
This paper proposes a normalizing flow-based kernel prior (FKP) for kernel modeling.
Experiments on synthetic and real-world images demonstrate that the proposed FKP can significantly improve the kernel estimation accuracy.
arXiv Detail & Related papers (2021-03-29T22:37:06Z)
- Isolation Distributional Kernel: A New Tool for Point & Group Anomaly Detection [76.1522587605852]
Isolation Distributional Kernel (IDK) is a new way to measure the similarity between two distributions.
We demonstrate IDK's efficacy and efficiency as a new tool for kernel-based anomaly detection for both point and group anomalies.
arXiv Detail & Related papers (2020-09-24T12:25:43Z)
- SimpleMKKM: Simple Multiple Kernel K-means [49.500663154085586]
We propose a simple yet effective multiple kernel clustering algorithm, termed simple multiple kernel k-means (SimpleMKKM).
Our criterion is given by an intractable minimization-maximization problem in the kernel coefficient and clustering partition matrix.
We theoretically analyze the performance of SimpleMKKM in terms of its clustering generalization error.
arXiv Detail & Related papers (2020-05-11T10:06:40Z)
- Fast Kernel k-means Clustering Using Incomplete Cholesky Factorization [11.631064399465089]
Kernel-based clustering algorithms can identify and capture the non-linear structure in datasets,
and can therefore achieve better performance than linear clustering.
However, computing and storing the entire kernel matrix requires so much memory that kernel-based clustering has difficulty dealing with large-scale datasets.
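The memory bottleneck above is typically avoided with a low-rank factorization K ≈ G Gᵀ built one column at a time. A sketch of pivoted incomplete Cholesky (an illustrative implementation, not the paper's code): only one kernel column is evaluated per step, so the full n × n matrix is never stored, and linear k-means on the rows of G then approximates kernel k-means.

```python
import numpy as np

def incomplete_cholesky(X, kernel, max_rank, tol=1e-6):
    # Pivoted incomplete Cholesky: build G (n x r) with K ~= G @ G.T while
    # evaluating only one column of K per step (never the full matrix).
    n = len(X)
    G = np.zeros((n, max_rank))
    # Diagonal of K, i.e. the initial approximation residuals.
    d = np.array([kernel(X[i:i + 1], X[i:i + 1])[0, 0] for i in range(n)])
    for j in range(max_rank):
        i = int(np.argmax(d))
        if d[i] < tol:            # residual exhausted: stop early
            return G[:, :j]
        col = kernel(X, X[i:i + 1])[:, 0]       # one pivot column of K
        G[:, j] = (col - G[:, :j] @ G[i, :j]) / np.sqrt(d[i])
        d = np.maximum(d - G[:, j] ** 2, 0.0)
    return G

def rbf(X, Z, gamma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (200, 2))
smooth = lambda A, B: rbf(A, B, gamma=0.05)  # smooth kernel -> low effective rank
G = incomplete_cholesky(X, smooth, max_rank=80)
# Clustering would now run ordinary k-means on the rows of G; for reference,
# compare the factorization against the full gram matrix.
err = np.abs(smooth(X, X) - G @ G.T).max()
```

Running Lloyd's k-means on the rows of G costs O(n·r) per iteration instead of the O(n²) needed with the full kernel matrix.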
arXiv Detail & Related papers (2020-02-07T15:32:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.