Clustering by quantum annealing on three-level quantum elements qutrits
- URL: http://arxiv.org/abs/2102.09205v1
- Date: Thu, 18 Feb 2021 08:06:44 GMT
- Title: Clustering by quantum annealing on three-level quantum elements qutrits
- Authors: V. E. Zobov and I. S. Pichkovskiy
- Abstract summary: Clustering is the grouping of data by the proximity of some properties.
We report on the possibility of increasing the efficiency of clustering of points in a plane using artificial quantum neural networks.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Clustering is the grouping of data by the proximity of some properties. We
report on the possibility of increasing the efficiency of clustering of points in a
plane using artificial quantum neural networks, after replacing the two-level
neurons called qubits, represented by spins S = 1/2, with three-level neurons
called qutrits, represented by spins S = 1. The problem has been solved by a slow
adiabatic change of the Hamiltonian in time. Methods for controlling a qutrit
system using projection operators have been developed, and numerical simulation
has been performed. Hamiltonians for two well-known clustering methods, one-hot
encoding and k-means++, have been built. The first method has been used to
partition a set of six points into three or two clusters, and the second to
partition a set of nine points into three clusters and a set of seven points into
four clusters. The simulation has shown that the clustering problem can be
effectively solved on qutrits represented by spins S = 1. The advantages of
clustering on qutrits over clustering on qubits have been demonstrated. In
particular, the number of qutrits required to represent data points is smaller
than the number of qubits by a factor of log2 N / log3 N. Since, for qutrits, it
is easier to partition the data points into three clusters than into two, the
approximate hierarchical procedure of partitioning the data into a larger number
of clusters is accelerated. For exact partitioning of the data into more than
three clusters, it has been proposed to number the clusters by the states of the
corresponding multi-spin subsystems, instead of by the numbers of individual
spins. This further reduces the number of qutrits required to implement the
algorithm (N log3 K instead of NK).
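The qutrit-count advantage stated in the abstract can be checked with a few lines of arithmetic. The sketch below only illustrates the counting argument (the function names are ours, not the paper's): NK qubits for one-hot encoding versus N * ceil(log3 K) qutrits when clusters are numbered by the states of a multi-qutrit subsystem.

```python
import math

def qubits_one_hot(N, K):
    """One-hot encoding on qubits: one two-level spin per (point, cluster) pair."""
    return N * K

def qutrits_multispin(N, K):
    """Numbering the K clusters by states of a multi-qutrit subsystem:
    ceil(log3 K) three-level spins per point suffice to label K clusters."""
    return N * math.ceil(math.log(K, 3))

# Example: N = 9 points, K = 3 clusters (the paper's k-means++ run).
print(qubits_one_hot(9, 3))     # NK = 27 qubits
print(qutrits_multispin(9, 3))  # N * ceil(log3 K) = 9 qutrits
```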
Related papers
- Clustering Based on Density Propagation and Subcluster Merging [92.15924057172195]
We propose a density-based node clustering approach that automatically determines the number of clusters and can be applied in both data space and graph space.
Unlike traditional density-based clustering methods, which necessitate calculating the distance between any two nodes, our proposed technique determines density through a propagation process.
arXiv Detail & Related papers (2024-11-04T04:09:36Z)
- Instance-Optimal Cluster Recovery in the Labeled Stochastic Block Model [79.46465138631592]
We devise an efficient algorithm that recovers clusters using the observed labels.
We present Instance-Adaptive Clustering (IAC), the first algorithm whose performance matches these lower bounds both in expectation and with high probability.
arXiv Detail & Related papers (2023-06-18T08:46:06Z)
- Rethinking k-means from manifold learning perspective [122.38667613245151]
We present a new clustering algorithm which directly detects clusters of data without mean estimation.
Specifically, we construct a distance matrix between data points using a Butterworth filter.
To well exploit the complementary information embedded in different views, we leverage the tensor Schatten p-norm regularization.
arXiv Detail & Related papers (2023-05-12T03:01:41Z)
- DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z)
- DRBM-ClustNet: A Deep Restricted Boltzmann-Kohonen Architecture for Data Clustering [0.0]
A Bayesian Deep Restricted Boltzmann-Kohonen architecture for data clustering, termed DRBM-ClustNet, is proposed.
The processing of unlabeled data is done in three stages for efficient clustering of the non-linearly separable datasets.
The framework is evaluated based on clustering accuracy and ranked against other state-of-the-art clustering methods.
arXiv Detail & Related papers (2022-05-13T15:12:18Z)
- Fast and explainable clustering based on sorting [0.0]
We introduce a fast and explainable clustering method called CLASSIX.
The algorithm is controlled by two scalar parameters, namely a distance parameter for the aggregation and another parameter controlling the minimal cluster size.
Our experiments demonstrate that CLASSIX competes with state-of-the-art clustering algorithms.
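As a rough illustration of why sorting makes aggregation cheap, here is a minimal 1-D sketch in the spirit of CLASSIX's aggregation phase. It is not the actual CLASSIX implementation (which sorts along a principal axis and adds a merging step), and `radius` is only a stand-in for the distance parameter described above.

```python
def classix_like_aggregate(xs, radius=0.5):
    """Greedy aggregation after sorting: sweep the sorted points once,
    starting a new group whenever a point lies farther than `radius`
    from the current group's starting point."""
    groups, start = [], None
    for x in sorted(xs):
        if start is None or x - start > radius:
            groups.append([x])  # open a new group anchored at x
            start = x
        else:
            groups[-1].append(x)
    return groups

# Two well-separated runs of points -> two groups in a single linear sweep.
groups = classix_like_aggregate([0.0, 0.2, 0.4, 3.0, 3.1], radius=0.5)
print(len(groups))  # 2
```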
arXiv Detail & Related papers (2022-02-03T08:24:21Z)
- K-Splits: Improved K-Means Clustering Algorithm to Automatically Detect the Number of Clusters [0.12313056815753944]
This paper introduces k-splits, an improved hierarchical algorithm based on k-means to cluster data without prior knowledge of the number of clusters.
Accuracy and speed are two main advantages of the proposed method.
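The hierarchical splitting idea can be sketched in plain Python: repeatedly bisect the widest cluster with a 2-means step until every cluster is tight. The `spread` stopping rule below is our own stand-in assumption; the abstract does not spell out k-splits' actual criteria.

```python
def two_means_1d(xs, iters=20):
    """Lloyd's algorithm with k = 2 on 1-D data (plain-Python sketch)."""
    c1, c2 = min(xs), max(xs)
    for _ in range(iters):
        left = [x for x in xs if abs(x - c1) <= abs(x - c2)]
        right = [x for x in xs if abs(x - c1) > abs(x - c2)]
        if not left or not right:
            break
        c1, c2 = sum(left) / len(left), sum(right) / len(right)
    return left, right

def k_splits_like(xs, spread=1.0):
    """Recursively bisect the widest cluster until every cluster's extent
    (max - min) is at most `spread`, so the number of clusters is found
    without being given in advance."""
    clusters, queue = [], [sorted(xs)]
    while queue:
        c = queue.pop()
        if len(c) < 2 or max(c) - min(c) <= spread:
            clusters.append(c)
        else:
            queue.extend(two_means_1d(c))
    return clusters

# Three well-separated 1-D groups are recovered automatically.
parts = k_splits_like([0.0, 0.1, 0.2, 5.0, 5.1, 9.9, 10.0])
print(len(parts))  # 3
```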
arXiv Detail & Related papers (2021-10-09T23:02:57Z)
- Experimental Determination of Multi-Qubit Ground State via a Cluster Mean-Field Algorithm [1.9790421227325208]
A quantum eigensolver is designed under a multi-layer cluster mean-field algorithm.
The method is numerically verified in multi-spin chains and experimentally studied in a fully-connected three-spin network.
arXiv Detail & Related papers (2021-10-03T07:12:45Z)
- Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
- Quantum Spectral Clustering [5.414308305392762]
Spectral clustering is a powerful machine learning algorithm for clustering data with non-convex or nested structures.
We propose an end-to-end quantum algorithm for spectral clustering, extending a number of works in quantum machine learning.
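For context, the classical pipeline that the quantum algorithm extends can be sketched with NumPy: build a similarity graph, form its Laplacian, and read cluster structure from the low eigenvalues. This is textbook spectral clustering, not the proposed quantum algorithm.

```python
import numpy as np

# Similarity graph for four points forming two obvious groups {0,1} and {2,3}.
W = np.array([[0., 1., 0., 0.],
              [1., 0., 0., 0.],
              [0., 0., 0., 1.],
              [0., 0., 1., 0.]])

D = np.diag(W.sum(axis=1))
L = D - W  # unnormalized graph Laplacian

eigvals = np.linalg.eigvalsh(L)
# The multiplicity of eigenvalue 0 equals the number of connected
# components of the similarity graph -- here, the two clusters.
n_clusters = int(np.sum(np.isclose(eigvals, 0.0)))
print(n_clusters)  # 2
# In the full pipeline, the rows of the corresponding eigenvectors are
# embedded and grouped with k-means to produce cluster labels.
```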
arXiv Detail & Related papers (2020-07-01T07:11:42Z)
- Ball k-means [53.89505717006118]
The Ball k-means algorithm uses a ball to describe each cluster, focusing on reducing point-centroid distance computations.
Its fast speed, absence of extra parameters, and simple design make Ball k-means an all-around replacement for the naive k-means algorithm.
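A simplified version of the distance-saving idea can be shown with a triangle-inequality prune: if a center is far from the current best center relative to the point's current best distance, its distance to the point never needs computing. This is a generic Elkan-style bound used here for illustration, not the paper's exact ball partition.

```python
from math import dist  # Python 3.8+: Euclidean distance between two points

def assign_with_pruning(x, centers):
    """Assign x to its nearest center, skipping centers ruled out by the
    triangle inequality: if ||c_best - c_j|| >= 2 * ||x - c_best||, then
    c_j cannot be closer to x than c_best. Returns the winning index and
    how many point-center distances were actually computed."""
    best, best_d = 0, dist(x, centers[0])
    checked = 1
    for j in range(1, len(centers)):
        if dist(centers[best], centers[j]) >= 2 * best_d:
            continue  # pruned: dist(x, centers[j]) is never computed
        checked += 1
        d = dist(x, centers[j])
        if d < best_d:
            best, best_d = j, d
    return best, checked

# The far-away third center is pruned without a point-center distance call.
centers = [(0.0, 0.0), (0.1, 0.0), (100.0, 100.0)]
best, checked = assign_with_pruning((0.2, 0.1), centers)
print(best, checked)  # 1 2
```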
arXiv Detail & Related papers (2020-05-02T10:39:26Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.