Parametric t-Stochastic Neighbor Embedding With Quantum Neural Network
- URL: http://arxiv.org/abs/2202.04238v1
- Date: Wed, 9 Feb 2022 02:49:54 GMT
- Title: Parametric t-Stochastic Neighbor Embedding With Quantum Neural Network
- Authors: Yoshiaki Kawase, Kosuke Mitarai, Keisuke Fujii
- Abstract summary: t-Stochastic Neighbor Embedding (t-SNE) is a non-parametric data visualization method in classical machine learning.
We propose to use quantum neural networks for parametric t-SNE to reflect the characteristics of high-dimensional quantum data on low-dimensional data.
- Score: 0.6946929968559495
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: t-Stochastic Neighbor Embedding (t-SNE) is a non-parametric data
visualization method in classical machine learning. It maps the data from the
high-dimensional space into a low-dimensional space, especially a
two-dimensional plane, while maintaining the relationship, or similarities,
between the surrounding points. In t-SNE, the initial position of the
low-dimensional data is randomly determined, and the visualization is achieved
by moving the low-dimensional data to minimize a cost function. Its variant
called parametric t-SNE uses neural networks for this mapping. In this paper,
we propose to use quantum neural networks for parametric t-SNE to reflect the
characteristics of high-dimensional quantum data on low-dimensional data. We
use fidelity-based metrics instead of Euclidean distance in calculating
high-dimensional data similarity. We visualize both classical (Iris dataset)
and quantum (time-dependent Hamiltonian dynamics) data for classification
tasks. Since this method allows us to represent a quantum dataset in a higher
dimensional Hilbert space by a quantum dataset in a lower dimension while
keeping their similarity, the proposed method can also be used to compress
quantum data for further quantum machine learning.
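As a rough illustration of the fidelity-based similarity the abstract describes, the sketch below computes pairwise fidelities between pure states and converts them into t-SNE-style conditional probabilities. This is a minimal sketch under our own assumptions (pure states stored as normalized NumPy vectors, a fixed bandwidth, and `1 - fidelity` used as the dissimilarity); the function names are illustrative and not taken from the paper.

```python
import numpy as np

def fidelity_matrix(states):
    """Pairwise fidelity |<psi_i|psi_j>|^2 for pure states (rows of `states`)."""
    overlaps = states.conj() @ states.T          # Gram matrix of inner products
    return np.abs(overlaps) ** 2

def high_dim_similarities(states, sigma=1.0):
    """t-SNE-style conditional probabilities, with the fidelity-based
    dissimilarity d_ij = 1 - F_ij in place of squared Euclidean distance."""
    d = 1.0 - fidelity_matrix(states)            # dissimilarity in [0, 1]
    logits = -d / (2 * sigma**2)
    np.fill_diagonal(logits, -np.inf)            # exclude self-similarity
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)      # normalize each row

# Toy example: three random normalized 2-qubit states.
rng = np.random.default_rng(0)
raw = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))
states = raw / np.linalg.norm(raw, axis=1, keepdims=True)
P = high_dim_similarities(states)
```

In the paper's setting the low-dimensional embedding would then be trained by minimizing the divergence between these probabilities and their low-dimensional counterparts, as in standard t-SNE.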
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- An example of use of Variational Methods in Quantum Machine Learning [0.0]
This paper introduces a quantum neural network for the binary classification of points of a specific geometric pattern on a plane.
The intention was to produce a quantum deep neural network with the minimum number of trainable parameters capable of correctly recognising and classifying points.
arXiv Detail & Related papers (2022-08-07T03:52:42Z)
- Intrinsic dimension estimation for discrete metrics [65.5438227932088]
In this letter we introduce an algorithm to infer the intrinsic dimension (ID) of datasets embedded in discrete spaces.
We demonstrate its accuracy on benchmark datasets, and we apply it to analyze a metagenomic dataset for species fingerprinting.
This suggests that evolutionary pressure acts on a low-dimensional manifold despite the high dimensionality of sequence space.
arXiv Detail & Related papers (2022-07-20T06:38:36Z)
- Towards Quantum Graph Neural Networks: An Ego-Graph Learning Approach [47.19265172105025]
We propose a novel hybrid quantum-classical algorithm for graph-structured data, which we refer to as the Ego-graph based Quantum Graph Neural Network (egoQGNN).
egoQGNN implements the GNN theoretical framework using the tensor product and unitary matrix representation, which greatly reduces the number of model parameters required.
The architecture is based on a novel mapping from real-world data to Hilbert space.
arXiv Detail & Related papers (2022-01-13T16:35:45Z)
- Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
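The Canonical/Polyadic constraint mentioned in this summary can be sketched as follows: the response of a unit to a multilinear input is computed directly from per-mode factor matrices, without vectorizing the input or materializing the full weight tensor. This is an illustrative sketch under our own assumptions, not the paper's implementation; all names are ours.

```python
import numpy as np

def rank_r_response(X, factors, weights):
    """Response <X, W> of one Rank-R unit to a multilinear input X, where
    W = sum_r weights[r] * outer(factors[0][:, r], ..., factors[-1][:, r])
    is represented only through its CP factors, never materialized."""
    R = factors[0].shape[1]
    out = 0.0
    for r in range(R):
        contrib = X
        for f in factors:                        # contract one mode per factor
            contrib = np.tensordot(contrib, f[:, r], axes=([0], [0]))
        out += weights[r] * contrib              # contrib is now a scalar
    return out

# Toy 3x4 input contracted against a rank-2 CP weight.
rng = np.random.default_rng(1)
X = rng.normal(size=(3, 4))
factors = [rng.normal(size=(3, 2)), rng.normal(size=(4, 2))]
w = np.ones(2)
y = rank_r_response(X, factors, w)
```

For a matrix input this reduces to `sum_r w[r] * a_r.T @ X @ b_r`, which is why the parameter count grows linearly in the rank rather than with the full tensor size.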
arXiv Detail & Related papers (2021-04-11T16:37:32Z)
- Tree tensor network classifiers for machine learning: from quantum-inspired to quantum-assisted [0.0]
We describe a quantum-assisted machine learning (QAML) method in which multivariate data is encoded into quantum states in a Hilbert space whose dimension is exponentially large in the length of the data vector.
We present an approach that can be implemented on gate-based quantum computing devices.
arXiv Detail & Related papers (2021-04-06T02:31:48Z)
- A Local Similarity-Preserving Framework for Nonlinear Dimensionality Reduction with Neural Networks [56.068488417457935]
We propose a novel local nonlinear approach named Vec2vec for general purpose dimensionality reduction.
To train the neural network, we build the neighborhood similarity graph of a matrix and define the context of data points.
Experiments of data classification and clustering on eight real datasets show that Vec2vec is better than several classical dimensionality reduction methods in the statistical hypothesis test.
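A neighborhood similarity graph of the kind this summary mentions can be sketched as a k-nearest-neighbor graph under cosine similarity (an assumption on our part; the paper's exact construction may differ):

```python
import numpy as np

def knn_similarity_graph(X, k=2):
    """Boolean adjacency of a k-NN graph under cosine similarity; each
    point's 'context' is its k most similar neighbors (illustrative only)."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    sim = Xn @ Xn.T                              # cosine similarity matrix
    np.fill_diagonal(sim, -np.inf)               # no self-edges
    adj = np.zeros_like(sim, dtype=bool)
    idx = np.argsort(-sim, axis=1)[:, :k]        # top-k neighbors per row
    adj[np.arange(X.shape[0])[:, None], idx] = True
    return adj

# Two tight clusters: points 0-1 pair up, points 2-3 pair up.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
A = knn_similarity_graph(X, k=1)
```

Training pairs for the embedding network could then be drawn from the edges of this graph, so that neighbors in the input space stay close in the reduced space.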
arXiv Detail & Related papers (2021-03-10T23:10:47Z)
- High-Dimensional Similarity Search with Quantum-Assisted Variational Autoencoder [3.6704555687356644]
Quantum machine learning is touted as a potential approach to demonstrate quantum advantage.
We show how to construct a space-efficient search index based on the latent space representation of a QVAE.
We find real-world speedups compared to linear search and demonstrate memory-efficient scaling to half a billion data points.
arXiv Detail & Related papers (2020-06-13T16:55:23Z)
- Quantum embeddings for machine learning [5.16230883032882]
Quantum classifiers are trainable quantum circuits used as machine learning models.
We propose to train the first part of the circuit -- the embedding -- with the objective of maximally separating data classes in Hilbert space.
This approach provides a powerful analytic framework for quantum machine learning.
arXiv Detail & Related papers (2020-01-10T19:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.