Quantum Polar Metric Learning: Efficient Classically Learned Quantum
Embeddings
- URL: http://arxiv.org/abs/2312.01655v3
- Date: Tue, 27 Feb 2024 06:07:04 GMT
- Title: Quantum Polar Metric Learning: Efficient Classically Learned Quantum
Embeddings
- Authors: Vinayak Sharma and Aviral Shrivastava
- Abstract summary: We propose Quantum Polar Metric Learning (QPMeL) that uses a classical model to learn the parameters of the polar form of a qubit.
We then utilize a shallow PQC with $R_y$ and $R_z$ gates to create the state and a trainable layer of $ZZ(\theta)$-gates to learn entanglement.
When compared to QMeL approaches, QPMeL achieves 3X better multi-class separation, while using only 1/2 the number of gates and depth.
- Score: 0.984462697073239
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Deep metric learning has recently shown extremely promising results in the
classical data domain, creating well-separated feature spaces. This idea was
also adapted to quantum computers via Quantum Metric Learning (QMeL). QMeL
consists of a two-step process: a classical model first compresses the data to fit
into the limited number of qubits, then a Parameterized Quantum
Circuit (PQC) is trained to create better separation in Hilbert space. However, on Noisy
Intermediate-Scale Quantum (NISQ) devices, QMeL solutions result in high
circuit width and depth, both of which limit scalability. We propose Quantum
Polar Metric Learning (QPMeL) that uses a classical model to learn the
parameters of the polar form of a qubit. We then utilize a shallow PQC with
$R_y$ and $R_z$ gates to create the state and a trainable layer of
$ZZ(\theta)$-gates to learn entanglement. The circuit also computes fidelity
via a SWAP Test for our proposed Fidelity Triplet Loss function, used to train
both classical and quantum components. When compared to QMeL approaches, QPMeL
achieves 3X better multi-class separation, while using only 1/2 the number of
gates and depth. We also demonstrate that QPMeL outperforms classical networks
with similar configurations, presenting a promising avenue for future research
on fully classical models with quantum loss functions.
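To make the pipeline described above concrete, here is a minimal sketch of a QPMeL-style circuit and loss, assuming a PennyLane implementation (the library choice, the identifiers such as `embed`, `swap_test`, and `fidelity_triplet_loss`, and the exact ansatz and loss form are illustrative assumptions, not the paper's code). A classical network, not shown, would supply the per-qubit polar angles; the circuit prepares two embeddings with $R_y$/$R_z$ rotations, entangles each with a trainable $ZZ(\theta)$ layer, and estimates their fidelity with a SWAP Test, which feeds a triplet-style loss.

```python
# Minimal sketch of a QPMeL-style embedding circuit (assumed PennyLane
# implementation; identifiers and ansatz details are illustrative).
import pennylane as qml
from pennylane import numpy as np

n_qubits = 3                                               # one data register (illustrative size)
dev = qml.device("default.qubit", wires=2 * n_qubits + 1)  # two registers + SWAP-test ancilla


def embed(angles, zz_weights, wires):
    """Prepare one embedding: R_y/R_z rotations from classically learned polar
    angles, followed by a trainable ring of ZZ(theta) entangling gates."""
    for i, w in enumerate(wires):
        qml.RY(angles[i, 0], wires=w)
        qml.RZ(angles[i, 1], wires=w)
    for i in range(len(wires)):
        qml.IsingZZ(zz_weights[i], wires=[wires[i], wires[(i + 1) % len(wires)]])


@qml.qnode(dev)
def swap_test(angles_a, angles_b, zz_weights):
    """SWAP test between two embedded states; the ancilla's <Z> expectation
    equals their squared overlap (fidelity) for pure states."""
    reg_a = list(range(n_qubits))
    reg_b = list(range(n_qubits, 2 * n_qubits))
    ancilla = 2 * n_qubits

    embed(angles_a, zz_weights, reg_a)
    embed(angles_b, zz_weights, reg_b)

    qml.Hadamard(wires=ancilla)
    for a, b in zip(reg_a, reg_b):
        qml.CSWAP(wires=[ancilla, a, b])
    qml.Hadamard(wires=ancilla)
    return qml.expval(qml.PauliZ(ancilla))


def fidelity_triplet_loss(anchor, positive, negative, zz_weights, margin=0.5):
    """Triplet loss on SWAP-test fidelities: pull same-class embeddings together,
    push different-class embeddings apart (one plausible form of the loss)."""
    f_pos = swap_test(anchor, positive, zz_weights)
    f_neg = swap_test(anchor, negative, zz_weights)
    return np.maximum(0.0, margin + f_neg - f_pos)
```

The classical head and the `zz_weights` could then be optimized jointly by minimizing this loss over (anchor, positive, negative) triplets, matching the joint classical-quantum training described in the abstract.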
Related papers
- Quantum Deep Equilibrium Models [1.5853439776721878]
We present Quantum Deep Equilibrium Models (QDEQ), a training paradigm that learns parameters of a quantum machine learning model.
We find that QDEQ is not only competitive with comparable existing baseline models, but also achieves higher performance than a network with 5 times more layers.
This demonstrates that the QDEQ paradigm can be used to develop significantly more shallow quantum circuits for a given task.
arXiv Detail & Related papers (2024-10-31T13:54:37Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings (these scaling claims are restated compactly after this list).
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum
State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - QuantumSEA: In-Time Sparse Exploration for Noise Adaptive Quantum
Circuits [82.50620782471485]
QuantumSEA is an in-time sparse exploration for noise-adaptive quantum circuits.
It aims to achieve two key objectives: (1) implicit circuit capacity during training and (2) noise robustness.
Our method establishes state-of-the-art results with only half the number of quantum gates and a 2x saving in circuit execution time.
arXiv Detail & Related papers (2024-01-10T22:33:00Z) - MNISQ: A Large-Scale Quantum Circuit Dataset for Machine Learning on/for
Quantum Computers in the NISQ era [2.652805765181667]
MNISQ consists of 4,950,000 data points organized in 9 subdatasets.
We deliver a dataset in a dual form: in quantum form, as circuits, and in classical form, as quantum circuit descriptions.
In the quantum endeavor, we test our circuit dataset with Quantum Kernel methods, and we show excellent results up to $97\%$ accuracy.
arXiv Detail & Related papers (2023-06-29T02:04:14Z) - Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) in the hope of leveraging quantum advantage to speed up imitation learning (IL).
We develop two QIL algorithms: quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experiment results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - TeD-Q: a tensor network enhanced distributed hybrid quantum machine
learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z) - Anticipative measurements in hybrid quantum-classical computation [68.8204255655161]
We present an approach where the quantum computation is supplemented by a classical result.
Taking advantage of this anticipated classical result also leads to a new type of quantum measurement, which we call anticipative.
In an anticipative quantum measurement, the combination of the results from classical and quantum computations happens only at the end.
arXiv Detail & Related papers (2022-09-12T15:47:44Z) - Towards AutoQML: A Cloud-Based Automated Circuit Architecture Search
Framework [0.0]
We take the first steps towards Automated Quantum Machine Learning (AutoQML).
We propose a concrete description of the problem, and then develop a classical-quantum hybrid cloud architecture.
As an application use-case, we train a quantum Generative Adversarial Network (qGAN) to generate energy prices that follow a known historical data distribution.
arXiv Detail & Related papers (2022-02-16T12:37:10Z) - Entangled Datasets for Quantum Machine Learning [0.0]
We argue that one should instead employ quantum datasets composed of quantum states, rather than classical data.
We show how a quantum neural network can be trained to generate the states in the NTangled dataset.
We also consider an alternative entanglement-based dataset, which is scalable and is composed of states prepared by quantum circuits.
arXiv Detail & Related papers (2021-09-08T02:20:13Z) - Quantum embeddings for machine learning [5.16230883032882]
Quantum classifiers are trainable quantum circuits used as machine learning models.
We propose to train the first part of the circuit -- the embedding -- with the objective of maximally separating data classes in Hilbert space (a common form of such a separation objective is sketched after this list).
This approach provides a powerful analytic framework for quantum machine learning.
arXiv Detail & Related papers (2020-01-10T19:00:01Z)
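For the "Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits" entry above, the scaling claims can be restated compactly as follows (the notation is ours, not the paper's: $N$ is the number of training samples, $T$ the classical inference cost, and $d$ the number of tunable $R_Z$ gates):
\[
  N = \Theta(d) \quad \text{(necessary and sufficient for a small prediction error)},
  \qquad
  T = 2^{\Omega(d)} \quad \text{(possible in the worst case)}.
\]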
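For the "Quantum embeddings for machine learning" entry above, one standard way to formalize "maximally separating data classes in Hilbert space" (a common choice in this line of work, not necessarily that paper's exact cost) is to maximize the Hilbert-Schmidt distance between class-averaged embedded states:
\[
  C_{\mathrm{sep}} \;=\; \operatorname{Tr}\!\left[(\rho_A - \rho_B)^2\right],
  \qquad
  \rho_c \;=\; \frac{1}{|S_c|} \sum_{x \in S_c} \lvert \phi(x) \rangle\!\langle \phi(x) \rvert ,
\]
where $\lvert \phi(x) \rangle$ is the embedded state of data point $x$ and $S_c$ is the set of training points in class $c$.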
This list is automatically generated from the titles and abstracts of the papers in this site.