Quantum Machine Learning via Contrastive Training
- URL: http://arxiv.org/abs/2511.13497v1
- Date: Mon, 17 Nov 2025 15:36:23 GMT
- Title: Quantum Machine Learning via Contrastive Training
- Authors: Liudmila A. Zhukas, Vivian Ni Zhang, Qiang Miao, Qingfeng Wang, Marko Cetina, Jungsang Kim, Lawrence Carin, Christopher Monroe
- Abstract summary: We introduce self-supervised pretraining of quantum representations that reduces reliance on labeled data. We implement this paradigm on a programmable trapped-ion quantum computer, encoding images as quantum states. The performance improvement is especially significant in regimes with limited labeled training data.
- Score: 12.83661402071417
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum machine learning (QML) has attracted growing interest with the rapid parallel advances in large-scale classical machine learning and quantum technologies. Similar to classical machine learning, QML models also face challenges arising from the scarcity of labeled data, particularly as their scale and complexity increase. Here, we introduce self-supervised pretraining of quantum representations that reduces reliance on labeled data by learning invariances from unlabeled examples. We implement this paradigm on a programmable trapped-ion quantum computer, encoding images as quantum states. In situ contrastive pretraining on hardware yields a representation that, when fine-tuned, classifies image families with higher mean test accuracy and lower run-to-run variability than models trained from random initialization. Performance improvement is especially significant in regimes with limited labeled training data. We show that the learned invariances generalize beyond the pretraining image samples. Unlike prior work, our pipeline derives similarity from measured quantum overlaps and executes all training and classification stages on hardware. These results establish a label-efficient route to quantum representation learning, with direct relevance to quantum-native datasets and a clear path to larger classical inputs.
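The abstract describes contrastive pretraining in which pairwise similarity comes from measured quantum overlaps rather than a classical similarity function. The following is a minimal classical sketch of that idea, not the authors' pipeline: images are amplitude-encoded as normalized state vectors, the squared overlap |⟨ψ|φ⟩|² (what a swap test would estimate on hardware) serves as the similarity, and an NT-Xent-style contrastive loss pulls an augmented pair together against negatives. All function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def amplitude_encode(x):
    """Encode a classical vector as a normalized quantum state (amplitude encoding)."""
    v = np.asarray(x, dtype=float).ravel()
    return v / np.linalg.norm(v)

def overlap(psi, phi):
    """Squared overlap |<psi|phi>|^2 -- the similarity a swap test would estimate."""
    return float(np.abs(np.vdot(psi, phi)) ** 2)

def contrastive_loss(anchor, positive, negatives, temperature=0.5):
    """NT-Xent-style loss: pull the positive pair together, push negatives apart.
    Similarities are quantum overlaps instead of classical cosine similarities."""
    sims = [overlap(anchor, positive)] + [overlap(anchor, n) for n in negatives]
    logits = np.array(sims) / temperature
    # Softmax cross-entropy with the positive pair as the target class.
    logits -= logits.max()
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

# Toy usage: a mildly perturbed copy of the anchor plays the role of an augmentation.
rng = np.random.default_rng(0)
x = rng.random(8)
anchor = amplitude_encode(x)
positive = amplitude_encode(x + 0.05 * rng.random(8))
negatives = [amplitude_encode(rng.random(8)) for _ in range(4)]
loss = contrastive_loss(anchor, positive, negatives)
```

On hardware, the `overlap` call would be replaced by an estimate from repeated swap-test (or inversion-test) measurements between the two encoded states; the loss and its use in fine-tuning are otherwise standard contrastive learning.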
Related papers
- Typical Machine Learning Datasets as Low-Depth Quantum Circuits [0.04654705430482874]
We develop an efficient algorithm for finding low-depth quantum circuits to load classical image data as quantum states. We conduct systematic studies on the MNIST, Fashion-MNIST, CIFAR-10, and Imagenette datasets.
arXiv Detail & Related papers (2025-05-06T10:27:51Z) - An Efficient Quantum Classifier Based on Hamiltonian Representations [50.467930253994155]
Quantum machine learning (QML) is a discipline that seeks to transfer the advantages of quantum computing to data-driven tasks. We propose an efficient approach that circumvents the costs associated with data encoding by mapping inputs to a finite set of Pauli strings. We evaluate our approach on text and image classification tasks, against well-established classical and quantum models.
arXiv Detail & Related papers (2025-04-13T11:49:53Z) - Empirical Power of Quantum Encoding Methods for Binary Classification [0.2118773996967412]
We focus on encoding schemes and their effects on various machine learning metrics. Specifically, we focus on real-world data encoding to demonstrate differences between quantum encoding strategies for several real-world datasets.
arXiv Detail & Related papers (2024-08-23T14:34:57Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [62.46800898243033]
Recent progress in quantum learning theory prompts a question: can linear properties of a large-qubit circuit be efficiently learned from measurement data generated by varying classical inputs? We prove that sample complexity scaling linearly in $d$ is required to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in $d$. We propose a kernel-based method leveraging classical shadows and truncated trigonometric expansions, enabling a controllable trade-off between prediction accuracy and computational overhead.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - The curse of random quantum data [62.24825255497622]
We quantify the performances of quantum machine learning in the landscape of quantum data.
We find that the training efficiency and generalization capabilities in quantum machine learning will be exponentially suppressed with the increase in qubits.
Our findings apply to both the quantum kernel method and the large-width limit of quantum neural networks.
arXiv Detail & Related papers (2024-08-19T12:18:07Z) - Quantum Active Learning [3.3202982522589934]
Training a quantum neural network typically demands a substantial labeled training set for supervised learning.
QAL effectively trains the model, achieving performance comparable to that on fully labeled datasets.
Through a range of numerical experiments, we elucidate the negative result in which QAL is overtaken by a random-sampling baseline.
arXiv Detail & Related papers (2024-05-28T14:39:54Z) - Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective [7.7063925534143705]
We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with machine learning algorithms.
QT achieves remarkable results by employing a quantum neural network alongside a classical mapping model.
arXiv Detail & Related papers (2024-05-18T14:35:57Z) - Exponential Quantum Communication Advantage in Distributed Inference and Learning [19.827903766111987]
We present a framework for distributed computation over a quantum network.
We show that for models within this framework, inference and training using gradient descent can be performed with exponentially less communication.
We also show that models in this class can encode highly nonlinear features of their inputs, and their expressivity increases exponentially with model depth.
arXiv Detail & Related papers (2023-10-11T02:19:50Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z) - Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer [55.41644538483948]
We implement a quantum-circuit based generative model to learn and sample the prior distribution of a Generative Adversarial Network.
We train this hybrid algorithm on an ion-trap device based on $^{171}$Yb$^{+}$ ion qubits to generate high-quality images.
arXiv Detail & Related papers (2020-12-07T18:51:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.