Applications of Hybrid Machine Learning Methods to Large Datasets: A Case Study
- URL: http://arxiv.org/abs/2504.06892v1
- Date: Wed, 09 Apr 2025 13:53:27 GMT
- Title: Applications of Hybrid Machine Learning Methods to Large Datasets: A Case Study
- Authors: G. Maragkopoulos, N. Stefanakos, A. Mandilara, D. Syvridis
- Abstract summary: We show that replacing a deep classical neural network with a thoughtfully designed Variational Quantum Circuit (VQC) in an ML pipeline for multiclass classification of time-series data yields the same classification performance. Our results highlight the importance of tailored data pre-processing for the circuit and show the potential of qudit-based VQCs.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We combine classical and quantum Machine Learning (ML) techniques to effectively analyze long time-series data acquired during experiments. Specifically, we demonstrate that replacing a deep classical neural network with a thoughtfully designed Variational Quantum Circuit (VQC) in an ML pipeline for multiclass classification of time-series data yields the same classification performance, while significantly reducing the number of trainable parameters. To achieve this, we use a VQC based on a single qudit, and encode the classical data into the VQC via a trainable hybrid autoencoder, which was recently proposed as an embedding technique. Our results highlight the importance of tailored data pre-processing for the circuit and show the potential of qudit-based VQCs.
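A minimal, illustrative sketch of such a hybrid pipeline is given below; it is not the authors' implementation, and the qudit dimension, window length, encoder shape, and number of variational layers are assumptions chosen for brevity. A trainable linear encoder compresses a time-series window into a few bounded angles, these parameterize an embedding unitary on a simulated qudit, trainable variational layers follow, and the basis-state probabilities act as multiclass scores.

```python
# Minimal sketch (not the authors' implementation): a long time-series window is
# compressed by a trainable linear encoder into a few angles, which parameterize
# an embedding unitary on a single simulated qudit; trainable variational layers
# follow, and the basis-state probabilities serve as multiclass scores.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
d = 4             # assumed qudit dimension (one basis state per class)
T = 256           # assumed length of the time-series window
n_params = d * d  # real parameters per qudit unitary

def hermitian_from(params):
    """Build a d x d Hermitian matrix from d*d real parameters."""
    A = params.reshape(d, d)
    return (A + A.T) / 2 + 1j * (A - A.T) / 2

def unitary_from(params):
    """Parameterized qudit unitary U = exp(-i H(params))."""
    return expm(-1j * hermitian_from(params))

def forward(x, W, theta_layers):
    """Classical encoder -> qudit embedding -> variational layers -> class probabilities."""
    angles = np.tanh(W @ x)                       # trainable encoder output, kept bounded
    embed = np.concatenate([angles, np.zeros(n_params - angles.size)])
    state = np.zeros(d, dtype=complex)
    state[0] = 1.0
    state = unitary_from(embed) @ state           # data-dependent embedding unitary
    for theta in theta_layers:                    # trainable variational layers
        state = unitary_from(theta) @ state
    return np.abs(state) ** 2                     # probabilities of the d basis states

# Toy forward pass on random data (illustration only; no training loop shown).
W = rng.normal(scale=0.1, size=(8, T))            # encoder: T samples -> 8 angles (assumed sizes)
theta_layers = [rng.normal(scale=0.1, size=n_params) for _ in range(2)]
x = rng.normal(size=T)
print("class probabilities:", np.round(forward(x, W, theta_layers), 3))
```

In practice the encoder and the circuit parameters would be optimized jointly (for example with a cross-entropy loss over labeled windows); the plain linear encoder here merely stands in for the trainable hybrid-autoencoder embedding mentioned in the abstract.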
Related papers
- Enhancing the performance of Variational Quantum Classifiers with hybrid autoencoders [0.0]
We propose an alternative method which reduces the dimensionality of a given dataset by taking into account the specific quantum embedding that follows it.
This method aims to make quantum machine learning with VQCs more versatile and effective on high-dimensional datasets.
arXiv Detail & Related papers (2024-09-05T08:51:20Z)
- Weight Re-Mapping for Variational Quantum Algorithms [54.854986762287126]
We introduce the concept of weight re-mapping for variational quantum circuits (VQCs).
We employ seven distinct weight re-mapping functions to assess their impact on eight classification datasets.
Our results indicate that weight re-mapping can enhance the convergence speed of the VQC (see the sketch after this entry).
arXiv Detail & Related papers (2023-06-09T09:42:21Z)
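The weight re-mapping idea above can be illustrated in a few lines. The two functions below are common bounded maps chosen for illustration and are not necessarily among the seven used in the paper; they squash unconstrained trainable weights into an interval of length $2\pi$ before the weights are used as rotation angles.

```python
# Illustrative weight re-mapping functions (assumed examples, not the paper's):
# unconstrained weights are mapped into an interval of length 2*pi before
# being used as rotation angles in a VQC.
import numpy as np

def remap_sigmoid(w):
    """Map R -> (0, 2*pi) with a scaled logistic function."""
    return 2 * np.pi / (1 + np.exp(-w))

def remap_arctan(w):
    """Map R -> (-pi, pi) with a scaled arctangent."""
    return 2 * np.arctan(w)

w = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])  # raw trainable weights
print(remap_sigmoid(w))                    # confined to (0, 2*pi)
print(remap_arctan(w))                     # confined to (-pi, pi)
```

Either map keeps the optimizer's parameters unconstrained while the circuit only ever sees angles from a fixed $2\pi$-wide interval.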
- Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- BCQQ: Batch-Constraint Quantum Q-Learning with Cyclic Data Re-uploading [2.502222151305252]
Recent advancements in quantum computing suggest that quantum models might require less data for training compared to classical methods.
We propose a batch RL algorithm that uses VQCs as function approximators within the discrete batch-constrained deep Q-learning (BCQ) algorithm.
We evaluate the efficiency of our algorithm on the OpenAI CartPole environment and compare its performance to the classical neural network-based discrete BCQ.
arXiv Detail & Related papers (2023-04-27T16:43:01Z)
- Problem-Dependent Power of Quantum Neural Networks on Multi-Class Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of QNNs on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z)
- Improving Convergence for Quantum Variational Classifiers using Weight Re-Mapping [60.086820254217336]
In recent years, quantum machine learning has seen a substantial increase in the use of variational quantum circuits (VQCs).
We introduce weight re-mapping for VQCs, to unambiguously map the weights to an interval of length $2\pi$.
We demonstrate that weight re-mapping increased test accuracy for the Wine dataset by $10\%$ over using unmodified weights.
arXiv Detail & Related papers (2022-12-22T13:23:19Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK (see the sketch after this entry).
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
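A minimal Qiskit sketch of single-qubit data re-uploading follows; the number of layers, the two-dimensional input, and the per-layer gate layout are assumptions for illustration, not the exact formulations of the paper. The same features are re-encoded in every layer and interleaved with trainable rotations, and the final measurement probabilities can serve as a classification score.

```python
# Sketch of single-qubit data re-uploading (assumed circuit layout): each layer
# re-encodes the input features and applies trainable rotations; the final |1>
# probability can act as a binary classification score.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def reuploading_circuit(x, thetas):
    """x: 2-dimensional feature vector, thetas: (layers, 2) trainable angles."""
    qc = QuantumCircuit(1)
    for layer in thetas:
        qc.ry(float(x[0]), 0)      # re-upload the data in every layer
        qc.rz(float(x[1]), 0)
        qc.ry(float(layer[0]), 0)  # trainable rotations
        qc.rz(float(layer[1]), 0)
    return qc

rng = np.random.default_rng(1)
x = np.array([0.3, -1.2])                   # toy two-dimensional input
thetas = rng.uniform(0, 2 * np.pi, (3, 2))  # 3 layers, 2 parameters each
probs = Statevector.from_instruction(reuploading_circuit(x, thetas)).probabilities()
print("P(|0>), P(|1>):", np.round(probs, 3))
```

Training would adjust `thetas` (and any input scaling) so that the |1> probability tracks the correct label.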
- Hybrid Classical-Quantum Autoencoder for Anomaly Detection [0.0]
We propose a Hybrid classical-quantum Autoencoder (HAE) model, which is a synergy of a classical autoencoder (AE) and a parametrized quantum circuit (PQC).
The PQC augments the latent space, on which a standard outlier detection method is applied to search for anomalous data points within a classical dataset.
We show that the addition of the PQC leads to a performance enhancement in terms of precision, recall, and F1 score (see the sketch after this entry).
arXiv Detail & Related papers (2021-12-16T13:27:24Z)
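A compact sketch of the HAE idea described above follows. The classical autoencoder is stubbed out with random latent vectors, the PQC is a tiny two-qubit circuit simulated in NumPy with fixed parameters, and scikit-learn's IsolationForest stands in for the standard outlier detection method; all of these choices are assumptions for illustration rather than the paper's components.

```python
# Sketch of latent-space augmentation with a PQC before outlier detection
# (assumed components): latent vectors are angle-encoded into a two-qubit
# circuit whose Pauli-Z expectations are appended to the latent features.
import numpy as np
from sklearn.ensemble import IsolationForest

CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)

def ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)], [np.sin(t / 2), np.cos(t / 2)]])

def pqc_features(z, theta):
    """Angle-encode a 2-dim latent vector, entangle, rotate, return <Z0> and <Z1>."""
    state = np.zeros(4)
    state[0] = 1.0
    state = np.kron(ry(z[0]), ry(z[1])) @ state          # data encoding
    state = CNOT @ state                                  # entangling layer
    state = np.kron(ry(theta[0]), ry(theta[1])) @ state   # (here fixed) trainable layer
    probs = state ** 2                                    # real amplitudes -> probabilities
    return np.array([probs @ [1, 1, -1, -1], probs @ [1, -1, 1, -1]])

rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 2))                        # stand-in for AE latent vectors
theta = rng.uniform(0, 2 * np.pi, 2)
quantum_feats = np.array([pqc_features(z, theta) for z in latent])
augmented = np.hstack([latent, quantum_feats])            # PQC-augmented latent space
labels = IsolationForest(random_state=0).fit_predict(augmented)  # -1 marks outliers
print("flagged anomalies:", int((labels == -1).sum()))
```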
- Hybrid Quantum-Classical Graph Convolutional Network [7.0132255816377445]
This research provides a hybrid quantum-classical graph convolutional network (QGCNN) for learning high-energy physics (HEP) data.
The proposed framework demonstrates an advantage over classical multilayer perceptrons and convolutional neural networks in terms of the number of parameters.
In terms of testing accuracy, the QGCNN shows comparable performance to a quantum convolutional neural network on the same HEP dataset.
arXiv Detail & Related papers (2021-01-15T16:02:52Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Characterizing the loss landscape of variational quantum circuits [77.34726150561087]
We introduce a way to compute the Hessian of the loss function of VQCs.
We show how this information can be interpreted and compared to classical neural networks (see the sketch after this entry).
arXiv Detail & Related papers (2020-08-06T17:48:12Z)
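The entry above concerns second-order (Hessian) information about VQC loss landscapes. The sketch below is a toy single-qubit example with an assumed three-parameter ansatz; it computes the Hessian of a Pauli-Z expectation value with the second-order parameter-shift rule and is meant as an illustration of the general idea, not the construction used in the paper.

```python
# Hessian of a VQC expectation value via the second-order parameter-shift rule
# (toy single-qubit circuit with an assumed RY-RZ-RY ansatz).
import numpy as np

def ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)], [np.sin(t / 2), np.cos(t / 2)]], dtype=complex)

def rz(t):
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

def expectation(theta):
    """f(theta) = <0| U(theta)^dag Z U(theta) |0> with U = RY(t2) RZ(t1) RY(t0)."""
    state = ry(theta[2]) @ rz(theta[1]) @ ry(theta[0]) @ np.array([1.0, 0.0], dtype=complex)
    return float(np.real(np.conj(state) @ (np.diag([1.0, -1.0]) @ state)))

def hessian_parameter_shift(f, theta, s=np.pi / 2):
    """H[i, j] = ( f(+i+j) - f(+i-j) - f(-i+j) + f(-i-j) ) / 4 with shifts of pi/2."""
    n = len(theta)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.eye(n)[i], np.eye(n)[j]
            H[i, j] = (f(theta + s * e_i + s * e_j) - f(theta + s * e_i - s * e_j)
                       - f(theta - s * e_i + s * e_j) + f(theta - s * e_i - s * e_j)) / 4
    return H

theta = np.array([0.4, 1.1, -0.7])
print(np.round(hessian_parameter_shift(expectation, theta), 4))  # symmetric 3x3 Hessian
```

The eigenvalues of such a Hessian characterize local curvature (flat directions, saddle points) of the loss landscape, which is the kind of information the paper interprets and compares to classical neural networks.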
- Quantum-enhanced data classification with a variational entangled sensor network [3.1083620257082707]
Supervised learning assisted by an entangled sensor network (SLAEN) is a distinct paradigm that harnesses VQCs trained by classical machine-learning algorithms.
Our work paves a new route for quantum-enhanced data processing and its applications in the NISQ era.
arXiv Detail & Related papers (2020-06-22T01:22:33Z)