Study of Feature Importance for Quantum Machine Learning Models
- URL: http://arxiv.org/abs/2202.11204v2
- Date: Thu, 24 Feb 2022 14:57:31 GMT
- Title: Study of Feature Importance for Quantum Machine Learning Models
- Authors: Aaron Baughman, Kavitha Yogaraj, Raja Hebbar, Sudeep Ghosh, Rukhsan Ul Haq, Yoshika Chhabra
- Abstract summary: Predictor importance is a crucial part of data preprocessing pipelines in classical and quantum machine learning (QML).
This work presents the first study of its kind in which feature importance for QML models has been explored and contrasted against their classical machine learning (CML) equivalents.
We developed a hybrid quantum-classical architecture where QML models are trained and feature importance values are calculated from classical algorithms on a real-world dataset.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Predictor importance is a crucial part of data preprocessing pipelines in
classical and quantum machine learning (QML). This work presents the first
study of its kind in which feature importance for QML models has been explored
and contrasted against their classical machine learning (CML) equivalents. We
developed a hybrid quantum-classical architecture where QML models are trained
and feature importance values are calculated from classical algorithms on a
real-world dataset. This architecture has been implemented on ESPN Fantasy
Football data using Qiskit statevector simulators and IBM quantum hardware such
as the IBMQ Mumbai and IBMQ Montreal systems. Even though we are in the Noisy
Intermediate-Scale Quantum (NISQ) era, the physical quantum computing results
are promising. To accommodate the current scale of quantum hardware, we
created data tiering, model aggregation, and novel validation methods.
Notably, the feature importance magnitudes from the quantum models showed much
higher variation than those from the classical models. We show through
diversity measurements that equivalent QML and CML models are complementary:
the diversity between QML and CML demonstrates that both approaches can
contribute to a solution in different ways. Within this paper we focus on
Quantum Support Vector Classifiers (QSVC), Variational Quantum Circuits (VQC),
and their classical counterparts. The ESPN and IBM Fantasy Football Trade
Assistant combines advanced statistical analysis with the natural language
processing of Watson Discovery to propose personalized, fair trade
recommendations. Here, player valuation data for each player is considered,
and this work can be extended to calculate the feature importance of other QML
models such as Quantum Boltzmann machines.
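The abstract describes computing feature importance for trained QML models with classical algorithms. The paper does not name the specific algorithms, but permutation importance is a common model-agnostic choice and illustrates the idea: permute one feature's values, measure the accuracy drop, and repeat per feature. The toy classifier and data below are hypothetical, purely for illustration; in the paper's setting the `model` would be a trained QSVC or VQC.

```python
# Sketch of model-agnostic permutation feature importance.
# `model` is any callable mapping a feature vector to a class label;
# the toy model and dataset here are hypothetical illustrations.
import random

def accuracy(model, X, y):
    return sum(model(x) == t for x, t in zip(X, y)) / len(y)

def permutation_importance(model, X, y, n_repeats=10, seed=0):
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)  # break the feature-label association
            X_perm = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
            drops.append(base - accuracy(model, X_perm, y))
        importances.append(sum(drops) / n_repeats)
    return importances

# Toy classifier whose prediction depends only on feature 0.
model = lambda x: int(x[0] > 0.5)
X = [[i / 9.0, (9 - i) / 9.0] for i in range(10)]
y = [int(x[0] > 0.5) for x in X]
imps = permutation_importance(model, X, y)
# Feature 0 dominates; feature 1, which the model ignores, scores zero.
```

Because the method only queries the model's predictions, the same loop applies unchanged whether the model runs on a statevector simulator or on quantum hardware, which is what makes the hybrid quantum-classical architecture possible.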
Related papers
- Security Concerns in Quantum Machine Learning as a Service [2.348041867134616]
Quantum machine learning (QML) is a category of algorithms that employ variational quantum circuits (VQCs) to tackle machine learning tasks.
Recent discoveries have shown that QML models can effectively generalize from limited training data samples.
QML represents a hybrid model that utilizes both classical and quantum computing resources.
arXiv Detail & Related papers (2024-08-18T18:21:24Z)
- The Quantum Imitation Game: Reverse Engineering of Quantum Machine Learning Models [2.348041867134616]
Quantum Machine Learning (QML) amalgamates quantum computing paradigms with machine learning models.
With the expansion of numerous third-party vendors in the Noisy Intermediate-Scale Quantum (NISQ) era of quantum computing, the security of QML models is of prime importance.
We assume the untrusted quantum cloud provider is an adversary having white-box access to the transpiled user-designed trained QML model during inference.
arXiv Detail & Related papers (2024-07-09T21:35:19Z)
- Feature Importance and Explainability in Quantum Machine Learning [0.0]
Many Machine Learning (ML) models are referred to as black box models, providing no real insights into why a prediction is made.
This article explores feature importance and explainability in Quantum Machine Learning (QML) compared to Classical ML models.
arXiv Detail & Related papers (2024-05-14T19:12:32Z)
- QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
arXiv Detail & Related papers (2023-08-25T15:08:19Z)
- Classical-to-Quantum Transfer Learning Facilitates Machine Learning with Variational Quantum Circuit [62.55763504085508]
We prove that a classical-to-quantum transfer learning architecture using a Variational Quantum Circuit (VQC) improves the representation and generalization (estimation error) capabilities of the VQC model.
We show that the architecture of classical-to-quantum transfer learning leverages pre-trained classical generative AI models, making it easier to find the optimal parameters for the VQC in the training stage.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build over a proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that our proposed BERT-QTC model attains competitive experimental results in the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)
- Subtleties in the trainability of quantum machine learning models [0.0]
We show that gradient scaling results for Variational Quantum Algorithms can be applied to study the gradient scaling of Quantum Machine Learning models.
Our results indicate that features deemed detrimental for VQA trainability can also lead to issues such as barren plateaus in QML.
arXiv Detail & Related papers (2021-10-27T20:28:53Z)
- Entangled Datasets for Quantum Machine Learning [0.0]
We argue that one should instead employ quantum datasets composed of quantum states.
We show how a quantum neural network can be trained to generate the states in the NTangled dataset.
We also consider an alternative entanglement-based dataset, which is scalable and is composed of states prepared by quantum circuits.
arXiv Detail & Related papers (2021-09-08T02:20:13Z)
- Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.