Hybrid quantum-classical graph neural networks for tumor classification
in digital pathology
- URL: http://arxiv.org/abs/2310.11353v1
- Date: Tue, 17 Oct 2023 15:40:26 GMT
- Title: Hybrid quantum-classical graph neural networks for tumor classification
in digital pathology
- Authors: Anupama Ray, Dhiraj Madan, Srushti Patil, Maria Anna Rapsomaniki,
Pushpak Pati
- Abstract summary: We create a hybrid quantum graph neural network (GNN) that combines a GNN with a Variational Quantum Classifier (VQC) for classifying binary sub-tasks in breast cancer subtyping.
The results demonstrate that the hybrid quantum neural network (QNN) is on par with state-of-the-art classical graph neural networks (GNNs) in terms of weighted precision, recall and F1-score.
- Score: 2.682579230647868
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Advances in classical machine learning and single-cell technologies have
paved the way to understand interactions between disease cells and tumor
microenvironments to accelerate therapeutic discovery. However, challenges in
these machine learning methods and NP-hard problems in spatial biology create
an opportunity for quantum computing algorithms. We create a hybrid
quantum-classical graph neural network (GNN) that combines GNN with a
Variational Quantum Classifier (VQC) for classifying binary sub-tasks in breast
cancer subtyping. We explore two variants of this model: the first with fixed
pretrained GNN parameters, and the second with end-to-end training of the GNN+VQC.
The results demonstrate that the hybrid quantum neural network (QNN) is on par
with the state-of-the-art classical graph neural networks (GNN) in terms of
weighted precision, recall and F1-score. We also show that, by means of
amplitude encoding, we can compress information into a logarithmic number of
qubits and attain better performance than classical compression (which incurs
information loss while keeping the required number of qubits constant in both
regimes). Finally, we show that end-to-end training improves over fixed GNN
parameters and also slightly outperforms a vanilla GNN with the same number of
dimensions.
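The amplitude-encoding claim above can be sketched in plain NumPy (a hypothetical illustration, not the authors' code): a d-dimensional GNN node embedding is zero-padded to the next power of two and L2-normalized, so it fits into the amplitudes of a state on only ceil(log2 d) qubits.

```python
import numpy as np

def amplitude_encode(features: np.ndarray) -> np.ndarray:
    """Encode a classical feature vector into the amplitude vector of a
    quantum state on ceil(log2(d)) qubits (zero-padded, L2-normalized).
    This is a classical simulation of the encoding, for illustration only."""
    d = len(features)
    n_qubits = int(np.ceil(np.log2(d)))
    padded = np.zeros(2 ** n_qubits)
    padded[:d] = features
    norm = np.linalg.norm(padded)
    if norm == 0:
        raise ValueError("cannot amplitude-encode the zero vector")
    return padded / norm

# A 128-dimensional GNN embedding fits in just log2(128) = 7 qubits.
embedding = np.random.default_rng(0).normal(size=128)
state = amplitude_encode(embedding)
print(len(state))                         # 128 amplitudes
print(int(np.log2(len(state))))           # 7 qubits
print(bool(np.isclose(np.sum(state**2), 1.0)))  # True: valid quantum state
```

Classical compression to the same budget would have to discard dimensions, whereas the amplitude encoding retains all of them (up to the global normalization), which is the information-loss contrast drawn in the abstract.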
Related papers
- Studying the Impact of Quantum-Specific Hyperparameters on Hybrid Quantum-Classical Neural Networks [4.951980887762045]
Hybrid quantum-classical neural networks (HQNNs) represent a promising solution that combines the strengths of classical machine learning with quantum computing capabilities.
In this paper, we investigate the impact of these variations on different HQNN models for image classification tasks, implemented on the PennyLane framework.
We aim to uncover intuitive and counter-intuitive learning patterns of HQNN models within granular levels of controlled quantum perturbations, to form a sound basis for their correlation to accuracy and training time.
arXiv Detail & Related papers (2024-02-16T11:44:25Z) - Variational Quantum Neural Networks (VQNNS) in Image Classification [0.0]
This paper investigates how quantum neural networks (QNNs) can be trained using quantum optimization algorithms.
A QNN structure is proposed in which a variational parameterized circuit is incorporated as an input layer, termed a Variational Quantum Neural Network (VQNN).
VQNNs are evaluated on MNIST digit recognition (less complex) and crack image classification datasets, converging in less time than a plain QNN while maintaining decent training accuracy.
arXiv Detail & Related papers (2023-03-10T11:24:32Z) - Physics Simulation Via Quantum Graph Neural Network [0.0]
We develop and implement two realizations of quantum graph neural networks (QGNNs).
The first QGNN is a speculative quantum-classical hybrid learning model that relies on the ability to directly implement superposition states as classical information.
The second is a feasible quantum-classical hybrid learning model that propagates particle information directly through the parameters of $RX$ rotation gates.
arXiv Detail & Related papers (2023-01-11T20:21:10Z) - Problem-Dependent Power of Quantum Neural Networks on Multi-Class
Classification [83.20479832949069]
Quantum neural networks (QNNs) have become an important tool for understanding the physical world, but their advantages and limitations are not fully understood.
Here we investigate the problem-dependent power of QNNs on multi-class classification tasks.
Our work sheds light on the problem-dependent power of QNNs and offers a practical tool for evaluating their potential merit.
arXiv Detail & Related papers (2022-12-29T10:46:40Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional
Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes with a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even better than, classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z) - Quantum-inspired Complex Convolutional Neural Networks [17.65730040410185]
We improve the quantum-inspired neurons by exploiting the complex-valued weights which have richer representational capacity and better non-linearity.
We draw the models of quantum-inspired convolutional neural networks (QICNNs) capable of processing high-dimensional data.
The classification accuracy of the five QICNNs is tested on the MNIST and CIFAR-10 datasets.
arXiv Detail & Related papers (2021-10-31T03:10:48Z) - Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z) - Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs due to vanishing gradients, whose rate is exponential in the number of input qubits.
We propose QNNs with tree tensor and step controlled structures for binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z) - Recurrent Quantum Neural Networks [7.6146285961466]
Recurrent neural networks are the foundation of many sequence-to-sequence models in machine learning.
We construct a quantum recurrent neural network (QRNN) with demonstrable performance on non-trivial tasks.
We evaluate the QRNN on MNIST classification, both by feeding the QRNN each image pixel-by-pixel and by utilising modern data augmentation as a preprocessing step.
arXiv Detail & Related papers (2020-06-25T17:59:44Z) - Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum and a six DoF robot arm show GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.