Satellite image classification with neural quantum kernels
- URL: http://arxiv.org/abs/2409.20356v1
- Date: Mon, 30 Sep 2024 14:52:00 GMT
- Title: Satellite image classification with neural quantum kernels
- Authors: Pablo Rodriguez-Grasa, Robert Farzan-Rodriguez, Gabriele Novelli, Yue Ban, Mikel Sanz
- Abstract summary: We use neural quantum kernels (NQKs) to classify satellite images that include solar panels.
For the $n$-to-$n$ NQK, we iteratively train an $n$-qubit QNN to ensure scalability, using the resulting architecture to directly form an $n$-qubit EQK.
Results are robust against suboptimal training of the QNN.
- Score: 0.0699049312989311
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A practical application of quantum machine learning in real-world scenarios in the short term remains elusive, despite significant theoretical efforts. Image classification, a common task for classical models, has been used to benchmark quantum algorithms with simple datasets, but only a few studies have tackled complex real-data classification challenges. In this work, we address such a gap by focusing on the classification of satellite images, a task of particular interest to the Earth observation (EO) industry. We first preprocess the selected intricate dataset by reducing its dimensionality. Subsequently, we employ neural quantum kernels (NQKs), embedding quantum kernels (EQKs) constructed from trained quantum neural networks (QNNs), to classify images that include solar panels. We explore both $1$-to-$n$ and $n$-to-$n$ NQKs. In the former, parameters from the training of a single-qubit QNN are used to construct an $n$-qubit EQK, achieving a mean test accuracy over 86% with three features. In the latter, we iteratively train an $n$-qubit QNN to ensure scalability, using the resulting architecture to directly form an $n$-qubit EQK. In this case, a test accuracy over 88% is obtained for three features and 8 qubits. Additionally, we show that the results are robust against a suboptimal training of the QNN.
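The $1$-to-$n$ NQK construction lends itself to a compact illustration: train (or assume trained) a single-qubit data re-uploading QNN, reuse its parameters as a fixed data embedding, and feed the resulting kernel matrix to a classical SVM. The sketch below follows that idea with plain NumPy and scikit-learn; the single-qubit layer layout, the randomly drawn "trained" parameters, and the toy three-feature data are illustrative assumptions, not the authors' exact architecture or dataset.

```python
# Minimal sketch of an embedding quantum kernel (EQK) built from a trained
# single-qubit data re-uploading QNN, simulated with plain NumPy.
# Layer layout and parameters are illustrative assumptions only.
import numpy as np
from sklearn.svm import SVC

def rotation(theta, phi, lam):
    """General single-qubit rotation U(theta, phi, lam)."""
    return np.array([
        [np.cos(theta / 2), -np.exp(1j * lam) * np.sin(theta / 2)],
        [np.exp(1j * phi) * np.sin(theta / 2),
         np.exp(1j * (phi + lam)) * np.cos(theta / 2)],
    ])

def embed(x, params):
    """Data re-uploading embedding: one data-dependent rotation per layer."""
    state = np.array([1.0 + 0j, 0.0 + 0j])   # start in |0>
    for w, b in params:                       # one (w, b) pair per layer
        angles = w * x + b                    # re-upload the 3 features
        state = rotation(*angles) @ state
    return state

def eqk_gram(X1, X2, params):
    """Gram matrix of the embedding kernel k(x, x') = |<phi(x)|phi(x')>|^2."""
    S1 = np.array([embed(x, params) for x in X1])
    S2 = np.array([embed(x, params) for x in X2])
    return np.abs(S1.conj() @ S2.T) ** 2

# Toy data standing in for the 3 preprocessed image features and binary labels.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(40, 3)), rng.integers(0, 2, 40)
X_test = rng.normal(size=(10, 3))

# Parameters that would come from training the single-qubit QNN beforehand.
params = [(rng.normal(size=3), rng.normal(size=3)) for _ in range(4)]

svm = SVC(kernel="precomputed").fit(eqk_gram(X_train, X_train, params), y_train)
pred = svm.predict(eqk_gram(X_test, X_train, params))
```

In the paper's $1$-to-$n$ scheme the single-qubit embedding would be replicated across $n$ qubits (with entangling gates between layers) to form the EQK; the single-qubit version above is only meant to show how parameters taken from a trained QNN define the kernel that a classical SVM then consumes.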
Related papers
- Training embedding quantum kernels with data re-uploading quantum neural networks [0.0]
Kernel methods play a crucial role in machine learning, and embedding quantum kernels (EQKs) have shown very promising performance.
We propose a $p$-qubit Quantum Neural Network (QNN) based on data re-uploading to identify the optimal $q$-qubit EQK for a task.
arXiv Detail & Related papers (2024-01-09T16:08:32Z) - 3D Scalable Quantum Convolutional Neural Networks for Point Cloud Data Processing in Classification Applications [10.90994913062223]
A novel 3D scalable quantum convolutional neural network (sQCNN-3D) is proposed for point cloud data processing in classification applications.
arXiv Detail & Related papers (2022-10-18T10:14:03Z) - Classification of NEQR Processed Classical Images using Quantum Neural Networks (QNN) [0.0]
This work builds on previous works by the authors and addresses QNNs for image classification with the Novel Enhanced Quantum Representation (NEQR) of digital images.
We build an NEQR model circuit to pre-process the same data and feed the images into the QNN.
Our results showed marginal improvements (only about 5.0%), with the performance of the QNN with NEQR exceeding that of the QNN without NEQR.
arXiv Detail & Related papers (2022-03-29T08:05:53Z) - On Circuit-based Hybrid Quantum Neural Networks for Remote Sensing Imagery Classification [88.31717434938338]
The hybrid QCNNs enrich the classical architecture of CNNs by introducing a quantum layer within a standard neural network.
The novel QCNN proposed in this work is applied to the Land Use and Land Cover (LULC) classification, chosen as an Earth Observation (EO) use case.
The results of the multiclass classification demonstrate the effectiveness of the presented approach, with the QCNN outperforming its classical counterparts.
arXiv Detail & Related papers (2021-09-20T12:41:50Z) - Quantum convolutional neural network for classical data classification [0.8057006406834467]
We benchmark fully parameterized quantum convolutional neural networks (QCNNs) for classical data classification.
We propose a quantum neural network model inspired by CNN that only uses two-qubit interactions throughout the entire algorithm.
arXiv Detail & Related papers (2021-08-02T06:48:34Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which leads to poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z) - A Quantum Convolutional Neural Network on NISQ Devices [0.9831489366502298]
We propose a quantum convolutional neural network inspired by convolutional neural networks.
Our model is robust to certain noise for image recognition tasks.
It opens up the prospect of exploiting quantum power to process information in the era of big data.
arXiv Detail & Related papers (2021-04-14T15:07:03Z) - Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve the quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We study QNNs with tree-tensor and step-controlled structures for binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
arXiv Detail & Related papers (2020-11-12T08:32:04Z) - On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z) - Widening and Squeezing: Towards Accurate and Efficient QNNs [125.172220129257]
Quantization neural networks (QNNs) are very attractive to industry because of their extremely cheap computation and storage overhead, but their performance is still worse than that of networks with full-precision parameters.
Most existing methods aim to enhance the performance of QNNs, especially binary neural networks, by exploiting more effective training techniques.
We address this problem by projecting features in original full-precision networks to high-dimensional quantization features.
arXiv Detail & Related papers (2020-02-03T04:11:13Z)