Comparative Analysis of QNN Architectures for Wind Power Prediction: Feature Maps and Ansatz Configurations
- URL: http://arxiv.org/abs/2506.14795v1
- Date: Sat, 31 May 2025 19:17:53 GMT
- Title: Comparative Analysis of QNN Architectures for Wind Power Prediction: Feature Maps and Ansatz Configurations
- Authors: Batuhan Hangun, Emine Akpinar, Oguz Altun, Onder Eyecioglu
- Abstract summary: Quantum Machine Learning (QML) aims to enhance classical machine learning methods by leveraging quantum mechanics principles such as entanglement and superposition. This study extensively assesses Quantum Neural Networks (QNNs), quantum-inspired counterparts of Artificial Neural Networks (ANNs). We show that QNNs outperform classical methods in predictive tasks, underscoring the potential of QML in real-world applications.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantum Machine Learning (QML) is an emerging field at the intersection of quantum computing and machine learning, aiming to enhance classical machine learning methods by leveraging quantum mechanics principles such as entanglement and superposition. However, skepticism persists regarding the practical advantages of QML, mainly due to the current limitations of noisy intermediate-scale quantum (NISQ) devices. This study addresses these concerns by extensively assessing Quantum Neural Networks (QNNs), quantum-inspired counterparts of Artificial Neural Networks (ANNs), and demonstrating their effectiveness compared to classical methods. We systematically construct and evaluate twelve distinct QNN configurations, combining two quantum feature maps with six different entanglement strategies for ansatz design. Experiments conducted on a wind energy dataset reveal that QNNs employing the Z feature map achieve up to 93% prediction accuracy when forecasting wind power output using only four input parameters. Our findings show that QNNs outperform classical methods in predictive tasks, underscoring the potential of QML in real-world applications.
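Each of the twelve configurations pairs a data-encoding feature map with a variational ansatz. As a rough illustration of what one such configuration computes, the sketch below simulates a four-qubit circuit with a Z-style feature map (Hadamard plus RZ data rotations), a variational RY layer with linear entanglement, and a ⟨Z⟩ readout on the first qubit as the regression output. The specific gate sequence, the single-observable readout, and the NumPy statevector simulation are illustrative assumptions, not the paper's exact circuit.

```python
# Hypothetical sketch of one QNN configuration: Z feature map (H + RZ(2x))
# followed by a hardware-efficient ansatz (RY layer + linear-entanglement
# CNOTs), with <Z> on qubit 0 as the scalar prediction.
import numpy as np

N = 4  # four input parameters -> four qubits

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)

def rz(phi):
    """Single-qubit Z rotation."""
    return np.array([[np.exp(-1j * phi / 2), 0], [0, np.exp(1j * phi / 2)]])

def ry(theta):
    """Single-qubit Y rotation."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def kron_all(mats):
    out = np.array([[1.0 + 0j]])
    for m in mats:
        out = np.kron(out, m)
    return out

def single(gate, q):
    """Embed a 1-qubit gate on wire q of the N-qubit register (q0 = MSB)."""
    return kron_all([gate if i == q else I2 for i in range(N)])

def cnot(ctrl, tgt):
    """Permutation matrix for CNOT(ctrl -> tgt) on N qubits."""
    dim = 2 ** N
    U = np.zeros((dim, dim))
    for b in range(dim):
        bits = [(b >> (N - 1 - i)) & 1 for i in range(N)]
        if bits[ctrl]:
            bits[tgt] ^= 1
        U[sum(bit << (N - 1 - i) for i, bit in enumerate(bits)), b] = 1
    return U

def qnn_predict(x, theta):
    """Return <Z_0> for inputs x (len 4) and ansatz angles theta (len 4)."""
    state = np.zeros(2 ** N, dtype=complex)
    state[0] = 1.0
    for q in range(N):                   # Z feature map: H then RZ(2 x_q)
        state = single(rz(2 * x[q]) @ H, q) @ state
    for q in range(N):                   # variational RY layer
        state = single(ry(theta[q]), q) @ state
    for q in range(N - 1):               # linear entanglement strategy
        state = cnot(q, q + 1) @ state
    z0 = single(np.diag([1.0, -1.0]), 0) # observable Z on qubit 0
    return float(np.real(state.conj() @ z0 @ state))
```

Varying the entanglement loop (e.g. circular or full pairwise CNOTs) is what distinguishes the six ansatz strategies; in practice the angles `theta` would be trained against the wind-power targets by a classical optimizer.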
Related papers
- Quantum Neural Networks for Wind Energy Forecasting: A Comparative Study of Performance and Scalability with Classical Models [0.0]
Quantum Neural Networks (QNNs) are emerging as a powerful alternative to classical machine learning methods. This study provides an in-depth investigation of QNNs for predicting the power output of a wind turbine. We experimentally demonstrate that QNNs can achieve predictive performance that is competitive with, and in some cases marginally better than, the benchmarked classical approaches.
arXiv Detail & Related papers (2025-06-28T10:51:27Z)
- RhoDARTS: Differentiable Quantum Architecture Search with Density Matrix Simulations [48.670876200492415]
Variational Quantum Algorithms (VQAs) are a promising approach for leveraging powerful Noisy Intermediate-Scale Quantum (NISQ) computers. We propose $\rho$DARTS, a differentiable Quantum Architecture Search (QAS) algorithm that models the search process as the evolution of a quantum mixed state.
arXiv Detail & Related papers (2025-06-04T08:30:35Z)
- Differentiable Quantum Architecture Search in Quantum-Enhanced Neural Network Parameter Generation [4.358861563008207]
Quantum neural networks (QNNs) have shown promise both empirically and theoretically. Hardware imperfections and limited access to quantum devices pose practical challenges. We propose an automated solution using differentiable optimization.
arXiv Detail & Related papers (2025-05-13T19:01:08Z)
- The inherent convolution property of quantum neural networks [1.799933345199395]
Quantum neural networks (QNNs) represent a pioneering intersection of quantum computing and deep learning. We unveil a fundamental convolution property inherent to QNNs, stemming from the natural parallelism of quantum gate operations on quantum states. We propose novel QCNN architectures that explicitly harness the convolutional nature of QNNs.
arXiv Detail & Related papers (2025-04-11T12:30:17Z)
- Training Classical Neural Networks by Quantum Machine Learning [9.002305736350833]
This work proposes a training scheme for classical neural networks (NNs) that utilizes the exponentially large Hilbert space of a quantum system.
Unlike existing quantum machine learning (QML) methods, the results obtained from quantum computers using our approach can be directly used on classical computers.
arXiv Detail & Related papers (2024-02-26T10:16:21Z)
- Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- Quantum Imitation Learning [74.15588381240795]
We propose quantum imitation learning (QIL) in the hope of utilizing quantum advantage to speed up imitation learning (IL).
We develop two QIL algorithms: quantum behavioural cloning (Q-BC) and quantum generative adversarial imitation learning (Q-GAIL).
Experimental results demonstrate that both Q-BC and Q-GAIL achieve performance comparable to their classical counterparts.
arXiv Detail & Related papers (2023-04-04T12:47:35Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional
Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn local message passing among nodes through a sequence of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparsity constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even better than, classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z)
- Evaluating the performance of sigmoid quantum perceptrons in quantum neural networks [0.0]
Quantum neural networks (QNNs) have been proposed as a promising architecture for quantum machine learning.
One candidate is the sigmoid quantum perceptron (SQP), designed to emulate the nonlinear activation functions of classical perceptrons.
We critically investigate both the capabilities and performance of SQP networks by computing their effective dimension and effective capacity.
arXiv Detail & Related papers (2022-08-12T10:08:11Z)
- The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z)
- Quantum Federated Learning with Quantum Data [87.49715898878858]
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences arising from its use.