Quantum Machine Learning for Climate Modelling
- URL: http://arxiv.org/abs/2512.14208v1
- Date: Tue, 16 Dec 2025 09:08:30 GMT
- Title: Quantum Machine Learning for Climate Modelling
- Authors: Mierk Schwabe, Lorenzo Pastori, Valentina Sarandrea, Veronika Eyring,
- Abstract summary: We present work on using a quantum neural network (QNN) to develop a parameterization of cloud cover for an Earth system model (ESM). We show that a QNN can predict cloud cover with performance similar to a classical NN with the same number of free parameters, and significantly better than the traditional scheme.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantum machine learning (QML) is making rapid progress, and QML-based models hold the promise of quantum advantages such as potentially higher expressivity and generalizability than their classical counterparts. Here, we present work on using a quantum neural network (QNN) to develop a parameterization of cloud cover for an Earth system model (ESM). ESMs are needed for predicting and projecting climate change, and can be improved in hybrid models incorporating both traditional physics-based components and machine learning (ML) models. We show that a QNN can predict cloud cover with performance similar to a classical NN with the same number of free parameters and significantly better than the traditional scheme. We also analyse the learning capability of the QNN in comparison to the classical NN and show that, at least for our example, QNNs learn more consistent relationships than classical NNs.
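The abstract does not spell out the circuit architecture, but the basic idea of a QNN regressor for a bounded quantity such as cloud cover can be sketched with a toy single-qubit state-vector simulation. Everything below (the angle encoding, the single trainable rotation, the rescaled Z readout) is illustrative, not the authors' model:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def qnn_predict(x, params):
    """Toy 1-qubit QNN: angle-encode a scaled input, apply one trainable
    RY layer, and read out <Z>, rescaled to [0, 1] as a cover fraction."""
    state = np.array([1.0, 0.0])                  # |0>
    state = ry(np.pi * x) @ state                 # data encoding
    state = ry(params[0]) @ state                 # trainable layer
    z_exp = abs(state[0])**2 - abs(state[1])**2   # <Z>
    return 0.5 * (1.0 + z_exp)                    # map [-1, 1] -> [0, 1]

pred = qnn_predict(x=0.3, params=np.array([0.1]))
```

Because consecutive RY rotations on |0> compose their angles, this toy prediction reduces to 0.5 * (1 + cos(pi*x + theta)); a real parameterization would use many qubits, entangling gates, and more input features.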
Related papers
- Hybrid Quantum-Classical Neural Networks for Few-Shot Credit Risk Assessment [52.05742536403784]
This work tackles the challenge of few-shot credit risk assessment. We design and implement a novel hybrid quantum-classical workflow in which a Quantum Neural Network (QNN) is trained via the parameter-shift rule. On a real-world credit dataset of 279 samples, our QNN achieved a robust average AUC of 0.852 +/- 0.027 in simulations and an AUC of 0.88 in the hardware experiment.
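The parameter-shift rule mentioned above evaluates exact gradients of circuit expectation values from two shifted circuit executions, which is why it works on hardware where backpropagation does not. A minimal single-qubit illustration, where <Z> after RY(theta) equals cos(theta):

```python
import numpy as np

def expectation(theta):
    """<Z> after RY(theta) on |0>, i.e. cos(theta) (state-vector sim)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    state = np.array([c, s])                      # RY(theta)|0>
    return abs(state[0])**2 - abs(state[1])**2

def parameter_shift_grad(theta, shift=np.pi / 2):
    """Exact gradient via the parameter-shift rule:
    d<Z>/dtheta = [E(theta + s) - E(theta - s)] / 2 with s = pi/2."""
    return 0.5 * (expectation(theta + shift) - expectation(theta - shift))

theta = 0.7
grad = parameter_shift_grad(theta)   # equals -sin(theta) exactly
```

Unlike finite differences, the pi/2 shift gives the analytic derivative for gates generated by Pauli operators, not an approximation.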
arXiv Detail & Related papers (2025-09-17T08:36:05Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [50.95799256262098]
Variational quantum circuits (VQCs) hold promise for quantum machine learning but face challenges in expressivity, trainability, and noise resilience. We propose VQC-MLPNet, a hybrid architecture in which a VQC generates the first-layer weights of a classical multilayer perceptron during training, while inference is performed entirely classically.
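The VQC-MLPNet idea, a circuit's measurement expectations supplying a classical layer's weights, can be caricatured in a few lines. The product-state "circuit" below (per-qubit RY rotations, so <Z_i> = cos(theta_i)) is a stand-in of my own; the paper's VQC is entangling and trained jointly with the classical parameters:

```python
import numpy as np

def vqc_expectations(thetas):
    """Product-state stand-in for a VQC: qubit i is RY(theta_i)|0>,
    so <Z_i> = cos(theta_i). A real VQC would entangle the qubits."""
    return np.cos(thetas)

def hybrid_forward(x, thetas, w2, n_hidden):
    """Circuit expectations become the first-layer weight matrix of a
    classical MLP; the forward pass itself is fully classical."""
    w1 = vqc_expectations(thetas).reshape(n_hidden, x.size)
    h = np.tanh(w1 @ x)        # classical hidden layer
    return w2 @ h              # classical output layer

rng = np.random.default_rng(0)
x = rng.normal(size=4)
thetas = rng.uniform(0, 2 * np.pi, size=8)   # 2 hidden units * 4 inputs
w2 = rng.normal(size=(1, 2))
y = hybrid_forward(x, thetas, w2, n_hidden=2)
```

The design point is that once the angles are trained, the weights can be measured once and cached, so deployment needs no quantum device.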
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - Comparative Analysis of QNN Architectures for Wind Power Prediction: Feature Maps and Ansatz Configurations [0.0]
Quantum Machine Learning (QML) aims to enhance classical machine learning methods by leveraging quantum mechanics principles such as entanglement and superposition.<n>This study extensively assesses Quantum Neural Networks (QNNs)-quantum-inspired counterparts of Artificial Neural Networks (ANNs)<n>We show that QNNs outperform classical methods in predictive tasks, underscoring the potential of QML in real-world applications.
arXiv Detail & Related papers (2025-05-31T19:17:53Z) - Multi-Scale Feature Fusion Quantum Depthwise Convolutional Neural Networks for Text Classification [3.0079490585515343]
We propose a novel quantum neural network (QNN) model based on quantum convolution.
We develop the quantum depthwise convolution that significantly reduces the number of parameters and lowers computational complexity.
We also introduce the multi-scale feature fusion mechanism to enhance model performance by integrating word-level and sentence-level features.
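The parameter savings of depthwise convolution are easiest to see in its classical form (the paper's quantum variant differs in mechanism, but the counting argument is analogous):

```python
def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution (biases omitted)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise (one k x k filter per input channel) followed by a
    1x1 pointwise convolution that mixes channels."""
    return k * k * c_in + c_in * c_out

std = standard_conv_params(3, 64, 64)         # 9 * 64 * 64 = 36864
dws = depthwise_separable_params(3, 64, 64)   # 576 + 4096  = 4672
```

For a 3x3 kernel with 64 input and output channels, the separable form uses roughly an eighth of the weights, which is the kind of reduction the quantum depthwise construction is after.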
arXiv Detail & Related papers (2024-05-22T10:19:34Z) - Training Classical Neural Networks by Quantum Machine Learning [9.002305736350833]
This work proposes a training scheme for classical neural networks (NNs) that utilizes the exponentially large Hilbert space of a quantum system.
Unlike existing quantum machine learning (QML) methods, the results obtained from quantum computers using our approach can be directly used on classical computers.
arXiv Detail & Related papers (2024-02-26T10:16:21Z) - Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation [0.0]
We present a novel framework for transferring knowledge from classical convolutional neural networks (CNNs) to quantum neural networks (QNNs). We conduct extensive experiments using two parameterized quantum circuits (PQCs) with 4 and 8 qubits on the MNIST, Fashion MNIST, and CIFAR10 datasets. Our results establish a promising paradigm for bridging classical deep learning and emerging quantum computing, paving the way for more powerful, resource-conscious models in quantum machine intelligence.
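Knowledge distillation transfers the teacher's softened output distribution to the student, and the standard objective applies whether the student is classical or a PQC: a temperature-scaled KL divergence between teacher and student softmaxes. This is a generic sketch of that loss, not the paper's exact formulation:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())    # shift for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    the standard knowledge-distillation objective."""
    p = softmax(teacher_logits / T)   # soft teacher targets
    q = softmax(student_logits / T)   # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher   = np.array([4.0, 1.0, 0.5])
loss_far  = distillation_loss(np.array([0.2, 3.0, 1.0]), teacher)
loss_near = distillation_loss(np.array([3.9, 1.1, 0.4]), teacher)
```

A student whose logits track the teacher's incurs near-zero loss; the temperature T > 1 flattens both distributions so the teacher's relative class preferences, not just its argmax, are transferred.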
arXiv Detail & Related papers (2023-11-23T05:06:43Z) - Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
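A QCBM generates samples via the Born rule: it prepares a parameterized state |psi(theta)> and draws bitstrings x with probability |<x|psi>|^2. A toy fixed-state version (a trained QCBM would prepare psi with a variational circuit rather than write it down):

```python
import numpy as np

def born_probabilities(state):
    """Born rule: bitstring x is sampled with probability |<x|psi>|^2."""
    p = np.abs(state) ** 2
    return p / p.sum()     # renormalize against float round-off

# Toy 2-qubit state with support on {00, 01, 11} only.
psi = np.array([1.0, 1.0, 0.0, 1.0]) / np.sqrt(3.0)
probs = born_probabilities(psi)
sample = np.random.default_rng(1).choice(4, p=probs)   # one generated bitstring
```

Training such a model means adjusting the circuit so this induced distribution matches the data distribution, which is exactly what the generalization race above evaluates.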
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - Classical-to-quantum convolutional neural network transfer learning [1.9336815376402723]
Machine learning using quantum convolutional neural networks (QCNNs) has demonstrated success in both quantum and classical data classification.
We propose transfer learning as an effective strategy for utilizing small QCNNs in the noisy intermediate-scale quantum era.
arXiv Detail & Related papers (2022-08-31T09:15:37Z) - Introducing Non-Linearity into Quantum Generative Models [0.0]
We introduce a model, the Quantum Neuron Born Machine (QNBM), that adds non-linear activations via a neural network structure onto the standard Born machine framework.
We compare our non-linear QNBM to the linear Quantum Circuit Born Machine (QCBM).
We show that while both models can easily learn a trivial uniform probability distribution, the QNBM achieves an almost 3x smaller error rate than the QCBM.
arXiv Detail & Related papers (2022-05-28T18:59:49Z) - The dilemma of quantum neural networks [63.82713636522488]
We show that quantum neural networks (QNNs) fail to provide any benefit over classical learning models.
QNNs suffer from severely limited effective model capacity, which incurs poor generalization on real-world datasets.
These results force us to rethink the role of current QNNs and to design novel protocols for solving real-world problems with quantum advantages.
arXiv Detail & Related papers (2021-06-09T10:41:47Z) - Toward Trainability of Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) have been proposed as generalizations of classical neural networks to achieve a quantum speed-up.
Serious bottlenecks exist for training QNNs because gradients vanish at a rate exponential in the number of input qubits.
We propose QNNs with tree-tensor and step-controlled structures for the application of binary classification. Simulations show faster convergence rates and better accuracy compared to QNNs with random structures.
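The exponential gradient vanishing behind this bottleneck can be reproduced in a toy product circuit with a global Z-string observable, where <Z...Z> = prod_i cos(theta_i) and the variance of the gradient over random initializations is exactly 2^(-n). This is a deliberately simplified caricature of the barren-plateau effect, not the paper's setting:

```python
import numpy as np

def grad_variance(n_qubits, n_samples=20000, seed=0):
    """Variance over random initializations of d<Z...Z>/dtheta_1 for a
    product circuit of RY(theta_i) rotations measured with the global
    observable Z^n. Here <Z...Z> = prod_i cos(theta_i), so the gradient
    w.r.t. theta_1 is -sin(theta_1) * prod_{i>1} cos(theta_i), whose
    variance over uniform angles is 2^(-n_qubits)."""
    rng = np.random.default_rng(seed)
    thetas = rng.uniform(0, 2 * np.pi, size=(n_samples, n_qubits))
    grads = -np.sin(thetas[:, 0]) * np.prod(np.cos(thetas[:, 1:]), axis=1)
    return grads.var()

v2, v8 = grad_variance(2), grad_variance(8)   # ~1/4 vs ~1/256
```

Already at 8 qubits the typical gradient component is an order of magnitude smaller than at 2, which is why structured ansatze (such as the tree-tensor circuits above) that avoid this scaling train faster.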
arXiv Detail & Related papers (2020-11-12T08:32:04Z) - On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even with gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.