Hybrid Quantum-Classical Recurrent Neural Networks
- URL: http://arxiv.org/abs/2510.25557v2
- Date: Tue, 04 Nov 2025 18:43:14 GMT
- Title: Hybrid Quantum-Classical Recurrent Neural Networks
- Authors: Wenduan Xu
- Abstract summary: We present a hybrid quantum-classical recurrent neural network architecture. The hidden state is the quantum state of an $n$-qubit parametrized quantum circuit (PQC) controlled by a classical feedforward network. We evaluate the model in simulation with up to 14 qubits on sentiment analysis, MNIST, permuted MNIST, copying memory, and language modeling.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a hybrid quantum-classical recurrent neural network (QRNN) architecture in which the recurrent core is realized as a parametrized quantum circuit (PQC) controlled by a classical feedforward network. The hidden state is the quantum state of an $n$-qubit PQC in an exponentially large Hilbert space $\mathbb{C}^{2^n}$, which serves as a coherent recurrent quantum memory. The PQC is unitary by construction, making the hidden-state evolution norm-preserving without external constraints. At each timestep, mid-circuit Pauli expectation-value readouts are combined with the input embedding and processed by the feedforward network, which provides explicit classical nonlinearity. The outputs parametrize the PQC, which updates the hidden state via unitary dynamics. The QRNN is compact and physically consistent, and it unifies (i) unitary recurrence as a high-capacity memory, (ii) partial observation via mid-circuit readouts, and (iii) nonlinear classical control for input-conditioned parametrization. We evaluate the model in simulation with up to 14 qubits on sentiment analysis, MNIST, permuted MNIST, copying memory, and language modeling. For sequence-to-sequence learning, we further devise a soft attention mechanism over the mid-circuit readouts and show its effectiveness for machine translation. To our knowledge, this is the first model (RNN or otherwise) grounded in quantum operations to achieve competitive performance against strong classical baselines across a broad class of sequence-learning tasks.
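The recurrent update described in the abstract (a unitary PQC hidden state, partial observation via Pauli-Z readouts, and a classical feedforward controller producing the circuit parameters) can be illustrated with a minimal statevector simulation. This is a hedged sketch, not the authors' implementation: the controller widths, embedding dimension, and the Ry-plus-CNOT ansatz are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of one QRNN timestep: the hidden state is an
# n-qubit statevector; a classical feedforward net maps the input
# embedding plus Pauli-Z readouts to PQC rotation angles, and the PQC
# updates the state unitarily (hence norm-preserving by construction).

N_QUBITS = 3
DIM = 2 ** N_QUBITS
rng = np.random.default_rng(0)

def apply_single_qubit(state, gate, qubit, n=N_QUBITS):
    """Apply a 2x2 gate to `qubit` of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.moveaxis(state, qubit, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(DIM)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def apply_cnot(state, control, target, n=N_QUBITS):
    state = state.reshape([2] * n)
    state = np.moveaxis(state, [control, target], [0, 1]).copy()
    flipped = state[1, ::-1].copy()   # flip target bit where control = 1
    state[1] = flipped
    state = np.moveaxis(state, [0, 1], [control, target])
    return state.reshape(DIM)

def pauli_z_readouts(state, n=N_QUBITS):
    """Simulated mid-circuit-style <Z_i> expectation values."""
    probs = np.abs(state.reshape([2] * n)) ** 2
    outs = []
    for q in range(n):
        p = np.moveaxis(probs, q, 0)
        outs.append(p[0].sum() - p[1].sum())
    return np.array(outs)

# Tiny classical controller (illustrative sizes; embedding dim 4 assumed).
W1 = rng.normal(scale=0.3, size=(8, N_QUBITS + 4))
W2 = rng.normal(scale=0.3, size=(N_QUBITS, 8))

def qrnn_step(state, x_embed):
    z = pauli_z_readouts(state)                     # partial observation
    h = np.tanh(W1 @ np.concatenate([x_embed, z]))  # classical nonlinearity
    angles = W2 @ h                                 # input-conditioned PQC params
    for q in range(N_QUBITS):                       # parametrized rotations
        state = apply_single_qubit(state, ry(angles[q]), q)
    for q in range(N_QUBITS - 1):                   # entangling layer
        state = apply_cnot(state, q, q + 1)
    return state

state = np.zeros(DIM, dtype=complex)
state[0] = 1.0                                      # |0...0>
for t in range(5):
    state = qrnn_step(state, rng.normal(size=4))
print(round(float(np.linalg.norm(state)), 6))       # prints 1.0: norm preserved
```

Because every applied operation is unitary, the hidden-state norm stays at 1 with no external constraint, which is the "physically consistent" property the abstract emphasizes.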
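The abstract also mentions a soft attention mechanism over the mid-circuit readouts for sequence-to-sequence learning. A plain scaled dot-product attention over per-timestep readout vectors gives the flavor; the paper's exact formulation may differ, and the dimensions and the decoder `query` here are assumptions.

```python
import numpy as np

# Hedged sketch: soft attention over a sequence of mid-circuit Pauli
# readout vectors (one per encoder timestep), scored against a decoder
# query, producing a context vector as a convex combination.

rng = np.random.default_rng(1)
T, d = 6, 4                            # sequence length, readout dimension
readouts = rng.normal(size=(T, d))     # stand-in for per-step Pauli readouts
query = rng.normal(size=d)             # stand-in for a decoder state

scores = readouts @ query / np.sqrt(d)
weights = np.exp(scores - scores.max())
weights /= weights.sum()               # softmax attention weights
context = weights @ readouts           # attended summary of the readouts
```

Because the readouts are classical expectation values, this attention layer is entirely classical and composes with the quantum recurrence without extra measurements.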
Related papers
- Quantum-Enhanced Neural Contextual Bandit Algorithms [50.880384999888044]
This paper introduces the Quantum Neural Tangent Kernel-Upper Confidence Bound (QNTK-UCB) algorithm, a novel method that leverages the Quantum Neural Tangent Kernel (QNTK).
arXiv Detail & Related papers (2026-01-06T09:58:14Z) - Quantum Visual Fields with Neural Amplitude Encoding [70.86293548779774]
We introduce a new type of Quantum Implicit Neural Representation (QINR) for 2D image and 3D geometric field learning. QVF (Quantum Visual Fields) encodes classical data into quantum statevectors using neural amplitude encoding grounded in a learnable energy manifold. Our ansatz follows a fully entangled design of learnable parametrised quantum circuits, with quantum (unitary) operations performed in the real Hilbert space.
arXiv Detail & Related papers (2025-08-14T17:59:52Z) - Quantum Convolutional Neural Network with Nonlinear Effects and Barren Plateau Mitigation [0.0]
Quantum neural networks (QNNs) leverage quantum entanglement and superposition to enable large-scale parallel linear computation. However, their practical deployment is hampered by the lack of intrinsic nonlinear operations and the barren plateau phenomenon. We propose a quantum convolutional neural network (QCNN) architecture that simultaneously addresses both issues.
arXiv Detail & Related papers (2025-08-04T14:26:48Z) - Quantum Recurrent Embedding Neural Network [11.54075064463256]
We propose a quantum recurrent embedding neural network (QRENN) inspired by fast-track information pathways in ResNet. We provide a rigorous proof of the trainability of QRENN circuits, demonstrating that this deep quantum neural network can avoid barren plateaus. Our results highlight the power of recurrent data embedding in quantum neural networks and the potential for scalable quantum supervised learning.
arXiv Detail & Related papers (2025-06-16T07:50:31Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [50.95799256262098]
Variational quantum circuits (VQCs) hold promise for quantum machine learning but face challenges in expressivity, trainability, and noise resilience. We propose VQC-MLPNet, a hybrid architecture where a VQC generates the first-layer weights of a classical multilayer perceptron during training, while inference is performed entirely classically.
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - Quantum Adaptive Self-Attention for Quantum Transformer Models [0.0]
We propose Quantum Adaptive Self-Attention (QASA), a novel hybrid architecture that enhances classical Transformer models with a quantum attention mechanism. QASA replaces dot-product attention with a parameterized quantum circuit (PQC) that adaptively captures inter-token relationships in the quantum Hilbert space. Experiments on synthetic time-series tasks demonstrate that QASA achieves faster convergence and superior generalization compared to both standard Transformers and reduced classical variants.
arXiv Detail & Related papers (2025-04-05T02:52:37Z) - Training Hybrid Deep Quantum Neural Network for Efficient Reinforcement Learning [3.753031740069576]
Quantum circuits embed data in a Hilbert space whose dimensionality grows exponentially with the number of qubits. We introduce qtDNN, a tangential surrogate that locally approximates a quantum circuit. We design hDQNN-TD3, a hybrid deep quantum neural network for continuous-control reinforcement learning.
arXiv Detail & Related papers (2025-03-12T07:12:02Z) - Memory-Augmented Hybrid Quantum Reservoir Computing [0.0]
We present a hybrid quantum-classical approach that implements memory through classical post-processing of quantum measurements.
We tested our model on two physical platforms: a fully connected Ising model and a Rydberg atom array.
arXiv Detail & Related papers (2024-09-15T22:44:09Z) - A Quantum-Classical Collaborative Training Architecture Based on Quantum State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Quantum Generative Diffusion Model: A Fully Quantum-Mechanical Model for Generating Quantum State Ensemble [40.06696963935616]
We introduce Quantum Generative Diffusion Model (QGDM) as their simple and elegant quantum counterpart.
QGDM exhibits faster convergence than the Quantum Generative Adversarial Network (QGAN).
It can achieve 53.02% higher fidelity in mixed-state generation than QGAN.
arXiv Detail & Related papers (2024-01-13T10:56:34Z) - Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z) - The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating those aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z) - Quantum Annealing for Neural Network optimization problems: a new approach via Tensor Network simulations [0.0]
Quantum Annealing (QA) is one of the most promising frameworks for quantum optimization.
We show that the adiabatic time evolution of QA can be efficiently represented as a suitable Tensor Network.
We show that the optimized state, expressed as a Matrix Product State (MPS), can be recast into a Quantum Circuit.
arXiv Detail & Related papers (2022-08-30T18:00:14Z) - Realizing Quantum Convolutional Neural Networks on a Superconducting Quantum Processor to Recognize Quantum Phases [2.1465372441653354]
Quantum neural networks that combine unitary operations, measurements, and feedforward to recognize specific features of quantum states promise to require fewer measurements and to tolerate errors.
We realize a quantum convolutional neural network (QCNN) on a 7-qubit superconducting quantum processor to identify symmetry-protected topological phases of a spin model characterized by a non-zero string order parameter.
We find that, despite being composed of finite-fidelity gates itself, the QCNN recognizes the topological phase with higher fidelity than direct measurements of the string order parameter for the prepared states.
arXiv Detail & Related papers (2021-09-13T12:32:57Z) - The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)