Meta-Learning for Quantum Optimization via Quantum Sequence Model
- URL: http://arxiv.org/abs/2512.05058v1
- Date: Thu, 04 Dec 2025 18:13:45 GMT
- Title: Meta-Learning for Quantum Optimization via Quantum Sequence Model
- Authors: Yu-Cheng Lin, Yu-Chao Hsu, Samuel Yen-Chi Chen
- Abstract summary: We show that the QK-LSTM optimizer achieves superior performance, obtaining the highest approximation ratios and exhibiting the fastest convergence rate across all tested problem sizes (n=10 to 13). This underscores the compact and expressive power of the quantum kernel architecture and its effectiveness for learned parameter initialization.
- Score: 14.766654275662589
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The Quantum Approximate Optimization Algorithm (QAOA) is a leading approach for solving combinatorial optimization problems on near-term quantum processors. However, finding good variational parameters remains a significant challenge due to the non-convex energy landscape, often resulting in slow convergence and poor solution quality. In this work, we propose a quantum meta-learning framework that trains advanced quantum sequence models to generate effective parameter initialization policies. We investigate four classical or quantum sequence models, including the Quantum Kernel-based Long Short-Term Memory (QK-LSTM), as learned optimizers in a "learning to learn" paradigm. Our numerical experiments on the Max-Cut problem demonstrate that the QK-LSTM optimizer achieves superior performance, obtaining the highest approximation ratios and exhibiting the fastest convergence rate across all tested problem sizes (n=10 to 13). Crucially, the QK-LSTM model achieves perfect parameter transferability by synthesizing a single, fixed set of near-optimal parameters, leading to a remarkable sustained acceleration of convergence even when generalizing to larger problems. This capability, enabled by the compact and expressive power of the quantum kernel architecture, underscores its effectiveness. The QK-LSTM, with only 43 trainable parameters, substantially outperforms the classical LSTM (56 parameters) and other quantum sequence models, establishing a robust pathway toward highly efficient parameter initialization for variational quantum algorithms in the NISQ era.
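The "learning to learn" inner loop described in the abstract can be illustrated with a short classical sketch. Everything below is a hypothetical stand-in, not the paper's implementation: a smooth non-convex toy function replaces the true QAOA Max-Cut energy, and a simple recurrent (state-carrying) update rule replaces the QK-LSTM optimizer; all names and hyperparameters are illustrative choices.

```python
import numpy as np

def qaoa_energy_stub(params):
    # Smooth non-convex toy function standing in for the QAOA Max-Cut
    # energy; its minimum lies near gamma = beta = pi/4.
    gamma, beta = params
    return -np.sin(2 * gamma) * np.sin(2 * beta) + 0.1 * np.cos(gamma)

def numerical_grad(f, params, eps=1e-5):
    # Central-difference gradient, as if estimated from circuit evaluations.
    grad = np.zeros_like(params)
    for i in range(len(params)):
        step = np.zeros_like(params)
        step[i] = eps
        grad[i] = (f(params + step) - f(params - step)) / (2 * eps)
    return grad

class RecurrentOptimizer:
    """Tiny recurrent update rule: a hidden state carries information
    across steps, loosely mimicking how an LSTM-style learned optimizer
    proposes parameter updates from its internal state."""
    def __init__(self, lr=0.02, decay=0.8):
        self.lr, self.decay, self.hidden = lr, decay, None

    def step(self, params, grad):
        if self.hidden is None:
            self.hidden = np.zeros_like(params)
        self.hidden = self.decay * self.hidden + grad  # recurrent state
        return params - self.lr * self.hidden          # proposed params

params = np.array([0.1, 0.1])          # initial (gamma, beta)
opt = RecurrentOptimizer()
history = [qaoa_energy_stub(params)]
for _ in range(50):                    # inner "optimizee" loop
    grad = numerical_grad(qaoa_energy_stub, params)
    params = opt.step(params, grad)
    history.append(qaoa_energy_stub(params))
print(f"energy: {history[0]:.3f} -> {history[-1]:.3f}")
```

In the paper's setting, the recurrent optimizer's weights would themselves be meta-trained across many Max-Cut instances so that it generalizes to unseen problem sizes; here they are fixed for brevity.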
Related papers
- Quantum Approximate Optimization Algorithm with Fixed Number of Parameters [0.0]
We introduce a novel quantum optimization paradigm: the Fixed Parameter Count Quantum Approximate Optimization Algorithm (FPC-QAOA). It is a scalable variational framework that maintains a constant number of trainable parameters regardless of the number of qubits, Hamiltonian complexity, or circuit depth. We benchmark FPC-QAOA on random MaxCut instances and the Tail Assignment Problem, achieving performance comparable to or better than standard QAOA.
arXiv Detail & Related papers (2025-12-24T14:02:31Z) - Quantum Approximate Optimization Algorithm for MIMO with Quantized b-bit Beamforming [47.98440449939344]
Multiple-input multiple-output (MIMO) is critical for 6G communication, offering improved spectral efficiency and reliability. This paper explores the use of the Quantum Approximate Optimization Algorithm (QAOA) and alternating optimization to address the problem of b-bit quantized phase shifters at both the transmitter and the receiver. We demonstrate that the structure of this quantized beamforming problem aligns naturally with hybrid quantum-classical methods like QAOA, as the phase shifts used in beamforming can be directly mapped to rotation gates in a quantum circuit.
arXiv Detail & Related papers (2025-10-07T17:53:02Z) - TensorHyper-VQC: A Tensor-Train-Guided Hypernetwork for Robust and Scalable Variational Quantum Computing [50.95799256262098]
We introduce TensorHyper-VQC, a novel tensor-train (TT)-guided hypernetwork framework for quantum machine learning. Our framework delegates the generation of quantum circuit parameters to a classical TT network, effectively decoupling optimization from quantum hardware. These results position TensorHyper-VQC as a scalable and noise-resilient framework for advancing practical quantum machine learning on near-term devices.
arXiv Detail & Related papers (2025-08-01T23:37:55Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [50.95799256262098]
Variational quantum circuits (VQCs) hold promise for quantum machine learning but face challenges in expressivity, trainability, and noise resilience. We propose VQC-MLPNet, a hybrid architecture in which a VQC generates the first-layer weights of a classical multilayer perceptron during training, while inference is performed entirely classically.
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - Provably Robust Training of Quantum Circuit Classifiers Against Parameter Noise [49.97673761305336]
Noise remains a major obstacle to achieving reliable quantum algorithms. We present a provably noise-resilient training theory and algorithm to enhance the robustness of parameterized quantum circuit classifiers.
arXiv Detail & Related papers (2025-05-24T02:51:34Z) - Accelerating Parameter Initialization in Quantum Chemical Simulations via LSTM-FC-VQE [5.396660696277483]
We use Long Short-Term Memory (LSTM) neural networks to speed up quantum chemical simulations. By training the LSTM on optimized parameters from small molecules, the model learns to predict high-quality initializations for larger systems.
arXiv Detail & Related papers (2025-05-16T04:19:00Z) - Learning to Learn with Quantum Optimization via Quantum Neural Networks [1.7819574476785418]
We introduce a quantum meta-learning framework built on quantum neural networks, specifically the Quantum Long Short-Term Memory (QLSTM). Our approach rapidly generalizes to larger, more complex problems, substantially reducing the number of iterations required for convergence.
arXiv Detail & Related papers (2025-05-01T14:39:26Z) - Q-MAML: Quantum Model-Agnostic Meta-Learning for Variational Quantum Algorithms [4.525216077859531]
We introduce a new framework for optimizing parameterized quantum circuits (PQCs) that employs a classical technique inspired by Model-Agnostic Meta-Learning (MAML). Our framework features a classical neural network, called the Learner, which interacts with a PQC by supplying its output as the initial parameters. In the adaptation phase, the framework requires only a few PQC updates to converge to a more accurate value, while the Learner remains unchanged.
arXiv Detail & Related papers (2025-01-10T12:07:00Z) - ML-QLS: Multilevel Quantum Layout Synthesis [6.706813469929441]
We present ML-QLS, the first multilevel quantum layout tool, with a scalable refinement operation integrated with novel cost functions and clustering strategies. Our experimental results demonstrate that ML-QLS can scale up to problems involving hundreds of qubits and achieves a remarkable 52% performance improvement over leading QLS tools for large circuits.
arXiv Detail & Related papers (2024-05-28T17:10:20Z) - A self-consistent field approach for the variational quantum
eigensolver: orbital optimization goes adaptive [52.77024349608834]
We present a self consistent field approach (SCF) within the Adaptive Derivative-Assembled Problem-Assembled Ansatz Variational Eigensolver (ADAPTVQE)
This framework is used for efficient quantum simulations of chemical systems on nearterm quantum computers.
arXiv Detail & Related papers (2022-12-21T23:15:17Z) - Variational Quantum Optimization with Multi-Basis Encodings [62.72309460291971]
We introduce a new variational quantum algorithm that benefits from two innovations: multi-basis graph encodings and nonlinear activation functions.
These innovations result in improved optimization performance, more favorable effective landscapes, and a reduction in measurement requirements.
arXiv Detail & Related papers (2021-06-24T20:16:02Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools for maximizing the use of Noisy Intermediate-Scale Quantum devices.
We propose a pruning strategy for the ansätze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered among the most promising approaches for exploiting near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape the energy landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
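The cross-entropy search described in the last entry can be sketched classically. The snippet below is a minimal, hypothetical illustration: a smooth toy function replaces the actual QAOA energy landscape, and the sample size, elite fraction, and iteration count are arbitrary choices, not values from that paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(params):
    # Toy stand-in for a QAOA energy landscape over (gamma, beta);
    # minima sit at gamma = beta = pi/4 (up to symmetry), with energy -1.
    gamma, beta = params
    return -np.sin(2 * gamma) * np.sin(2 * beta)

# Cross-entropy method: sample candidate parameter sets from a Gaussian,
# keep the elite (lowest-energy) fraction, refit the Gaussian to the
# elites, and repeat until the distribution concentrates.
mean, std = np.array([0.5, 0.5]), np.ones(2)
for _ in range(30):
    samples = rng.normal(mean, std, size=(100, 2))
    scores = np.array([energy(s) for s in samples])
    elites = samples[np.argsort(scores)[:10]]   # 10% elite fraction
    mean, std = elites.mean(axis=0), elites.std(axis=0) + 1e-3
print(mean, energy(mean))
```

The small additive floor on `std` keeps the sampling distribution from collapsing prematurely, a common safeguard in cross-entropy implementations.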
This list is automatically generated from the titles and abstracts of the papers in this site.