Entanglement for Pattern Learning in Temporal Data with Logarithmic Complexity: Benchmarking on IBM Quantum Hardware
- URL: http://arxiv.org/abs/2506.00097v1
- Date: Fri, 30 May 2025 12:16:08 GMT
- Title: Entanglement for Pattern Learning in Temporal Data with Logarithmic Complexity: Benchmarking on IBM Quantum Hardware
- Authors: Mostafizur Rahaman Laskar, Richa Goel
- Abstract summary: Time series forecasting is foundational in scientific and technological domains, from climate modelling to molecular dynamics. We propose a quantum-native time series forecasting framework that harnesses entanglement-based parameterized quantum circuits to learn temporal dependencies. We benchmark QTS against classical models on synthetic and real-world datasets, including geopotential height fields used in numerical weather prediction.
- Score: 1.2277343096128712
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series forecasting is foundational in scientific and technological domains, from climate modelling to molecular dynamics. Classical approaches have significantly advanced sequential prediction, including autoregressive models and deep learning architectures such as temporal convolutional networks (TCNs) and Transformers. Yet they remain resource-intensive and often scale poorly in data-limited or hardware-constrained settings. We propose a quantum-native time series forecasting framework that harnesses entanglement-based parameterized quantum circuits to learn temporal dependencies. Our Quantum Time Series (QTS) model encodes normalized sequential data into single-qubit rotations and embeds temporal structure through structured entanglement patterns. This design targets predictive performance with logarithmic complexity in both training-data requirements and parameter count. We benchmark QTS against classical models on synthetic and real-world datasets, including geopotential height fields used in numerical weather prediction. Experiments on noisy simulated backends and real IBM quantum hardware demonstrate that QTS can capture temporal patterns using fewer data points. Hardware benchmarking results establish quantum entanglement as a practical computational resource for temporal modelling, with potential near-term applications in nano-scale systems, quantum sensor networks, and other forecasting scenarios.
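To make the encoding concrete, below is a minimal Qiskit sketch of the general scheme the abstract describes: each normalized observation in a window drives a single-qubit RY rotation, and nearest-neighbour CNOTs interleaved with trainable rotations impose a structured entanglement pattern across time steps. The layer count, gate pattern, and parameter names are illustrative assumptions, not the authors' actual QTS ansatz.

```python
import numpy as np
from qiskit.circuit import QuantumCircuit, Parameter

def qts_style_circuit(window, n_layers=2):
    """Angle-encode a normalized window (values in [0, 1]) into RY
    rotations, then couple adjacent time steps with CNOT chains and
    trainable rotations. Illustrative reconstruction, not the paper's
    exact ansatz."""
    n = len(window)
    qc = QuantumCircuit(n)
    for q, x in enumerate(window):
        qc.ry(np.pi * x, q)            # data encoding: one qubit per time step
    params = []
    for layer in range(n_layers):
        for q in range(n - 1):
            qc.cx(q, q + 1)            # nearest-neighbour temporal coupling
        for q in range(n):
            theta = Parameter(f"theta_{layer}_{q}")
            params.append(theta)
            qc.ry(theta, q)            # trainable single-qubit rotation
    qc.measure_all()
    return qc, params

circuit, params = qts_style_circuit([0.1, 0.4, 0.8, 0.6])
print(circuit.draw())
```

Note that the trainable parameter count here grows only with the number of qubits and entangling layers, which is consistent in spirit with the paper's emphasis on small parameter budgets.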
Related papers
- Quantum Temporal Fusion Transformer [4.757470449749876]
Temporal Fusion Transformer (TFT) is a state-of-the-art attention-based deep neural network architecture specifically designed for multi-horizon time series forecasting. We propose a Quantum Temporal Fusion Transformer (QTFT), a quantum-enhanced hybrid quantum-classical architecture that extends the capabilities of the classical framework.
arXiv Detail & Related papers (2025-08-06T03:21:20Z) - Quantum generative modeling for financial time series with temporal correlations [0.9636431845459937]
We investigate whether quantum correlations in quantum-inspired models of QGANs can help in the generation of financial time series. We train QGANs, composed of a quantum generator and a classical discriminator, and investigate two approaches for simulating the quantum generator.
arXiv Detail & Related papers (2025-07-29T17:36:49Z) - Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [55.09326865401653]
We introduce FNF as the backbone and DBD as the architecture, providing excellent learning capabilities and optimal learning pathways for spatial-temporal modeling. We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
arXiv Detail & Related papers (2025-06-10T18:40:20Z) - Unraveling Quantum Environments: Transformer-Assisted Learning in Lindblad Dynamics [0.0]
We introduce a Transformer-based machine learning framework to infer time-dependent dissipation rates in quantum systems. We demonstrate the effectiveness of our approach on a hierarchy of open quantum models of increasing complexity. Our results suggest that modern machine learning tools can serve as scalable and data-driven alternatives for identifying unknown environments in open quantum systems.
arXiv Detail & Related papers (2025-05-11T10:18:19Z) - Quantum Kernel-Based Long Short-term Memory for Climate Time-Series Forecasting [0.24739484546803336]
We present the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network, which integrates quantum kernel methods into classical LSTM architectures. QK-LSTM captures intricate nonlinear dependencies and temporal dynamics with fewer trainable parameters.
arXiv Detail & Related papers (2024-12-12T01:16:52Z) - Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization [74.3339999119713]
We develop a wavelet-based tokenizer that allows models to learn complex representations directly in the space of time-localized frequencies. Our method first scales and decomposes the input time series, then thresholds and quantizes the wavelet coefficients, and finally pre-trains an autoregressive model to forecast coefficients for the forecast horizon.
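The described pipeline (scale, decompose, threshold, quantize) maps naturally onto PyWavelets; a rough sketch follows. The wavelet family, decomposition level, threshold, and bin count are illustrative guesses, not the paper's settings.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_tokenize(series, wavelet="db4", level=3, threshold=0.1, n_bins=256):
    """Scale a series, decompose it with a discrete wavelet transform,
    zero out small coefficients, and quantize the rest into integer
    token ids. All hyperparameters here are illustrative."""
    # 1. Scale to zero mean / unit variance
    x = (series - series.mean()) / (series.std() + 1e-8)
    # 2. Multi-level wavelet decomposition, flattened into one vector
    coeffs = np.concatenate(pywt.wavedec(x, wavelet, level=level))
    # 3. Threshold: suppress small (mostly noise) coefficients
    coeffs[np.abs(coeffs) < threshold] = 0.0
    # 4. Uniform quantization of surviving coefficients into token ids
    lo, hi = coeffs.min(), coeffs.max()
    return np.floor((coeffs - lo) / (hi - lo + 1e-8) * (n_bins - 1)).astype(int)

tokens = wavelet_tokenize(np.sin(np.linspace(0, 8 * np.pi, 512)))
print(tokens[:16])
```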
arXiv Detail & Related papers (2024-12-06T18:22:59Z) - Generalizing Weather Forecast to Fine-grained Temporal Scales via Physics-AI Hybrid Modeling [55.13352174687475]
This paper proposes a physics-AI hybrid model (i.e., WeatherGFT) that generalizes weather forecasts to finer-grained temporal scales beyond the training dataset. Specifically, we employ a carefully designed PDE kernel to simulate physical evolution on a small time scale. We also introduce a lead time-aware training framework to promote the generalization of the model at different lead times.
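As a toy illustration of the physics-AI hybrid idea, the sketch below alternates a small-time-scale finite-difference advection step (standing in for the PDE kernel) with a learned residual correction. Both components are placeholders, not WeatherGFT's actual kernel or network.

```python
import numpy as np

def pde_step(field, velocity=1.0, dx=1.0, dt=0.1):
    """One explicit upwind advection step: a stand-in for a physics
    kernel that simulates evolution on a small time scale."""
    grad = (field - np.roll(field, 1)) / dx
    return field - velocity * dt * grad

def hybrid_forecast(field, lead_steps, correction_fn):
    """Alternate small physics steps with a learned residual correction.
    correction_fn is a placeholder for the neural component."""
    for _ in range(lead_steps):
        field = pde_step(field)
        field = field + correction_fn(field)   # AI residual correction
    return field

# Example with a dummy (zero) correction in place of a trained network
state = np.sin(np.linspace(0, 2 * np.pi, 64))
forecast = hybrid_forecast(state, lead_steps=10, correction_fn=lambda f: 0.0 * f)
```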
arXiv Detail & Related papers (2024-05-22T16:21:02Z) - Learning to Program Variational Quantum Circuits with Fast Weights [3.6881738506505988]
This paper introduces the Quantum Fast Weight Programmers (QFWP) as a solution to the temporal or sequential learning challenge.
The proposed QFWP model achieves learning of temporal dependencies without necessitating the use of quantum recurrent neural networks.
Numerical simulations conducted in this study showcase the efficacy of the proposed QFWP model in both time-series prediction and RL tasks.
arXiv Detail & Related papers (2024-02-27T18:53:18Z) - Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale [54.15522908057831]
We propose an adapted version of the computationally efficient MLP-Mixer for STTD forecasting at scale.
Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks.
Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
arXiv Detail & Related papers (2023-07-04T05:19:19Z) - OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectory, human motion, driving scenes, traffic flow and forecasting weather.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
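A minimal single-qubit data re-uploading circuit in Qiskit might look as follows: the same input is re-encoded in every layer, interleaved with trainable rotations. The layer count and gate pattern are illustrative choices, not the paper's exact formulation.

```python
from qiskit.circuit import QuantumCircuit, Parameter

def reuploading_circuit(x, n_layers=3):
    """Single-qubit data re-uploading: the scalar input x is re-encoded
    in every layer, interleaved with trainable rotations. Layer count
    and gate pattern are illustrative."""
    qc = QuantumCircuit(1)
    params = []
    for layer in range(n_layers):
        qc.ry(x, 0)                    # re-upload the data point
        a = Parameter(f"a_{layer}")
        b = Parameter(f"b_{layer}")
        params += [a, b]
        qc.rz(a, 0)                    # trainable processing block
        qc.ry(b, 0)
    qc.measure_all()
    return qc, params

circuit, params = reuploading_circuit(x=0.7)
print(circuit.draw())
```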
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - A quantum generative model for multi-dimensional time series using Hamiltonian learning [0.0]
We propose using the inherent ability of quantum computers to simulate quantum dynamics as a technique to encode temporal features of the time series.
We use the learned model to generate out-of-sample time series and show that it captures unique and complex features of the learned time series.
We experimentally demonstrate the proposed algorithm on an 11-qubit trapped-ion quantum machine.
arXiv Detail & Related papers (2022-04-13T03:06:36Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
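The core update is simple enough to sketch: a linear first-order system whose effective time constant is modulated by a nonlinear, input-dependent gate, here integrated with an explicit Euler step. Sizes, weights, and the gate nonlinearity are illustrative; this follows the LTC state equation in spirit rather than reproducing the paper's implementation.

```python
import numpy as np

def ltc_step(x, inp, tau, W, A, dt=0.01):
    """One Euler step of a liquid time-constant unit:
    dx/dt = -(1/tau + f) * x + f * A, where f is an input- and
    state-dependent nonlinear gate."""
    f = np.tanh(W @ np.concatenate([x, inp]))   # nonlinear gate
    dxdt = -(1.0 / tau + f) * x + f * A
    return x + dt * dxdt

rng = np.random.default_rng(0)
n, m = 4, 2                       # hidden units, input dims (illustrative)
x = np.zeros(n)
W = rng.normal(size=(n, n + m))   # random stand-in for learned weights
A = rng.normal(size=n)            # per-unit bias level
for t in range(100):
    x = ltc_step(x, inp=np.array([np.sin(0.1 * t), 1.0]), tau=1.0, W=W, A=A)
print(x)
```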
arXiv Detail & Related papers (2020-06-08T09:53:35Z)