One-Step Time Series Forecasting Using Variational Quantum Circuits
- URL: http://arxiv.org/abs/2207.07982v1
- Date: Sat, 16 Jul 2022 16:50:28 GMT
- Title: One-Step Time Series Forecasting Using Variational Quantum Circuits
- Authors: Payal Kaushik, Sayantan Pramanik, M Girish Chandra, C V Sridhar
- Abstract summary: Machine learning scientists define a time series as a set of observations recorded over consistent time steps.
Time is essential in forecasting, as it determines how observations relate across the dataset and influence the final result.
Quantum computers may prove to be a better option for perceiving trends in a time series by exploiting quantum mechanical phenomena.
- Score: 1.1934558041641545
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Time series forecasting has always been a thought-provoking topic in
the field of machine learning. Machine learning scientists define a time series
as a set of observations recorded over consistent time steps, and time series
forecasting as a way of analyzing the data to find how variables change over
time and hence predict future values. Time is essential in this forecasting,
as it determines how observations relate across the dataset and influence the
final result. Forecasting also requires a large dataset to ascertain regularity
and reliability. Quantum computers may prove to be a better option for
perceiving trends in a time series by exploiting quantum mechanical phenomena
such as superposition and entanglement. Here, we consider one-step time series
forecasting using variational quantum circuits, and record observations for
different datasets.
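The paper itself does not include code; as a rough illustration of the idea only, the sketch below simulates a minimal one-parameter variational circuit classically (the single-qubit RY ansatz, function names, and toy data are our own simplification, not the authors' circuit). The last observation is angle-encoded, a trainable rotation is applied, and the Pauli-Z expectation serves as the one-step-ahead prediction, trained with the parameter-shift rule.

```python
import math

def ry(theta, state):
    """Apply an RY rotation to a real single-qubit state (a, b)."""
    a, b = state
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return (c * a - s * b, s * a + c * b)

def expval_z(state):
    """Expectation value of Pauli-Z: |a|^2 - |b|^2."""
    a, b = state
    return a * a - b * b

def predict(x, theta):
    """Angle-encode x, apply the trainable rotation, measure <Z>."""
    state = (1.0, 0.0)        # |0>
    state = ry(x, state)      # data-encoding layer
    state = ry(theta, state)  # variational layer
    return expval_z(state)    # equals cos(x + theta) for this ansatz

def train(series, epochs=200, lr=0.1):
    """Fit theta so predict(x_t) approximates x_{t+1} on an MSE loss."""
    theta = 0.5
    pairs = list(zip(series[:-1], series[1:]))
    for _ in range(epochs):
        grad = 0.0
        for x, y in pairs:
            # parameter-shift rule: exact gradient of <Z> w.r.t. theta
            g = 0.5 * (predict(x, theta + math.pi / 2)
                       - predict(x, theta - math.pi / 2))
            grad += 2 * (predict(x, theta) - y) * g
        theta -= lr * grad / len(pairs)
    return theta

# toy series where x_{t+1} = cos(x_t + 0.7), matching the model's capacity;
# training should recover theta close to 0.7
series = [0.2]
for _ in range(19):
    series.append(math.cos(series[-1] + 0.7))

theta = train(series)
pred = predict(series[-1], theta)  # one-step-ahead forecast
```

A real experiment would use a multi-qubit, multi-layer ansatz and a windowed encoding; this stripped-down version only shows the encode-rotate-measure-optimize loop that variational forecasting rests on.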
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables [75.83318701911274]
TimeXer ingests external information to enhance the forecasting of endogenous variables.
TimeXer achieves consistent state-of-the-art performance on twelve real-world forecasting benchmarks.
arXiv Detail & Related papers (2024-02-29T11:54:35Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present Moirai, a masked encoder-based universal time series forecasting Transformer.
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA), featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Respecting Time Series Properties Makes Deep Time Series Forecasting Perfect [3.830797055092574]
How to handle time features is the core question for any time series forecasting model.
In this paper, we rigorously analyze three prevalent but deficient or unfounded deep time series forecasting mechanisms.
We propose a novel time series forecasting network, RTNet, on the basis of the aforementioned analysis.
arXiv Detail & Related papers (2022-07-22T08:34:31Z)
- A quantum generative model for multi-dimensional time series using Hamiltonian learning [0.0]
We propose using the inherent nature of quantum computers to simulate quantum dynamics as a technique to encode such features.
We use the learned model to generate out-of-sample time series and show that it captures unique and complex features of the learned time series.
We experimentally demonstrate the proposed algorithm on an 11-qubit trapped-ion quantum machine.
arXiv Detail & Related papers (2022-04-13T03:06:36Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- AutoFITS: Automatic Feature Engineering for Irregular Time Series [0.44198435146063353]
In irregular time series, the time at which each observation is collected may be helpful to summarise the dynamics of the data and improve forecasting performance.
We develop a novel automatic feature engineering framework that focuses on extracting information from the time at which each instance is collected.
We study how valuable this information is by integrating it in a time series forecasting workflow and investigate how it compares to or complements state-of-the-art methods for regular time series forecasting.
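The summary above does not reproduce AutoFITS's actual feature set, but the general idea of deriving predictors from *when* observations arrive can be sketched as follows (the helper name, window size, and feature choices are illustrative, not the paper's):

```python
def time_features(timestamps, values, window=3):
    """For each target point in an irregular series, build features from
    both recent values and the timing of the recent observations."""
    rows = []
    for i in range(window, len(values)):
        ts = timestamps[i - window:i]
        gaps = [b - a for a, b in zip(ts, ts[1:])]  # inter-arrival times
        rows.append({
            "lag_values": values[i - window:i],       # standard value lags
            "mean_gap": sum(gaps) / len(gaps),        # local sampling density
            "last_gap": timestamps[i] - ts[-1],       # staleness of newest lag
            "target": values[i],
        })
    return rows

# irregularly sampled toy series: bursts followed by long gaps
t = [0.0, 1.0, 1.2, 3.5, 3.7, 6.0]
x = [10, 11, 11, 14, 14, 17]
rows = time_features(t, x)
```

Features like `mean_gap` and `last_gap` can then be concatenated with ordinary lagged values and fed to any regressor, which is the spirit of integrating timing information into a forecasting workflow.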
arXiv Detail & Related papers (2021-12-29T19:42:48Z)
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in domains such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can then be used to characterize different types of time series.
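One widely used instance of such a mapping, offered here as our own illustration rather than the paper's specific method, is the natural visibility graph: each observation becomes a node, and two observations are linked if a straight line drawn between them clears every observation in between.

```python
def visibility_graph(series):
    """Map a time series to an edge set via the natural visibility
    criterion: points (i, y_i) and (j, y_j) are connected iff every
    intermediate point (k, y_k) lies strictly below the line joining them."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j]
                + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)  # empty for adjacent points
            )
            if visible:
                edges.add((i, j))
    return edges

# two peaks with a smaller bump between them: the peaks see each other
# over the bump, but the valleys do not see past it
edges = visibility_graph([3.0, 1.0, 2.0, 1.0, 3.0])
```

Standard graph statistics (degree distribution, clustering, path lengths) computed on `edges` then serve as time series features, which is the kind of characterization the network-based approach enables.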
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
- Instance-wise Graph-based Framework for Multivariate Time Series Forecasting [69.38716332931986]
We propose a simple yet efficient instance-wise graph-based framework to utilize the inter-dependencies of different variables at different time stamps.
The key idea of our framework is aggregating information from the historical time series of different variables to the current time series that we need to forecast.
arXiv Detail & Related papers (2021-09-14T07:38:35Z)
- Time Series is a Special Sequence: Forecasting with Sample Convolution and Interaction [9.449017120452675]
Time series is a special type of sequence data, a set of observations collected at even intervals of time and ordered chronologically.
Existing deep learning techniques use generic sequence models for time series analysis, which ignore some of its unique properties.
We propose a novel neural network architecture and apply it for the time series forecasting problem, wherein we conduct sample convolution and interaction at multiple resolutions for temporal modeling.
arXiv Detail & Related papers (2021-06-17T08:15:04Z)
- Deep Transformer Models for Time Series Forecasting: The Influenza Prevalence Case [2.997238772148965]
Time series data are prevalent in many scientific and engineering disciplines.
We present a new approach to time series forecasting using Transformer-based machine learning models.
We show that the forecasting results produced by our approach are favorably comparable to the state-of-the-art.
arXiv Detail & Related papers (2020-01-23T00:22:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.