Application of time-series quantum generative model to financial data
- URL: http://arxiv.org/abs/2405.11795v1
- Date: Mon, 20 May 2024 05:29:45 GMT
- Title: Application of time-series quantum generative model to financial data
- Authors: Shun Okumura, Masayuki Ohzeki, Masaya Abe
- Abstract summary: A time-series generative model was applied as a quantum generative model to actual financial data.
It was observed that fewer parameter values were required compared with the classical method.
- Score: 1.2289361708127877
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although a quantum generative model for time series that successfully learns correlated series with multiple Brownian motions has been proposed, it has not yet been adapted and evaluated for financial problems. In this study, this time-series quantum generative model was applied to actual financial data. Future data for two correlated time series were generated and compared with classical methods such as long short-term memory and vector autoregression. Furthermore, numerical experiments were performed to complete missing values. Based on the results, we evaluated the practical applicability of the time-series quantum generative model. It was observed that fewer parameters were required compared with the classical methods. In addition, the quantum time-series generative model was feasible for both stationary and nonstationary data. These results suggest that a model with few parameters can be applied to various types of time-series data.
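The vector-autoregression baseline named in the abstract can be sketched with plain NumPy. This is an illustrative sketch, not the paper's setup: the simulated correlated series, the VAR order (1), and all constants here are assumptions. It also makes the parameter-count comparison concrete — even a bivariate VAR(1) needs 2×2 + 2 = 6 parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two correlated Brownian-motion-like series
# (hypothetical stand-in for the paper's actual financial data)
cov = [[1.0, 0.8], [0.8, 1.0]]
steps = rng.multivariate_normal([0.0, 0.0], cov, size=500)
series = np.cumsum(steps, axis=0)          # shape (500, 2), nonstationary levels

# Fit on first differences so the VAR sees (roughly) stationary data
r = np.diff(series, axis=0)                # shape (499, 2)

# VAR(1) by ordinary least squares: r_t ~ A r_{t-1} + c
X = np.hstack([r[:-1], np.ones((len(r) - 1, 1))])   # lagged returns + intercept
Y = r[1:]
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)        # (3, 2) coefficient matrix

# Iterate the fitted model forward to generate a 10-step forecast path
last, path = r[-1], []
for _ in range(10):
    last = np.append(last, 1.0) @ coef
    path.append(last)
forecast = series[-1] + np.cumsum(path, axis=0)     # back to level space
```

The parameter count grows quadratically in the number of series and linearly in the lag order, which is the kind of classical baseline cost against which a quantum generative model's parameter efficiency would be measured.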
Related papers
- Benchmarking Quantum Models for Time-series Forecasting [0.3806074545662052]
We compare classical and quantum models for time series forecasting.
Most of the quantum models were able to achieve comparable results.
Results serve as a useful point of comparison for the field of forecasting with quantum machine learning.
arXiv Detail & Related papers (2024-12-18T14:17:17Z)
- Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts [103.725112190618]
This paper introduces Moirai-MoE, which uses a single input/output projection layer while delegating the modeling of diverse time series patterns to a sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z)
- Recent Trends in Modelling the Continuous Time Series using Deep Learning: A Survey [0.18434042562191813]
Continuous-time series data are essential in modern application areas such as healthcare, automobiles, energy, finance, and the Internet of Things (IoT).
This paper describes the general problem domain of time series and reviews the challenges of modelling continuous time series.
arXiv Detail & Related papers (2024-09-13T14:19:44Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the Perspective of Partial Differential Equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Parameterized quantum circuits as universal generative models for continuous multivariate distributions [1.118478900782898]
Parameterized quantum circuits have been extensively used as the basis for machine learning models in regression, classification, and generative tasks.
In this work, we elucidate expectation value sampling-based models and prove the universality of such variational quantum algorithms.
Our results may help guide the design of future quantum circuits in generative modelling tasks.
arXiv Detail & Related papers (2024-02-15T10:08:31Z)
- Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z)
- Parameterization of state duration in Hidden semi-Markov Models: an application in electrocardiography [0.0]
We introduce a parametric model for time series pattern recognition and provide a maximum-likelihood estimation of its parameters.
An application on classification reveals the main strengths and weaknesses of each alternative.
arXiv Detail & Related papers (2022-11-17T11:51:35Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Deep Time Series Models for Scarce Data [8.673181404172963]
Time series data have grown at an explosive rate in numerous domains and have stimulated a surge of time series modeling research.
Data scarcity is a universal issue that occurs in a vast range of data analytics problems.
arXiv Detail & Related papers (2021-03-16T22:16:54Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
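Once a model like the one above predicts a per-timestamp mean and variance, anomalies can be ranked by negative Gaussian log-likelihood. The sketch below illustrates only that scoring step, not the SISVAE model itself; the signal and the mean/variance values are hypothetical stand-ins for a decoder's outputs.

```python
import numpy as np

def gaussian_nll(x, mu, var):
    """Per-timestamp anomaly score: negative log-likelihood of x under
    a Gaussian with the model's predicted mean and variance."""
    return 0.5 * (np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)

# Toy signal with a single spike; mu/var stand in for decoder outputs
x = np.zeros(100)
x[50] = 5.0
mu = np.zeros(100)            # hypothetical predicted means
var = np.full(100, 0.25)      # hypothetical predicted variances
scores = gaussian_nll(x, mu, var)
print(int(scores.argmax()))   # prints 50: the spike is the top-scoring timestamp
```

Learning the variance per timestamp, rather than fixing it globally, is what lets such a score adapt its sensitivity to locally noisy regions of the series.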
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.