Universal randomised signatures for generative time series modelling
- URL: http://arxiv.org/abs/2406.10214v2
- Date: Fri, 6 Sep 2024 15:28:03 GMT
- Title: Universal randomised signatures for generative time series modelling
- Authors: Francesca Biagini, Lukas Gonon, Niklas Walter
- Abstract summary: We employ randomised signature to introduce a generative model for financial time series data.
Specifically, we propose a novel Wasserstein-type distance based on discrete-time randomised signatures.
We then use our metric as the loss function in a non-adversarial generator model for synthetic time series data.
- Score: 1.8434042562191815
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Randomised signature has been proposed as a flexible and easily implementable alternative to the well-established path signature. In this article, we employ randomised signature to introduce a generative model for financial time series data in the spirit of reservoir computing. Specifically, we propose a novel Wasserstein-type distance based on discrete-time randomised signatures. This metric on the space of probability measures captures the distance between (conditional) distributions. Its use is justified by our novel universal approximation results for randomised signatures on the space of continuous functions taking the underlying path as an input. We then use our metric as the loss function in a non-adversarial generator model for synthetic time series data based on a reservoir neural stochastic differential equation. We compare the results of our model to benchmarks from the existing literature.
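The discrete-time randomised signature described in the abstract is a reservoir-type recursion driven by the path increments. The following is a minimal numpy sketch of that idea, with assumptions not taken from the paper: tanh activation, Gaussian random matrices, a terminal-state readout, and a simple mean-difference distance standing in for the proposed Wasserstein-type metric.

```python
import numpy as np

def randomised_signature(path, dim_reservoir=64, seed=0):
    """Discrete-time randomised-signature (reservoir) features of a path.

    path: array of shape (N+1, d) -- an observed time series.
    The random matrices A_i, shifts b_i and the start state are drawn once
    and kept fixed (assumed Gaussian here). The recursion is
        r_{k+1} = r_k + sum_i tanh(A_i r_k + b_i) * (x_{k+1,i} - x_{k,i}).
    Returns the terminal reservoir state r_N.
    """
    rng = np.random.default_rng(seed)
    n_steps, d = path.shape[0] - 1, path.shape[1]
    A = rng.standard_normal((d, dim_reservoir, dim_reservoir)) / np.sqrt(dim_reservoir)
    b = rng.standard_normal((d, dim_reservoir))
    r = rng.standard_normal(dim_reservoir)
    for k in range(n_steps):
        dx = path[k + 1] - path[k]  # path increment
        r = r + sum(np.tanh(A[i] @ r + b[i]) * dx[i] for i in range(d))
    return r

def sig_feature_distance(paths_x, paths_y, **kw):
    """Toy signature-based distance between two sets of sample paths:
    the norm of the difference of empirical feature means. The paper's
    actual metric is a Wasserstein-type distance; this is a stand-in."""
    fx = np.mean([randomised_signature(p, **kw) for p in paths_x], axis=0)
    fy = np.mean([randomised_signature(p, **kw) for p in paths_y], axis=0)
    return float(np.linalg.norm(fx - fy))
```

Because the random projections are fixed rather than trained, such a distance can serve directly as a non-adversarial loss: the generator is optimised to make the feature statistics of synthetic paths match those of the data.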
Related papers
- Distributional Diffusion Models with Scoring Rules [83.38210785728994]
Diffusion models generate high-quality synthetic data.
However, generating high-quality outputs requires many discretization steps.
We propose to accomplish sample generation by learning the posterior distribution of clean data samples.
arXiv Detail & Related papers (2025-02-04T16:59:03Z)
- Learning to Forget: Bayesian Time Series Forecasting using Recurrent Sparse Spectrum Signature Gaussian Processes [27.884145970863287]
Signature kernel is a kernel between time series of arbitrary length.
We propose a principled, data-driven approach by introducing a novel forgetting mechanism for signatures.
This allows the model to dynamically adapt its context length to focus on more recent information.
arXiv Detail & Related papers (2024-12-27T16:31:09Z)
- Discrete Flow Matching [74.04153927689313]
We present a novel discrete flow paradigm designed specifically for generating discrete data.
Our approach is capable of generating high-quality discrete data in a non-autoregressive fashion.
arXiv Detail & Related papers (2024-07-22T12:33:27Z)
- Diffusion Forcing: Next-token Prediction Meets Full-Sequence Diffusion [61.03681839276652]
Diffusion Forcing is a new training paradigm where a diffusion model is trained to denoise a set of tokens with independent per-token noise levels.
We apply Diffusion Forcing to sequence generative modeling by training a causal next-token prediction model to generate one or several future tokens.
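The per-token noise idea above can be illustrated with a small numpy sketch of the forward noising step: each token in a sequence is corrupted at its own independently drawn diffusion level. The cosine schedule and continuous Gaussian tokens here are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np

def per_token_noising(tokens, rng=None, n_levels=1000):
    """Forward noising with an independent noise level per token.

    tokens: array (T, d) -- a sequence of continuous token embeddings.
    Returns (noisy_tokens, levels): every token is corrupted at its own
    randomly drawn diffusion time, so a denoiser trained on such data must
    handle sequences whose elements carry different amounts of noise.
    """
    rng = rng or np.random.default_rng()
    T, d = tokens.shape
    levels = rng.integers(0, n_levels, size=T)      # independent level per token
    t = levels / (n_levels - 1)
    alpha_bar = np.cos(0.5 * np.pi * t) ** 2        # assumed cosine schedule
    eps = rng.standard_normal((T, d))
    noisy = (np.sqrt(alpha_bar)[:, None] * tokens
             + np.sqrt(1.0 - alpha_bar)[:, None] * eps)
    return noisy, levels
```

At sampling time this flexibility is what allows future tokens to be kept noisier than past ones, interpolating between next-token prediction and full-sequence diffusion.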
arXiv Detail & Related papers (2024-07-01T15:43:25Z)
- Gaussian processes based data augmentation and expected signature for time series classification [0.0]
We propose a feature extraction model for time series built upon the expected signature.
One of the main features is that an optimal feature extraction is learnt through the supervised task that uses the model.
arXiv Detail & Related papers (2023-10-16T21:18:51Z)
- Generative modeling for time series via Schrödinger bridge [0.0]
We propose a novel generative model for time series based on the Schrödinger bridge (SB) approach.
This consists in the entropic interpolation via optimal transport between a reference probability measure on path space and a target measure consistent with the joint data distribution of the time series.
arXiv Detail & Related papers (2023-04-11T09:45:06Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Wasserstein multivariate auto-regressive models for modeling distributional time series [0.0]
This paper is focused on the statistical analysis of data consisting of a collection of multiple series of probability measures.
By modeling these time-dependent probability measures as random objects in the Wasserstein space, we propose a new auto-regressive model.
Results on the existence, uniqueness and stationarity of the solution of such a model are provided.
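For one-dimensional measures, such an autoregressive model can be sketched concretely: each measure is represented by its quantile function on a fixed grid, in which case the 2-Wasserstein distance is the L2 distance between quantile functions. The linear AR(1) least-squares fit below is an illustrative simplification of that idea, not the paper's multivariate construction.

```python
import numpy as np

def fit_wasserstein_ar1(measures_quantiles):
    """Toy AR(1) model for a time series of 1-D probability measures.

    measures_quantiles: array (T, m) -- each row holds the quantile values
    of one measure on a fixed grid of m probability levels.
    Returns (A, c) such that Q_{t+1} ~ A @ Q_t + c, fitted by least squares.
    """
    Q = np.asarray(measures_quantiles, dtype=float)
    X = np.hstack([Q[:-1], np.ones((Q.shape[0] - 1, 1))])  # regressors + intercept
    coef, *_ = np.linalg.lstsq(X, Q[1:], rcond=None)       # solve X @ coef = Q[1:]
    A, c = coef[:-1].T, coef[-1]
    return A, c
```

A fitted prediction `A @ Q_t + c` need not be monotone, so in practice it would be projected back onto nondecreasing functions (e.g. by sorting) to remain a valid quantile function.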
arXiv Detail & Related papers (2022-07-12T10:18:36Z)
- Modeling Sequences as Distributions with Uncertainty for Sequential Recommendation [63.77513071533095]
Most existing sequential methods assume users are deterministic.
Item-item transitions can fluctuate significantly across item aspects and reflect the randomness of user interests.
We propose a Distribution-based Transformer Sequential Recommendation (DT4SR) which injects uncertainties into sequential modeling.
arXiv Detail & Related papers (2021-06-11T04:35:21Z)
- Generative Semantic Hashing Enhanced via Boltzmann Machines [61.688380278649056]
Existing generative-hashing methods mostly assume a factorized form for the posterior distribution.
We propose to employ the distribution of a Boltzmann machine as the variational posterior.
We show that by effectively modeling correlations among different bits within a hash code, our model can achieve significant performance gains.
arXiv Detail & Related papers (2020-06-16T01:23:39Z)
- Conditional Sig-Wasserstein GANs for Time Series Generation [8.593063679921109]
Generative adversarial networks (GANs) have been extremely successful in generating samples from seemingly high-dimensional probability measures.
These methods struggle to capture the temporal dependence of joint probability distributions induced by time-series data.
Long time-series data streams hugely increase the dimension of the target space, which may render generative modelling infeasible.
We propose a generic conditional Sig-WGAN framework by integrating Wasserstein-GANs with mathematically principled and efficient path feature extraction.
arXiv Detail & Related papers (2020-06-09T17:38:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.