A posteriori Trading-inspired Model-free Time Series Segmentation
- URL: http://arxiv.org/abs/1912.06708v2
- Date: Thu, 9 Nov 2023 09:25:03 GMT
- Title: A posteriori Trading-inspired Model-free Time Series Segmentation
- Authors: Mogens Graf Plessen
- Abstract summary: The proposed method is compared to a popular model-based bottom-up approach fitting piecewise affine models and to a state-of-the-art model-based top-down approach fitting Gaussian models.
Performance is demonstrated on synthetic and real-world data, including a large-scale dataset.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Within the context of multivariate time series segmentation, this paper
proposes a method inspired by a posteriori optimal trading. After a
normalization step, time series are treated channel-wise as surrogate stock
prices that can be traded optimally a posteriori in a virtual portfolio holding
either stock or cash. Linear transaction costs are interpreted as
hyperparameters for noise filtering. The resulting trading signals, together with
the trading signals obtained on the reversed time series, are used for
unsupervised labeling, before a consensus over channels is reached that
determines the segmentation time instants. The method is model-free in that no
model prescriptions for segments are made. Benefits of the proposed approach
include simplicity, adaptability to a wide range of time series shapes, and in
particular a computational efficiency that makes it suitable for
big data. Performance is demonstrated on synthetic and real-world data,
including a large-scale dataset comprising a multivariate time series of
dimension 1000 and length 2709. The proposed method is compared to a popular
model-based bottom-up approach fitting piecewise affine models and to a
state-of-the-art model-based top-down approach fitting Gaussian models, and is
found to be consistently faster while producing more intuitive results.
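How the a posteriori trading step can look in practice (a minimal, hypothetical sketch, not the author's reference code): for one normalized channel, a dynamic program decides at every time instant whether the virtual portfolio is better off holding cash or stock under a linear transaction cost, and the resulting position switches serve as candidate segmentation instants. The function name, variable names, and the simple backtracking scheme below are illustrative assumptions.

```python
import numpy as np

def a_posteriori_switch_times(prices, cost=0.01):
    """Optimal a posteriori trading of one surrogate price channel.

    The virtual portfolio holds either cash or stock; the linear
    transaction cost acts as a noise-filtering hyperparameter
    (larger cost -> fewer trades -> fewer candidate segments).
    Returns the time indices at which the optimal strategy switches
    position, i.e. candidate segmentation instants for this channel.
    """
    T = len(prices)
    cash = np.empty(T)                      # best wealth if in cash at time t
    shares = np.empty(T)                    # best share count if in stock at time t
    sold_here = np.zeros(T, dtype=bool)     # cash state at t reached by selling
    bought_here = np.zeros(T, dtype=bool)   # stock state at t reached by buying

    cash[0] = 1.0
    shares[0] = (1.0 - cost) / prices[0]    # allow an initial buy at t = 0
    for t in range(1, T):
        sell = shares[t - 1] * prices[t] * (1.0 - cost)
        cash[t] = max(cash[t - 1], sell)
        sold_here[t] = sell > cash[t - 1]

        buy = cash[t - 1] * (1.0 - cost) / prices[t]
        shares[t] = max(shares[t - 1], buy)
        bought_here[t] = buy > shares[t - 1]

    # Backtrack from the better terminal state and collect switching times.
    in_stock = shares[-1] * prices[-1] * (1.0 - cost) > cash[-1]
    switches = []
    for t in range(T - 1, 0, -1):
        if in_stock and bought_here[t]:
            switches.append(t)              # optimal strategy bought at t
            in_stock = False
        elif not in_stock and sold_here[t]:
            switches.append(t)              # optimal strategy sold at t
            in_stock = True
    return sorted(switches)
```

In the paper's pipeline this step would be applied per channel to both the original and the time-reversed series for unsupervised labeling, before a consensus over the channels' switching times fixes the final segmentation instants.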
Related papers
- A Financial Time Series Denoiser Based on Diffusion Model [1.5193212081459284]
This paper introduces a novel approach utilizing the diffusion model as a denoiser for financial time series.
Trading signals derived from the denoised data yield more profitable trades with fewer transactions.
arXiv Detail & Related papers (2024-09-02T15:55:36Z) - Sample Enrichment via Temporary Operations on Subsequences for Sequential Recommendation [15.718287580146272]
We propose a novel model-agnostic and highly generic framework for sequential recommendation called sample enrichment via temporary operations on subsequences (SETO).
We highlight our SETO's effectiveness and versatility over multiple representative and state-of-the-art sequential recommendation models across multiple real-world datasets.
arXiv Detail & Related papers (2024-07-25T06:22:08Z) - Predictive Modeling in the Reservoir Kernel Motif Space [0.9217021281095907]
This work proposes a time series prediction method based on the kernel view of linear reservoirs.
We provide a geometric interpretation of our approach, shedding light on how it relates to the core reservoir models.
Empirical experiments then compare the predictive performance of our suggested model with that of recent state-of-the-art transformer-based models.
arXiv Detail & Related papers (2024-05-11T16:12:25Z) - Align Your Steps: Optimizing Sampling Schedules in Diffusion Models [63.927438959502226]
Diffusion models (DMs) have established themselves as the state-of-the-art generative modeling approach in the visual domain and beyond.
A crucial drawback of DMs is their slow sampling speed, relying on many sequential function evaluations through large neural networks.
We propose a general and principled approach to optimizing the sampling schedules of DMs for high-quality outputs.
arXiv Detail & Related papers (2024-04-22T18:18:41Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - TFMQ-DM: Temporal Feature Maintenance Quantization for Diffusion Models [52.454274602380124]
Diffusion models heavily depend on the time-step $t$ to achieve satisfactory multi-round denoising.
We propose a Temporal Feature Maintenance Quantization (TFMQ) framework building upon a Temporal Information Block.
Powered by the pioneering block design, we devise temporal information aware reconstruction (TIAR) and finite set calibration (FSC) to align the full-precision temporal features.
arXiv Detail & Related papers (2023-11-27T12:59:52Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Optimal Latent Space Forecasting for Large Collections of Short Time
Series Using Temporal Matrix Factorization [0.0]
It is a common practice to evaluate multiple methods and choose one of these methods or an ensemble for producing the best forecasts.
We propose a framework for forecasting short high-dimensional time series data by combining low-rank temporal matrix factorization and optimal model selection on latent time series.
arXiv Detail & Related papers (2021-12-15T11:39:21Z) - Visualising Deep Network's Time-Series Representations [93.73198973454944]
Despite the popularisation of machine learning models, more often than not they still operate as black boxes with no insight into what is happening inside the model.
In this paper, a method that addresses that issue is proposed, with a focus on visualising multi-dimensional time-series data.
Experiments on a high-frequency stock market dataset show that the method provides fast and discernible visualisations.
arXiv Detail & Related papers (2021-03-12T09:53:34Z) - Time-Series Imputation with Wasserstein Interpolation for Optimal
Look-Ahead-Bias and Variance Tradeoff [66.59869239999459]
In finance, imputation of missing returns may be applied prior to training a portfolio optimization model.
There is an inherent trade-off between the look-ahead-bias of using the full data set for imputation and the larger variance in the imputation from using only the training data.
We propose a Bayesian posterior consensus distribution which optimally controls the variance and look-ahead-bias trade-off in the imputation.
arXiv Detail & Related papers (2021-02-25T09:05:35Z) - Modeling Financial Time Series using LSTM with Trainable Initial Hidden
States [0.0]
We introduce a novel approach to modeling financial time series using a deep learning model.
We use a Long Short-Term Memory (LSTM) network equipped with the trainable initial hidden states.
arXiv Detail & Related papers (2020-07-14T06:36:10Z)