Learning to Forget: Bayesian Time Series Forecasting using Recurrent Sparse Spectrum Signature Gaussian Processes
- URL: http://arxiv.org/abs/2412.19727v1
- Date: Fri, 27 Dec 2024 16:31:09 GMT
- Title: Learning to Forget: Bayesian Time Series Forecasting using Recurrent Sparse Spectrum Signature Gaussian Processes
- Authors: Csaba Tóth, Masaki Adachi, Michael A. Osborne, Harald Oberhauser
- Abstract summary: The signature kernel is a kernel between time series of arbitrary length.
We propose a principled, data-driven approach by introducing a novel forgetting mechanism for signatures.
This allows the model to dynamically adapt its context length to focus on more recent information.
- Abstract: The signature kernel is a kernel between time series of arbitrary length and comes with strong theoretical guarantees from stochastic analysis. It has found applications in machine learning such as covariance functions for Gaussian processes. A strength of the underlying signature features is that they provide a structured global description of a time series. However, this property can quickly become a curse when local information is essential and forgetting is required; so far this has only been addressed with ad-hoc methods such as slicing the time series into subsegments. To overcome this, we propose a principled, data-driven approach by introducing a novel forgetting mechanism for signatures. This allows the model to dynamically adapt its context length to focus on more recent information. To achieve this, we revisit the recently introduced Random Fourier Signature Features, and develop Random Fourier Decayed Signature Features (RFDSF) with Gaussian processes (GPs). This results in a Bayesian time series forecasting algorithm with variational inference, that offers a scalable probabilistic algorithm that processes and transforms a time series into a joint predictive distribution over time steps in one pass using recurrence. For example, processing a sequence of length $10^4$ steps in $\approx 10^{-2}$ seconds and in $< 1\text{GB}$ of GPU memory. We demonstrate that it outperforms other GP-based alternatives and competes with state-of-the-art probabilistic time series forecasting algorithms.
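The abstract describes a one-pass recurrence that lifts each time step with Random Fourier Features and folds it into signature-style features, with a decay factor implementing forgetting. The following is a minimal sketch of that idea, not the authors' actual RFDSF construction: the function names, the fixed scalar `decay`, and the use of an elementwise product in place of the tensor product (a diagonal projection, to keep the sketch small) are all illustrative assumptions.

```python
import numpy as np

def random_fourier_features(x, W, b):
    """RFF map phi(x) = sqrt(2/D) * cos(W @ x + b); inner products of these
    features approximate an RBF kernel (Rahimi & Recht)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

def decayed_signature_features(X, W, b, depth=2, decay=0.9):
    """One-pass recurrence over a time series X of shape (T, d).

    Level m is updated as S_m <- decay * S_m + S_{m-1} * inc: the classic
    signature recurrence with an extra damping factor. Setting decay < 1
    exponentially forgets old increments, so the effective context length
    shrinks; decay = 1 recovers global signature-style features."""
    D = W.shape[0]
    levels = [np.zeros(D) for _ in range(depth)]
    prev = random_fourier_features(X[0], W, b)
    for t in range(1, len(X)):
        cur = random_fourier_features(X[t], W, b)
        inc = cur - prev                  # increment of the lifted path
        lower = np.ones(D)                # level-0 signature is the constant 1
        new_levels = []
        for m in range(depth):
            new_levels.append(decay * levels[m] + lower * inc)
            lower = levels[m]             # old level feeds the next level's update
        levels = new_levels
        prev = cur
    return np.concatenate(levels)         # shape (depth * D,)
```

A feature vector produced this way could then serve as the input representation for a downstream GP regression layer; in the paper the decay is learned rather than fixed.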
Related papers
- Inference-Time Alignment in Diffusion Models with Reward-Guided Generation: Tutorial and Review [59.856222854472605]
This tutorial provides an in-depth guide on inference-time guidance and alignment methods for optimizing downstream reward functions in diffusion models.
Practical applications in fields such as biology often require sample generation that maximizes specific metrics.
We discuss (1) fine-tuning methods combined with inference-time techniques, (2) inference-time algorithms based on search algorithms such as Monte Carlo tree search, and (3) connections between inference-time algorithms in language models and diffusion models.
arXiv Detail & Related papers (2025-01-16T17:37:35Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - Universal randomised signatures for generative time series modelling [1.8434042562191815]
We employ randomised signature to introduce a generative model for financial time series data.
Specifically, we propose a novel Wasserstein-type distance based on discrete-time randomised signatures.
We then use our metric as the loss function in a non-adversarial generator model for synthetic time series data.
arXiv Detail & Related papers (2024-06-14T17:49:29Z) - Random Fourier Signature Features [8.766411351797885]
Tensor algebras give rise to one of the most powerful measures of similarity for sequences of arbitrary length, called the signature kernel.
Previous algorithms to compute the signature kernel scale quadratically in both the length and the number of sequences.
We develop a random Fourier feature-based acceleration of the signature kernel acting on the inherently non-Euclidean domain of sequences.
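The acceleration above rests on the standard random Fourier feature trick: by Bochner's theorem, a shift-invariant kernel is the expectation of cosine features with randomly sampled frequencies, so a finite sample gives a Monte Carlo approximation. A minimal illustration for the RBF kernel (the specific bandwidth and feature count here are arbitrary choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
d, D = 2, 20000          # input dimension, number of random features

x = rng.standard_normal(d)
y = rng.standard_normal(d)

# Bochner's theorem: the unit-bandwidth RBF kernel is the Fourier transform
# of a standard Gaussian, so frequencies W ~ N(0, I) give an unbiased map.
W = rng.standard_normal((D, d))
b = rng.uniform(0.0, 2.0 * np.pi, D)
phi = lambda v: np.sqrt(2.0 / D) * np.cos(W @ v + b)

approx = phi(x) @ phi(y)                        # Monte Carlo estimate
exact = np.exp(-0.5 * np.sum((x - y) ** 2))     # k(x, y) = exp(-||x-y||^2 / 2)
# |approx - exact| shrinks as O(1/sqrt(D))
```

Replacing kernel evaluations with such finite feature maps is what turns the quadratic-cost signature kernel into a feature-based computation that is linear in sequence length.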
arXiv Detail & Related papers (2023-11-20T22:08:17Z) - Gaussian processes based data augmentation and expected signature for time series classification [0.0]
We propose a feature extraction model for time series built upon the expected signature.
One of the main features is that an optimal feature extraction is learnt through the supervised task that uses the model.
arXiv Detail & Related papers (2023-10-16T21:18:51Z) - BayOTIDE: Bayesian Online Multivariate Time series Imputation with functional decomposition [31.096125530322933]
In real-world scenarios like traffic and energy, massive time-series data with missing values and noise are widely observed, and are often sampled irregularly.
While many imputation methods have been proposed, most of them work with a local horizon.
Almost all methods assume that observations are sampled at regular time stamps, and fail to handle irregularly sampled time series.
arXiv Detail & Related papers (2023-08-28T21:17:12Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Complex Event Forecasting with Prediction Suffix Trees: Extended Technical Report [70.7321040534471]
Complex Event Recognition (CER) systems have become popular in the past two decades due to their ability to "instantly" detect patterns on real-time streams of events.
There is a lack of methods for forecasting when a pattern might occur before such an occurrence is actually detected by a CER engine.
We present a formal framework that attempts to address the issue of Complex Event Forecasting.
arXiv Detail & Related papers (2021-09-01T09:52:31Z) - A Polynomial Time Algorithm for Learning Halfspaces with Tsybakov Noise [55.45544638410858]
We study the problem of PAC learning homogeneous halfspaces in the presence of Tsybakov noise.
Our algorithm learns the true halfspace within any accuracy $\epsilon$.
arXiv Detail & Related papers (2020-10-04T22:19:06Z) - Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z) - Scalable Hybrid HMM with Gaussian Process Emission for Sequential Time-series Data Clustering [13.845932997326571]
Hidden Markov Model (HMM) combined with Gaussian Process (GP) emission can be effectively used to estimate the hidden state.
This paper proposes a scalable learning method for HMM-GPSM.
arXiv Detail & Related papers (2020-01-07T07:28:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.