Interpretable Time Series Autoregression for Periodicity Quantification
- URL: http://arxiv.org/abs/2506.22895v2
- Date: Sun, 13 Jul 2025 21:53:21 GMT
- Title: Interpretable Time Series Autoregression for Periodicity Quantification
- Authors: Xinyu Chen, Vassilis Digalakis Jr, Lijun Ding, Dingyi Zhuang, Jinhua Zhao,
- Abstract summary: Time series autoregression (AR) is a classical tool for modeling auto-correlations and periodic structures in real-world systems. We revisit this model by introducing sparse autoregression (SAR), where $\ell_0$-norm constraints are used to isolate dominant periodicities. We validate our framework on large-scale mobility and climate time series.
- Score: 18.6300875919604
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series autoregression (AR) is a classical tool for modeling auto-correlations and periodic structures in real-world systems. We revisit this model from an interpretable machine learning perspective by introducing sparse autoregression (SAR), where $\ell_0$-norm constraints are used to isolate dominant periodicities. We formulate exact mixed-integer optimization (MIO) approaches for both stationary and non-stationary settings and introduce two scalable extensions: a decision variable pruning (DVP) strategy for temporally-varying SAR (TV-SAR), and a two-stage optimization scheme for spatially- and temporally-varying SAR (STV-SAR). These models enable scalable inference on real-world spatiotemporal datasets. We validate our framework on large-scale mobility and climate time series. On NYC ridesharing data, TV-SAR reveals interpretable daily and weekly cycles as well as long-term shifts due to COVID-19. On climate datasets, STV-SAR uncovers the evolving spatial structure of temperature and precipitation seasonality across four decades in North America and detects global sea surface temperature dynamics, including El Niño. Together, our results demonstrate the interpretability, flexibility, and scalability of sparse autoregression for periodicity quantification in complex time series.
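The core idea of the abstract can be illustrated with a toy sketch: fit an AR model subject to an $\ell_0$ constraint (at most k nonzero lag coefficients), so the selected lag support directly reveals dominant periodicities. The paper solves this exactly via mixed-integer optimization; the snippet below substitutes exhaustive search over small lag subsets, which is only feasible for illustrative problem sizes. The function name `sparse_ar` and all parameters are hypothetical, not from the paper's code.

```python
import itertools
import numpy as np

def sparse_ar(x, max_lag, k):
    """l0-constrained AR fit: search all size-k lag subsets and keep the
    least-squares fit with the smallest residual sum of squares.
    Returns the selected lags (1-indexed) and their coefficients."""
    n = len(x)
    # Lag matrix: column l-1 holds x_{t-l} for t = max_lag .. n-1
    X = np.column_stack([x[max_lag - l : n - l] for l in range(1, max_lag + 1)])
    y = x[max_lag:]
    best_rss, best_subset, best_coef = np.inf, None, None
    for subset in itertools.combinations(range(max_lag), k):
        cols = X[:, subset]
        coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
        rss = np.sum((y - cols @ coef) ** 2)
        if rss < best_rss:
            best_rss, best_subset, best_coef = rss, subset, coef
    lags = [s + 1 for s in best_subset]
    return lags, best_coef

# Toy usage: a noisy signal with period 7 (e.g., a weekly cycle)
rng = np.random.default_rng(0)
t = np.arange(300)
x = np.sin(2 * np.pi * t / 7) + 0.1 * rng.standard_normal(300)
lags, coef = sparse_ar(x, max_lag=10, k=2)
```

The exhaustive search grows combinatorially in `max_lag` and `k`; the MIO formulations and the DVP/two-stage schemes in the paper exist precisely to make this selection tractable at scale.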
Related papers
- CirT: Global Subseasonal-to-Seasonal Forecasting with Geometry-inspired Transformer [47.65152457550307]
We propose the geometry-inspired Circular Transformer (CirT) to model the cyclic characteristic of the graticule. Experiments on the Earth Reanalysis 5 (ERA5) reanalysis dataset demonstrate our model yields a significant improvement over advanced data-driven models.
arXiv Detail & Related papers (2025-02-27T04:26:23Z) - Beyond Fixed Variables: Expanding-variate Time Series Forecasting via Flat Scheme and Spatio-temporal Focal Learning [9.205228068704141]
In real-world applications, Cyber-Physical Systems often expand as new sensors are deployed, increasing the number of variables in MTSF. This task presents unique challenges, specifically (1) handling inconsistent data caused by adding new variables, and (2) addressing imbalanced spatio-temporal learning. To address these challenges, we propose STEV, a flexible spatio-temporal forecasting framework.
arXiv Detail & Related papers (2025-02-21T08:43:26Z) - ClimateLLM: Efficient Weather Forecasting via Frequency-Aware Large Language Models [13.740208247043258]
We propose ClimateLLM, a foundation model for weather forecasting. It captures temporal dependencies via a cross-temporal and cross-spatial collaborative framework. It integrates frequency decomposition with Large Language Models to strengthen spatial and temporal modeling.
arXiv Detail & Related papers (2025-02-16T09:57:50Z) - A Generative Framework for Probabilistic, Spatiotemporally Coherent Downscaling of Climate Simulation [23.504915709396204]
We present a novel generative framework that uses a score-based diffusion model trained on high-resolution reanalysis data to capture the statistical properties of local weather dynamics. We demonstrate that the model generates spatially and temporally coherent weather dynamics that align with global climate output.
arXiv Detail & Related papers (2024-12-19T19:47:35Z) - FATE: Focal-modulated Attention Encoder for Multivariate Time-series Forecasting [0.0]
Climate stands as one of the most pressing global challenges of the twenty-first century, with far-reaching consequences such as rising sea levels, melting glaciers, and increasingly extreme weather patterns. Accurate forecasting is critical for monitoring these phenomena and supporting mitigation strategies. Recent data-driven models for time-series forecasting, including CNNs, RNNs, and attention-based transformers, have shown promise but struggle with long-range dependencies and limited parallelization. In this work, we present the Focal-modulated Attention Encoder (FATE) for reliable time-series forecasting.
arXiv Detail & Related papers (2024-08-21T04:40:18Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z) - Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z) - Flexible and efficient emulation of spatial extremes processes via variational autoencoders [9.09823450442456]
We integrate a new spatial extremes model that has flexible and non-stationary dependence properties in the encoding-decoding structure of a variational autoencoder called the XVAE. XVAE can emulate spatial observations and produce outputs that have the same statistical properties as the inputs, especially in the tail. We analyze a high-resolution satellite-derived dataset of sea surface temperature in the Red Sea, which includes 30 years of daily measurements at 16,703 grid cells.
arXiv Detail & Related papers (2023-07-16T15:31:32Z) - ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z) - Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series data are often recorded over a short time period, creating a large gap between the capacity of deep models and the limited, noisy observations available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - Discovering Dynamic Patterns from Spatiotemporal Data with Time-Varying Low-Rank Autoregression [12.923271427789267]
We develop a time-varying reduced-rank vector autoregression model whose coefficients are parameterized by low-rank tensor factorization.
In the temporal context, the complex time-varying system behaviors can be revealed by the temporal modes in the proposed model.
arXiv Detail & Related papers (2022-11-28T15:59:52Z) - Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock market involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA). Our proposed model reduces computational complexity while achieving performance comparable to or better than existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Scalable Spatiotemporally Varying Coefficient Modelling with Bayesian Kernelized Tensor Regression [17.158289775348063]
Bayesian Kernelized Tensor Regression (BKTR) can be considered a new and scalable approach to modeling processes with a low-rank spatiotemporal structure.
We conduct extensive experiments on both synthetic and real-world data sets, and our results confirm the superior performance and efficiency of BKTR for model estimation and inference.
arXiv Detail & Related papers (2021-08-31T19:22:23Z) - Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove noise from the time series.
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Deep Switching Auto-Regressive Factorization: Application to Time Series Forecasting [16.934920617960085]
DSARF approximates high-dimensional data by a product of time-dependent weights and spatially dependent factors.
DSARF is different from the state-of-the-art techniques in that it parameterizes the weights in terms of a deep switching vector auto-regressive factorization.
Our experiments attest to the superior performance of DSARF in terms of long- and short-term prediction error when compared with state-of-the-art methods.
arXiv Detail & Related papers (2020-09-10T20:15:59Z) - Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that it improves over the state-of-the-art for standard metrics on many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.