Patch-wise Structural Loss for Time Series Forecasting
- URL: http://arxiv.org/abs/2503.00877v1
- Date: Sun, 02 Mar 2025 12:36:15 GMT
- Title: Patch-wise Structural Loss for Time Series Forecasting
- Authors: Dilfira Kudrat, Zongxia Xie, Yanru Sun, Tianyu Jia, Qinghua Hu
- Abstract summary: Patch-wise Structural (PS) loss is designed to enhance structural alignment by comparing time series at the patch level. PS loss captures nuanced structural discrepancies overlooked by traditional point-wise losses. It integrates seamlessly with point-wise loss, simultaneously addressing local structural inconsistencies and individual time-step errors.
- Score: 25.291749176117662
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time-series forecasting has gained significant attention in machine learning due to its crucial role in various domains. However, most existing forecasting models rely heavily on point-wise loss functions like Mean Square Error, which treat each time step independently and neglect the structural dependencies inherent in time series data, making it challenging to capture complex temporal patterns accurately. To address these challenges, we propose a novel Patch-wise Structural (PS) loss, designed to enhance structural alignment by comparing time series at the patch level. By leveraging local statistical properties such as correlation, variance, and mean, PS loss captures nuanced structural discrepancies overlooked by traditional point-wise losses. Furthermore, it integrates seamlessly with point-wise loss, simultaneously addressing local structural inconsistencies and individual time-step errors. PS loss establishes a novel benchmark for accurately modeling complex time series data and provides a new perspective on time series loss function design. Extensive experiments demonstrate that PS loss significantly improves the performance of state-of-the-art models across diverse real-world datasets.
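As a rough illustration of the idea, here is a minimal sketch that combines point-wise MSE with patch-level penalties on the three statistics the abstract names (mean, variance, correlation). The patch length, the weighting `alpha`, and the exact penalty forms are illustrative assumptions, not the paper's published formulation.

```python
import torch
import torch.nn.functional as F

def ps_loss_sketch(pred, target, patch_len=16, alpha=1.0):
    """Point-wise MSE plus a patch-wise structural penalty.

    Rough sketch of the abstract's idea: split the horizon into
    non-overlapping patches and compare local mean, variance, and
    correlation between prediction and ground truth. `patch_len`
    and `alpha` are illustrative choices, not the paper's.

    pred, target: (batch, horizon) tensors.
    """
    b, h = pred.shape
    n = h // patch_len                      # drop a ragged tail patch, if any
    p = pred[:, : n * patch_len].reshape(b, n, patch_len)
    t = target[:, : n * patch_len].reshape(b, n, patch_len)

    # Local first- and second-order statistics per patch.
    mean_err = (p.mean(-1) - t.mean(-1)).pow(2).mean()
    var_err = (p.var(-1, unbiased=False) - t.var(-1, unbiased=False)).pow(2).mean()

    # Patch-wise Pearson correlation between prediction and target.
    pc = p - p.mean(-1, keepdim=True)
    tc = t - t.mean(-1, keepdim=True)
    corr = (pc * tc).sum(-1) / (
        pc.pow(2).sum(-1).sqrt() * tc.pow(2).sum(-1).sqrt() + 1e-8
    )
    corr_err = (1.0 - corr).mean()          # 0 when patch shapes match exactly

    structural = mean_err + var_err + corr_err
    return F.mse_loss(pred, target) + alpha * structural
```

The correlation term rewards matching the local shape of the series even when point-wise magnitudes differ, which is exactly the kind of structural discrepancy the abstract says point-wise losses miss.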
Related papers
- Non-Stationary Time Series Forecasting Based on Fourier Analysis and Cross Attention Mechanism [5.480591342227219]
This paper proposes a new framework, AEFIN, which enhances information sharing between stable and unstable components. We also design a new loss function that combines time-domain stability constraints, time-domain instability constraints, and frequency-domain stability constraints to improve the accuracy and robustness of forecasting. Experimental results show that AEFIN outperforms the most common models in terms of mean square error and mean absolute error, especially under non-stationary data conditions.
arXiv Detail & Related papers (2025-05-11T09:34:36Z)
- Powerformer: A Transformer with Weighted Causal Attention for Time-series Forecasting [50.298817606660826]
We introduce Powerformer, a novel Transformer variant that replaces noncausal attention weights with causal weights that are reweighted according to a smooth heavy-tailed decay. Our empirical results demonstrate that Powerformer achieves state-of-the-art accuracy on public time-series benchmarks. Our analyses show that the model's locality bias is amplified during training, demonstrating an interplay between time-series data and power-law-based attention.
arXiv Detail & Related papers (2025-02-10T04:42:11Z)
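As a toy illustration of the Powerformer entry above: causal attention whose weights are reweighted by a smooth heavy-tailed decay in the query-key lag. The `(1 + lag)^-p` form and the renormalization step are assumptions for illustration, not the paper's exact parameterization.

```python
import torch

def powerlaw_causal_attention(q, k, v, p=1.0):
    """Causal attention with a heavy-tailed decay over time lags.

    Toy sketch: standard causal attention weights are multiplied by a
    smooth power-law decay in the lag between query and key positions.
    The (1 + lag)^-p form is an illustrative assumption.

    q, k, v: (batch, seq, dim) tensors.
    """
    b, s, d = q.shape
    logits = q @ k.transpose(-2, -1) / d**0.5          # (b, s, s)

    pos = torch.arange(s)
    lag = (pos[:, None] - pos[None, :]).clamp(min=0)   # lag >= 0 on the causal side
    decay = (1.0 + lag.float()).pow(-p)                # heavy-tailed weight per lag

    causal = pos[None, :] <= pos[:, None]              # key index <= query index
    logits = logits.masked_fill(~causal, float("-inf"))
    weights = torch.softmax(logits, dim=-1) * decay    # reweight, then renormalize
    weights = weights / weights.sum(-1, keepdim=True)
    return weights @ v
```

Larger `p` sharpens the locality bias the summary mentions; `p = 0` recovers plain causal attention.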
- BRATI: Bidirectional Recurrent Attention for Time-Series Imputation [0.14999444543328289]
Missing data in time-series analysis poses significant challenges, affecting the reliability of downstream applications. This paper introduces BRATI, a novel deep-learning model designed to address multivariate time-series imputation. BRATI processes temporal dependencies and feature correlations across long and short time horizons, utilizing two imputation blocks that operate in opposite temporal directions.
arXiv Detail & Related papers (2025-01-09T17:50:56Z)
- Temporal Feature Matters: A Framework for Diffusion Model Quantization [105.3033493564844]
Diffusion models rely on the time-step for multi-round denoising.
We introduce a novel quantization framework that includes three strategies.
This framework preserves most of the temporal information and ensures high-quality end-to-end generation.
arXiv Detail & Related papers (2024-07-28T17:46:15Z)
- Learning Graph Structures and Uncertainty for Accurate and Calibrated Time-series Forecasting [65.40983982856056]
We introduce STOIC, which leverages correlations between time series to learn their underlying structure and to provide well-calibrated, accurate forecasts.
Across a wide range of benchmark datasets, STOIC provides 16% more accurate and better-calibrated forecasts.
arXiv Detail & Related papers (2024-07-02T20:14:32Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- U-Mixer: An Unet-Mixer Architecture with Stationarity Correction for Time Series Forecasting [11.55346291812749]
Non-stationarity in time series forecasting obstructs stable feature propagation through deep layers, disrupts feature distributions, and complicates learning of changes in the data distribution.
We propose U-Mixer, which captures local temporal dependencies between different patches and channels separately.
We show that U-Mixer achieves 14.5% and 7.7% improvements over state-of-the-art (SOTA) methods.
arXiv Detail & Related papers (2024-01-04T12:41:40Z)
- DRAformer: Differentially Reconstructed Attention Transformer for Time-Series Forecasting [7.805077630467324]
Time-series forecasting plays an important role in many real-world scenarios, such as equipment life cycle forecasting, weather forecasting, and traffic flow forecasting.
Recent research shows that a variety of transformer-based models achieve remarkable results in time-series forecasting.
However, there are still some issues that limit the ability of transformer-based models on time-series forecasting tasks.
arXiv Detail & Related papers (2022-06-11T10:34:29Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Time Series Domain Adaptation via Sparse Associative Structure Alignment [29.003081310633323]
We propose a novel sparse associative structure alignment model for domain adaptation.
First, we generate the segment set to overcome the obstacle of offsets.
Second, intra-variable and inter-variable sparse attention mechanisms are devised to extract the associative structure of time-series data.
Third, associative structure alignment is used to guide the transfer of knowledge from the source domain to the target domain.
arXiv Detail & Related papers (2020-12-22T02:30:40Z)
- Spatiotemporal Attention for Multivariate Time Series Prediction and Interpretation [17.568599402858037]
We propose a spatiotemporal attention mechanism (STAM) for simultaneous learning of the most important time steps and variables.
Results: STAM maintains state-of-the-art prediction accuracy while offering the benefit of accurate interpretability.
arXiv Detail & Related papers (2020-08-11T17:34:55Z)
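As a generic reading of the STAM entry above, the following sketch learns separate softmax weightings over time steps and over variables, so both can be inspected as importance scores. The layer sizes and single-layer scoring functions are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class DualAttentionSketch(nn.Module):
    """Sketch of spatiotemporal attention for interpretability.

    Learns one softmax weighting over time steps and one over input
    variables, so both can be read off as importance scores. This is
    a generic reading of the STAM summary, not its exact design.
    """

    def __init__(self, num_vars, seq_len, hidden=32):
        super().__init__()
        self.time_score = nn.Linear(num_vars, 1)   # scores each time step
        self.var_score = nn.Linear(seq_len, 1)     # scores each variable
        self.head = nn.Linear(num_vars, 1)

    def forward(self, x):
        # x: (batch, seq_len, num_vars)
        time_w = torch.softmax(self.time_score(x).squeeze(-1), dim=-1)                # (b, seq)
        var_w = torch.softmax(self.var_score(x.transpose(1, 2)).squeeze(-1), dim=-1)  # (b, vars)
        context = (x * time_w.unsqueeze(-1) * var_w.unsqueeze(1)).sum(1)              # (b, vars)
        return self.head(context), time_w, var_w
```

Returning the two weight vectors alongside the prediction is what makes the attention directly interpretable: each entry says how much a time step or a variable contributed.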
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example in which THP achieves improved prediction performance when learning multiple point processes by incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
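As a loose sketch of the general shape of such a model: a causal self-attention encoder over the event history followed by a softplus link to keep the conditional intensity positive. The layer sizes and this particular intensity head are common point-process choices for illustration, not necessarily THP's exact formulation.

```python
import torch
import torch.nn as nn

class TinyAttentiveHawkes(nn.Module):
    """Sketch of a self-attentive point-process intensity model.

    Encodes the event history (inter-event times + event-type
    embeddings) with causal self-attention, then maps each hidden
    state to a positive conditional intensity via softplus. Sizes and
    the softplus link are illustrative, not necessarily THP's design.
    """

    def __init__(self, num_types, dim=32, heads=4):
        super().__init__()
        self.embed = nn.Embedding(num_types, dim)
        self.time_proj = nn.Linear(1, dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.intensity = nn.Linear(dim, num_types)

    def forward(self, event_types, inter_times):
        # event_types: (batch, seq) long; inter_times: (batch, seq) float.
        x = self.embed(event_types) + self.time_proj(inter_times.unsqueeze(-1))
        s = x.size(1)
        causal = torch.triu(torch.ones(s, s, dtype=torch.bool), diagonal=1)
        h, _ = self.attn(x, x, x, attn_mask=causal)
        # Positive intensity per event type at each history position.
        return nn.functional.softplus(self.intensity(h))
```

Training such a model typically maximizes the point-process log-likelihood, which trades off the intensity at observed events against its integral between events.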