TILDE-Q: A Transformation Invariant Loss Function for Time-Series
Forecasting
- URL: http://arxiv.org/abs/2210.15050v2
- Date: Wed, 13 Mar 2024 01:31:24 GMT
- Title: TILDE-Q: A Transformation Invariant Loss Function for Time-Series
Forecasting
- Authors: Hyunwook Lee, Chunggi Lee, Hongkyu Lim, Sungahn Ko
- Abstract summary: Time-series forecasting can address real-world problems across various domains, including energy, weather, traffic, and economy.
Although time-series forecasting is a well-researched field, predicting complex temporal patterns such as sudden changes in sequential data still poses a challenge for current models.
We propose a novel, compact loss function called TILDE-Q that considers not only amplitude and phase distortions but also allows models to capture the shape of time-series sequences.
- Score: 8.086595073181604
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time-series forecasting has gained increasing attention in the field of
artificial intelligence due to its potential to address real-world problems
across various domains, including energy, weather, traffic, and economy. While
time-series forecasting is a well-researched field, predicting complex temporal
patterns such as sudden changes in sequential data still poses a challenge for
current models. This difficulty stems from the use of Lp-norm distances, such as
mean absolute error (MAE) or mean square error (MSE), as loss functions, which
struggle both to model intricate temporal dynamics and to capture signal shape.
Furthermore, these loss functions often cause models to behave aberrantly and
generate results that are uncorrelated with the original time-series. Consequently,
developing a shape-aware loss function that goes beyond mere point-wise
comparison is essential. In this paper, we examine the definition of shape and
distortions, which are crucial for shape-awareness in time-series forecasting,
and provide a design rationale for the shape-aware loss function. Based on our
design rationale, we propose a novel, compact loss function called TILDE-Q
(Transformation Invariant Loss function with Distance EQuilibrium) that
considers not only amplitude and phase distortions but also allows models to
capture the shape of time-series sequences. Furthermore, TILDE-Q supports the
simultaneous modeling of periodic and nonperiodic temporal dynamics. We
evaluate the efficacy of TILDE-Q by conducting extensive experiments under both
periodic and nonperiodic conditions with various models ranging from naive to
state-of-the-art. The experimental results show that the models trained with
TILDE-Q surpass those trained with other metrics, such as MSE and DILATE, in
various real-world applications, including electricity, traffic, illness,
economics, weather, and electricity transformer temperature (ETT).
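
As a rough illustration of the abstract's argument for moving beyond point-wise Lp losses, the sketch below mixes an ordinary MSE term with a shape term based on the cosine similarity of mean-centered sequences. This is not the actual TILDE-Q formulation (the paper defines its own amplitude- and phase-invariant terms); the function name toy_shape_aware_loss and the weight alpha are illustrative assumptions.

```python
# Illustrative sketch only (NOT the TILDE-Q loss from the paper): a toy
# shape-aware objective that mixes a point-wise MSE term with a term that is
# invariant to constant offsets and overall scale.
import torch
import torch.nn.functional as F


def toy_shape_aware_loss(pred: torch.Tensor,
                         target: torch.Tensor,
                         alpha: float = 0.5) -> torch.Tensor:
    """pred, target: (batch, horizon) forecast and ground truth."""
    # Point-wise term: ordinary MSE, sensitive to amplitude errors.
    pointwise = F.mse_loss(pred, target)

    # Shape term: one minus the cosine similarity of the mean-centered
    # sequences. Centering removes constant offsets and the cosine removes
    # overall scale, so this term rewards matching the signal's shape.
    p = pred - pred.mean(dim=-1, keepdim=True)
    t = target - target.mean(dim=-1, keepdim=True)
    shape = 1.0 - F.cosine_similarity(p, t, dim=-1).mean()

    return alpha * pointwise + (1.0 - alpha) * shape


if __name__ == "__main__":
    target = torch.sin(torch.linspace(0.0, 6.28, 96)).unsqueeze(0)
    shifted = target + 0.5            # right shape, wrong level
    flat = torch.zeros_like(target)   # wrong shape
    print(toy_shape_aware_loss(shifted, target))  # small shape penalty
    print(toy_shape_aware_loss(flat, target))     # large shape penalty
```

In the toy example, a forecast that tracks the target's sinusoidal shape but is vertically shifted receives a much smaller shape penalty than a flat forecast, which is the kind of behavior a shape-aware loss is meant to reward; a plain MSE objective makes no such distinction.
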
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- TimeDiT: General-purpose Diffusion Transformers for Time Series Foundation Model [11.281386703572842]
TimeDiT is a diffusion transformer model that combines temporal dependency learning with probabilistic sampling.
TimeDiT employs a unified masking mechanism to harmonize the training and inference process across diverse tasks.
Our systematic evaluation demonstrates TimeDiT's effectiveness on fundamental tasks, i.e., forecasting and imputation, in both zero-shot and fine-tuning settings.
arXiv Detail & Related papers (2024-09-03T22:31:57Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- ContiFormer: Continuous-Time Transformer for Irregular Time Series Modeling [30.12824131306359]
Modeling continuous-time dynamics on irregular time series is critical to account for data evolution and correlations that occur continuously.
Traditional methods, including recurrent neural networks and Transformer models, leverage inductive bias via powerful neural architectures to capture complex patterns.
We propose ContiFormer that extends the relation modeling of vanilla Transformer to the continuous-time domain.
arXiv Detail & Related papers (2024-02-16T12:34:38Z)
- Real-time Inference and Extrapolation via a Diffusion-inspired Temporal Transformer Operator (DiTTO) [1.5728609542259502]
We propose an operator learning method to solve time-dependent partial differential equations (PDEs) continuously and with extrapolation in time without any temporal discretization.
The proposed method, named Diffusion-inspired Temporal Transformer Operator (DiTTO), is inspired by latent diffusion models and their conditioning mechanism.
We demonstrate its extrapolation capability on a climate problem by estimating the temperature around the globe for several years, and also in modeling hypersonic flows around a double-cone.
arXiv Detail & Related papers (2023-07-18T08:45:54Z)
- A Neural PDE Solver with Temporal Stencil Modeling [44.97241931708181]
Recent Machine Learning (ML) models have shown new promise in capturing important dynamics in high-resolution signals.
This study shows that significant information is often lost in the low-resolution down-sampled features.
We propose a new approach, which combines the strengths of advanced time-series sequence modeling and state-of-the-art neural PDE solvers.
arXiv Detail & Related papers (2023-02-16T06:13:01Z)
- Deep Convolutional Architectures for Extrapolative Forecast in Time-dependent Flow Problems [0.0]
Deep learning techniques are employed to model the system dynamics for advection dominated problems.
These models take as input a sequence of high-fidelity vector solutions for consecutive time-steps obtained from the PDEs.
Non-intrusive reduced-order modelling techniques such as deep auto-encoder networks are utilized to compress the high-fidelity snapshots.
arXiv Detail & Related papers (2022-09-18T03:45:56Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Time-Reversal Symmetric ODE Network [138.02741983098454]
Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well our ordinary differential equation (ODE) networks comply with this time-reversal symmetry.
We show that, even for systems that do not possess the full time-reversal symmetry, TRS-ODENs can achieve better predictive performances over baselines.
arXiv Detail & Related papers (2020-07-22T12:19:40Z)