Koopa: Learning Non-stationary Time Series Dynamics with Koopman
Predictors
- URL: http://arxiv.org/abs/2305.18803v2
- Date: Wed, 18 Oct 2023 14:33:12 GMT
- Title: Koopa: Learning Non-stationary Time Series Dynamics with Koopman
Predictors
- Authors: Yong Liu, Chenyu Li, Jianmin Wang, Mingsheng Long
- Abstract summary: Real-world time series are characterized by intrinsic non-stationarity that poses a principal challenge for deep forecasting models.
We tackle non-stationary time series with modern Koopman theory that fundamentally considers the underlying time-variant dynamics.
We propose Koopa as a novel Koopman forecaster composed of stackable blocks that learn hierarchical dynamics.
- Score: 85.22004745984253
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Real-world time series are characterized by intrinsic non-stationarity that
poses a principal challenge for deep forecasting models. While previous models
suffer from complicated series variations induced by changing temporal
distribution, we tackle non-stationary time series with modern Koopman theory
that fundamentally considers the underlying time-variant dynamics. Inspired by
Koopman theory's portrayal of complex dynamical systems, we disentangle
time-variant and time-invariant components from intricate non-stationary series
with a Fourier Filter and design Koopman Predictors to advance the respective
dynamics forward. Technically, we propose Koopa as a novel Koopman forecaster composed
of stackable blocks that learn hierarchical dynamics. Koopa seeks measurement
functions for Koopman embedding and utilizes Koopman operators as linear
portraits of implicit transitions. To cope with time-variant dynamics that
exhibit strong locality, Koopa calculates context-aware operators within the
temporal neighborhood and can utilize incoming ground truth to extend the
forecast horizon. Moreover, by integrating Koopman Predictors into a deep residual
structure, we remove the reconstruction loss that constrains previous Koopman
forecasters and optimize the forecasting objective end to end. Compared
with the state-of-the-art model, Koopa achieves competitive performance while
saving 77.3% training time and 76.0% memory.
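The pipeline described above, a Fourier filter that splits the series into time-invariant and time-variant components plus linear Koopman operators that advance embeddings forward, can be sketched in a few lines. This is a toy illustration only: the function names are hypothetical, the operator is fit with a DMD-style least-squares solve, and the paper itself learns measurement functions with neural networks rather than working on raw values.

```python
import numpy as np

def fourier_filter(series, cutoff):
    """Split a 1-D series into a time-invariant (low-frequency) part and a
    time-variant (high-frequency) remainder via an rFFT mask.
    `cutoff` is the number of low frequencies kept as invariant."""
    spec = np.fft.rfft(series)
    low = spec.copy()
    low[cutoff:] = 0.0
    invariant = np.fft.irfft(low, n=len(series))
    variant = series - invariant
    return invariant, variant

def fit_koopman_operator(embeddings):
    """DMD-style least-squares fit of a linear operator K with
    z_{t+1} ~= K @ z_t, given embeddings of shape (T, d)."""
    X, Y = embeddings[:-1].T, embeddings[1:].T  # each (d, T-1)
    return Y @ np.linalg.pinv(X)

def rollout(K, z0, steps):
    """Advance an embedding linearly for `steps` steps."""
    zs, z = [], z0
    for _ in range(steps):
        z = K @ z
        zs.append(z)
    return np.stack(zs)
```

Fitting the operator per local window, rather than once globally, is what the abstract calls a context-aware operator in the temporal neighborhood.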
Related papers
- Sundial: A Family of Highly Capable Time Series Foundation Models [64.6322079384575]
We introduce Sundial, a family of native, flexible, and scalable time series foundation models.
Our model is pre-trained without specifying any prior distribution and can generate multiple probable predictions.
By mitigating mode collapse through TimeFlow Loss, we pre-train a family of Sundial models on TimeBench, which exhibit unprecedented model capacity and generalization performance.
arXiv Detail & Related papers (2025-02-02T14:52:50Z)
- Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization [74.3339999119713]
We develop a wavelet-based tokenizer that allows models to learn complex representations directly in the space of time-localized frequencies.
Our method first scales and decomposes the input time series, then thresholds and quantizes the wavelet coefficients, and finally pre-trains an autoregressive model to forecast coefficients for the forecast horizon.
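The scale, decompose, threshold, and quantize recipe above can be illustrated with a one-level Haar transform. Everything here is a hedged toy: the function name, bin count, and threshold are made up for the sketch, and the actual tokenizer in the paper is considerably more elaborate.

```python
import numpy as np

def haar_tokenize(x, n_bins=16, thresh=0.1):
    """Toy wavelet tokenizer: one Haar level, then threshold small
    coefficients to zero, then quantize into integer tokens.
    Assumes len(x) is even."""
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass averages
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass differences
    coeffs = np.concatenate([approx, detail])
    coeffs[np.abs(coeffs) < thresh] = 0.0       # sparsify
    lo, hi = coeffs.min(), coeffs.max()
    scaled = (coeffs - lo) / (hi - lo + 1e-12) * n_bins
    return np.minimum(scaled, n_bins - 1).astype(int)
```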
arXiv Detail & Related papers (2024-12-06T18:22:59Z)
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Temporally-Consistent Koopman Autoencoders for Forecasting Dynamical Systems [38.36312939874359]
We introduce the Temporally-Consistent Koopman Autoencoder (tcKAE).
tcKAE generates accurate long-term predictions even with limited and noisy training data.
We demonstrate tcKAE's superior performance over state-of-the-art KAE models across a variety of test cases.
arXiv Detail & Related papers (2024-03-19T00:48:25Z)
- Koopman Learning with Episodic Memory [9.841748637412596]
Koopman operator theory has found success in learning models of complex, real-world dynamical systems.
We equip Koopman methods with an episodic memory mechanism, enabling global recall of (or attention to) periods in time where similar dynamics previously occurred.
We find that a basic implementation of Koopman learning with episodic memory leads to significant improvements in prediction on synthetic and real-world data.
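The episodic recall mechanism described above, retrieving past periods where similar dynamics occurred, can be sketched as a nearest-neighbor search over historical windows. The function name and the cosine-similarity criterion are illustrative assumptions, not the paper's actual mechanism.

```python
import numpy as np

def recall_similar_window(history, query, window):
    """Toy episodic recall: return the start index of the past window
    most similar (by cosine similarity) to the current query window."""
    q = query / (np.linalg.norm(query) + 1e-12)
    best_i, best_s = 0, -np.inf
    for i in range(len(history) - window + 1):
        w = history[i:i + window]
        s = (w / (np.linalg.norm(w) + 1e-12)) @ q
        if s > best_s:
            best_i, best_s = i, s
    return best_i
```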
arXiv Detail & Related papers (2023-11-21T13:59:00Z)
- Koopman Invertible Autoencoder: Leveraging Forward and Backward Dynamics for Temporal Modeling [13.38194491846739]
We propose a novel machine learning model based on Koopman operator theory, which we call the Koopman Invertible Autoencoder (KIA).
KIA captures the inherent characteristic of the system by modeling both forward and backward dynamics in the infinite-dimensional Hilbert space.
This enables us to efficiently learn low-dimensional representations, resulting in more accurate predictions of long-term system behavior.
arXiv Detail & Related papers (2023-09-19T03:42:55Z)
- OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectory, human motion, driving scenes, traffic flow and forecasting weather.
We find that recurrent-free models achieve a better balance of efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z)
- Koopman Neural Forecaster for Time Series with Temporal Distribution Shifts [26.95428146824254]
We propose a novel deep sequence model based on the Koopman theory for time series forecasting.
Koopman Neural Forecaster (KNF) learns the linear Koopman space and the coefficients of chosen measurement functions.
We demonstrate that KNF achieves superior performance compared to alternatives on multiple time series datasets.
arXiv Detail & Related papers (2022-10-07T16:33:50Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)
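Several of the entries above (tcKAE, KIA, Consistent Koopman Autoencoders) revolve around the same idea: a forward operator and a backward operator should invert each other. A minimal numpy sketch of such a consistency term, which only illustrates the idea and does not reproduce any of these papers' actual regularizers:

```python
import numpy as np

def consistency_penalty(K_fwd, K_bwd):
    """If z_{t+1} = K_fwd @ z_t and z_t = K_bwd @ z_{t+1}, then
    K_bwd @ K_fwd should be the identity. Penalize the squared
    Frobenius norm of the deviation from the identity."""
    d = K_fwd.shape[0]
    return np.linalg.norm(K_bwd @ K_fwd - np.eye(d), "fro") ** 2
```

Minimizing this term alongside the prediction loss encourages the learned forward and backward dynamics to agree.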
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.