Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors
- URL: http://arxiv.org/abs/2305.18803v2
- Date: Wed, 18 Oct 2023 14:33:12 GMT
- Title: Koopa: Learning Non-stationary Time Series Dynamics with Koopman
Predictors
- Authors: Yong Liu, Chenyu Li, Jianmin Wang, Mingsheng Long
- Abstract summary: Real-world time series are characterized by intrinsic non-stationarity that poses a principal challenge for deep forecasting models.
We tackle non-stationary time series with modern Koopman theory that fundamentally considers the underlying time-variant dynamics.
We propose Koopa as a novel Koopman forecaster composed of stackable blocks that learn hierarchical dynamics.
- Score: 85.22004745984253
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Real-world time series are characterized by intrinsic non-stationarity that
poses a principal challenge for deep forecasting models. While previous models
suffer from complicated series variations induced by the changing temporal
distribution, we tackle non-stationary time series with modern Koopman theory,
which fundamentally considers the underlying time-variant dynamics. Inspired by
the Koopman theory of portraying complex dynamical systems, we disentangle
time-variant and time-invariant components from intricate non-stationary series
with a Fourier Filter and design Koopman Predictors to advance the respective
dynamics forward. Technically, we propose Koopa as a novel Koopman forecaster
composed of stackable blocks that learn hierarchical dynamics. Koopa seeks
measurement functions for the Koopman embedding and utilizes Koopman operators
as linear portraits of the implicit transition. To cope with time-variant
dynamics that exhibit strong locality, Koopa calculates context-aware operators
in the temporal neighborhood and can utilize incoming ground truth to extend
the forecast horizon. Moreover, by integrating Koopman Predictors into a deep
residual structure, we remove the reconstruction loss that constrains previous
Koopman forecasters and achieve end-to-end optimization of the forecasting
objective. Compared with the state-of-the-art model, Koopa achieves competitive
performance while saving 77.3% training time and 76.0% memory.
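To make the abstract's mechanism concrete, below is a minimal NumPy sketch of two of the ideas it describes: a Fourier filter that splits a series into time-invariant (dominant-frequency) and time-variant (residual) components, and a context-aware linear operator fitted by DMD-style least squares on embeddings from the temporal neighborhood. The function names (fourier_split, delay_embed, local_koopman_operator), the toy delay embedding, and all parameter choices are illustrative assumptions, not the authors' implementation; Koopa learns its measurement functions with neural networks and stacks such predictors in a deep residual architecture.

```python
# Minimal sketch (assumptions, not the paper's code): Fourier-based
# disentanglement plus a context-aware Koopman operator fitted on the
# most recent window by least squares.
import numpy as np

def fourier_split(x: np.ndarray, k: int = 3):
    """Split a 1-D series into (time_invariant, time_variant) parts by
    keeping the k largest-amplitude frequency components."""
    spec = np.fft.rfft(x)
    keep = np.argsort(np.abs(spec))[-k:]          # dominant frequencies
    mask = np.zeros_like(spec)
    mask[keep] = spec[keep]
    time_invariant = np.fft.irfft(mask, n=len(x))
    return time_invariant, x - time_invariant

def delay_embed(x: np.ndarray, dim: int = 4):
    """Toy measurement function: delay embedding into R^dim
    (Koopa would use a learned embedding network instead)."""
    return np.stack([x[i:len(x) - dim + i + 1] for i in range(dim)], axis=0)

def local_koopman_operator(x: np.ndarray, window: int = 64, dim: int = 4):
    """Fit a linear operator K with z_{t+1} ~= K z_t on the most recent
    window (the 'temporal neighborhood') via least squares."""
    z = delay_embed(x[-window:], dim)             # shape (dim, T)
    z_now, z_next = z[:, :-1], z[:, 1:]
    K = z_next @ np.linalg.pinv(z_now)            # DMD-style fit
    return K, z[:, -1]

# Usage: roll the local operator forward on the time-variant component.
rng = np.random.default_rng(0)
t = np.arange(512)
series = (np.sin(0.05 * t) + 0.3 * np.sin(0.4 * t + 0.002 * t**2)
          + 0.05 * rng.standard_normal(512))
invariant, variant = fourier_split(series, k=3)
K, z = local_koopman_operator(variant)
preds = []
for _ in range(8):                                # advance dynamics forward
    z = K @ z
    preds.append(z[-1])                           # last delay coordinate ~ next value
print(np.round(preds, 3))
```

Because the operator is refit from whatever window is most recent, incoming ground truth can be folded into the neighborhood before the next forecast, which is the intuition behind extending the forecast horizon with context-aware operators.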
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Temporally-Consistent Koopman Autoencoders for Forecasting Dynamical Systems [42.6886113798806]
We introduce the Temporally-Consistent Koopman Autoencoder (tcKAE).
tcKAE generates accurate long-term predictions even with constrained and noisy training data.
We demonstrate tcKAE's superior performance over state-of-the-art KAE models across a variety of test cases.
arXiv Detail & Related papers (2024-03-19T00:48:25Z)
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Koopman Learning with Episodic Memory [9.841748637412596]
We equip Koopman methods - developed for predicting non-autonomous time-series - with an episodic memory mechanism.
We find that a basic implementation of Koopman learning with episodic memory leads to significant improvements in prediction on synthetic and real-world data.
arXiv Detail & Related papers (2023-11-21T13:59:00Z)
- Time-series Generation by Contrastive Imitation [87.51882102248395]
We study a generative framework that seeks to combine the strengths of both: Motivated by a moment-matching objective to mitigate compounding error, we optimize a local (but forward-looking) transition policy.
At inference, the learned policy serves as the generator for iterative sampling, and the learned energy serves as a trajectory-level measure for evaluating sample quality.
arXiv Detail & Related papers (2023-11-02T16:45:25Z)
- Koopman Invertible Autoencoder: Leveraging Forward and Backward Dynamics for Temporal Modeling [13.38194491846739]
We propose a novel machine learning model based on Koopman operator theory, which we call Koopman Invertible Autoencoders (KIA).
KIA captures the inherent characteristic of the system by modeling both forward and backward dynamics in the infinite-dimensional Hilbert space.
This enables us to efficiently learn low-dimensional representations, resulting in more accurate predictions of long-term system behavior.
arXiv Detail & Related papers (2023-09-19T03:42:55Z)
- OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectory, human motion, driving scenes, traffic flow and forecasting weather.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z)
- Koopman Neural Forecaster for Time Series with Temporal Distribution Shifts [26.95428146824254]
We propose a novel deep sequence model based on the Koopman theory for time series forecasting.
Koopman Neural Forecaster (KNF) learns the linear Koopman space and the coefficients of chosen measurement functions.
We demonstrate that KNF achieves superior performance compared to the alternatives on multiple time series datasets.
arXiv Detail & Related papers (2022-10-07T16:33:50Z)
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)