From Observations to States: Latent Time Series Forecasting
- URL: http://arxiv.org/abs/2602.00297v1
- Date: Fri, 30 Jan 2026 20:39:44 GMT
- Title: From Observations to States: Latent Time Series Forecasting
- Authors: Jie Yang, Yifan Hu, Yuante Li, Kexin Zhang, Kaize Ding, Philip S. Yu
- Abstract summary: We propose Latent Time Series Forecasting (LatentTSF), a novel paradigm that shifts TSF from observation regression to latent state prediction. Specifically, LatentTSF employs an AutoEncoder to project observations at each time step into a higher-dimensional latent state space. Our proposed latent objectives implicitly maximize mutual information between predicted latent states and ground-truth states and observations.
- Score: 65.98504021691666
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning has achieved strong performance in Time Series Forecasting (TSF). However, we identify a critical representation paradox, termed Latent Chaos: models with accurate predictions often learn latent representations that are temporally disordered and lack continuity. We attribute this phenomenon to the dominant observation-space forecasting paradigm. Most TSF models minimize point-wise errors on noisy and partially observed data, which encourages shortcut solutions instead of the recovery of underlying system dynamics. To address this issue, we propose Latent Time Series Forecasting (LatentTSF), a novel paradigm that shifts TSF from observation regression to latent state prediction. Specifically, LatentTSF employs an AutoEncoder to project observations at each time step into a higher-dimensional latent state space. This expanded representation aims to capture underlying system variables and impose a smoother temporal structure. Forecasting is then performed entirely in the latent space, allowing the model to focus on learning structured temporal dynamics. Theoretical analysis demonstrates that our proposed latent objectives implicitly maximize mutual information between predicted latent states and ground-truth states and observations. Extensive experiments on widely-used benchmarks confirm that LatentTSF effectively mitigates latent chaos, achieving superior performance. Our code is available at https://github.com/Muyiiiii/LatentTSF.
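To make the paradigm concrete, the sketch below is a minimal PyTorch rendering of the pipeline the abstract describes: a per-time-step encoder that lifts observations into a higher-dimensional latent state, a forecaster that operates entirely in that latent space, and a decoder used only to read predictions back out. Every module name, dimension, layer choice, and loss weighting here is an assumption for illustration; the paper's actual architecture and objectives may differ.

```python
# Minimal sketch of latent-space forecasting as described in the abstract.
# All names, dimensions, and layer choices are illustrative assumptions,
# not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentForecaster(nn.Module):
    def __init__(self, n_vars=7, d_latent=64, lookback=96, horizon=96):
        super().__init__()
        # Encoder lifts each time step from n_vars to a larger latent state.
        self.encoder = nn.Sequential(
            nn.Linear(n_vars, d_latent), nn.GELU(), nn.Linear(d_latent, d_latent)
        )
        self.decoder = nn.Linear(d_latent, n_vars)
        # Forecasting happens purely over latent states; a linear map over
        # the time axis stands in for any sequence model here.
        self.temporal = nn.Linear(lookback, horizon)

    def forward(self, x):                          # x: (B, lookback, n_vars)
        z = self.encoder(x)                        # (B, lookback, d_latent)
        z_hat = self.temporal(z.transpose(1, 2)).transpose(1, 2)  # (B, horizon, d_latent)
        return self.decoder(z_hat), z_hat          # observation read-out + latent forecast

def training_loss(model, x_past, x_future):
    # Hypothetical objective: reconstruct the input, predict the latent
    # states of the true future, and read observations back out of them.
    y_hat, z_hat = model(x_past)
    z_target = model.encoder(x_future).detach()    # detaching targets is an assumption
    recon = F.mse_loss(model.decoder(model.encoder(x_past)), x_past)
    latent = F.mse_loss(z_hat, z_target)
    readout = F.mse_loss(y_hat, x_future)
    return recon + latent + readout
```

In particular, whether the latent targets should be detached and how the reconstruction, latent-prediction, and read-out terms are balanced are design choices the abstract does not pin down.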
Related papers
- GTS: Inference-Time Scaling of Latent Reasoning with a Learnable Gaussian Thought Sampler [54.10960908347221]
We model latent thought exploration as conditional sampling from learnable densities and instantiate this idea as a Gaussian Thought Sampler (GTS). GTS predicts context-dependent perturbation distributions over continuous reasoning states and is trained with GRPO-style policy optimization while keeping the backbone frozen.
arXiv Detail & Related papers (2026-02-15T09:57:47Z) - EIDOS: Latent-Space Predictive Learning for Time Series Foundation Models [37.917978019436674]
EIDOS is a foundation model family that shifts pretraining from future value prediction to latent-space predictive learning. We train a causal Transformer to predict the evolution of latent representations, encouraging the emergence of structured and temporally coherent latent states.
arXiv Detail & Related papers (2026-02-15T07:07:20Z) - STLDM: Spatio-Temporal Latent Diffusion Model for Precipitation Nowcasting [22.29494904927957]
Precipitation nowcasting is a critical prediction task for society to prevent severe damage owing to extreme weather events. We present a simple yet effective model architecture termed STLDM, a diffusion-based model that learns the latent representation end to end alongside both the Variational Autoencoder and the conditioning network.
arXiv Detail & Related papers (2025-12-24T11:34:44Z) - SynCast: Synergizing Contradictions in Precipitation Nowcasting via Diffusion Sequential Preference Optimization [62.958457694151384]
We introduce preference optimization into precipitation nowcasting for the first time, motivated by the success of reinforcement learning from human feedback in large language models. In the first stage, the framework focuses on reducing the false alarm ratio (FAR), training the model to effectively suppress false alarms.
arXiv Detail & Related papers (2025-10-22T16:11:22Z) - Dynamical Diffusion: Learning Temporal Dynamics with Diffusion Models [71.63194926457119]
We introduce Dynamical Diffusion (DyDiff), a theoretically sound framework that incorporates temporally aware forward and reverse processes. Experiments across scientific spatiotemporal forecasting, video prediction, and time series forecasting demonstrate that Dynamical Diffusion consistently improves performance in temporal predictive tasks.
arXiv Detail & Related papers (2025-03-02T16:10:32Z) - Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth the parameters of PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z) - Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series [18.885471782270375]
NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables.
We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference.
Empirical results on multiple benchmark datasets show improved imputation and forecasting performance of NCDSSM over existing models.
arXiv Detail & Related papers (2023-01-26T18:45:04Z) - RNN with Particle Flow for Probabilistic Spatio-temporal Forecasting [30.277213545837924]
Classical statistical models often fall short in handling the complexity and high non-linearity present in time-series data.
In this work, we consider the time-series data as a random realization from a nonlinear state-space model.
We use particle flow as the tool for approximating the posterior distribution of the states, as it is shown to be highly effective in complex, high-dimensional settings.
arXiv Detail & Related papers (2021-06-10T21:49:23Z) - Latent State Inference in a Spatiotemporal Generative Model [3.7525506486107267]
We focus on temperature and weather processes, including wave propagation dynamics, for which we assume that universal causes apply throughout space and time.
A recently introduced DIstributed Spatio-Temporal graph Artificial Neural network Architecture (DISTANA) is used and enhanced to learn such processes.
We show that DISTANA, when combined with a retrospective latent state inference principle called active tuning, can reliably derive location-respective hidden causal factors.
arXiv Detail & Related papers (2020-09-21T12:59:40Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the phase space's properties.
Our approach is competitive with or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
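As background for the phase-space entry above: the classical, non-learned route to recovering a system's state space from a scalar series is delay-coordinate (Takens) embedding, which such networks aim to learn implicitly. Below is a minimal sketch; the delay `tau` and embedding dimension are arbitrary illustrative choices (in practice they are tuned, e.g., via mutual-information and false-nearest-neighbor criteria).

```python
# Delay-coordinate (Takens) embedding sketch; dim and tau are illustrative.
import numpy as np

def delay_embed(x: np.ndarray, dim: int = 3, tau: int = 25) -> np.ndarray:
    """Map a scalar series to vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# Example: each row of `states` approximates one phase-space state.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
states = delay_embed(x)
print(states.shape)  # (1950, 3)
```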
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.