Autoregressive with Slack Time Series Model for Forecasting a
Partially-Observed Dynamical Time Series
- URL: http://arxiv.org/abs/2306.16593v2
- Date: Fri, 9 Feb 2024 15:11:24 GMT
- Title: Autoregressive with Slack Time Series Model for Forecasting a
Partially-Observed Dynamical Time Series
- Authors: Akifumi Okuno, Yuya Morishita, Yoh-ichi Mototake
- Abstract summary: We introduce the autoregressive with slack time series (ARS) model, which simultaneously estimates the evolution function and imputes missing variables as a slack time series.
From a theoretical perspective, we prove that a 2-dimensional time-invariant and linear system can be reconstructed by utilizing observations from a single, partially observed dimension of the system.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study delves into the domain of dynamical systems, specifically the
forecasting of dynamical time series defined through an evolution function.
Traditional approaches in this area predict the future behavior of dynamical
systems by inferring the evolution function. However, these methods may
confront obstacles due to the presence of missing variables, which are usually
attributed to challenges in measurement and a partial understanding of the
system of interest. To overcome this obstacle, we introduce the autoregressive
with slack time series (ARS) model, which simultaneously estimates the evolution
function and imputes missing variables as a slack time series. Assuming
time-invariance and linearity in the (underlying) entire dynamical time series,
our experiments demonstrate the ARS model's capability to forecast future time
series. From a theoretical perspective, we prove that a 2-dimensional
time-invariant and linear system can be reconstructed by utilizing observations
from a single, partially observed dimension of the system.
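
The joint-estimation idea can be sketched in code. The following is a minimal illustrative implementation, not the authors' method: it fits a 2-dimensional time-invariant linear system s_{t+1} = A s_t to a scalar observed series x, treating the unobserved coordinate z as free slack variables estimated jointly with A. The alternating-least-squares scheme, the function names (`fit_ars`, `forecast`), the lag-based initialization, and the synthetic rotation system are all assumptions made for this sketch; the paper does not specify them.

```python
import numpy as np

def fit_ars(x, n_iter=100):
    """Jointly estimate a 2x2 transition matrix A and a slack series z
    for a partially observed linear system, by alternating least squares.
    The objective is sum_t || s_{t+1} - A s_t ||^2 with s_t = [x_t, z_t]."""
    T = len(x)
    # Heuristic init: lag the observed series (delay-embedding intuition).
    z = np.concatenate(([x[0]], x[:-1]))
    for _ in range(n_iter):
        # A-step: with z fixed, A is an ordinary least-squares fit.
        S = np.stack([x[:-1], z[:-1]], axis=1)   # states s_0 .. s_{T-2}
        Y = np.stack([x[1:], z[1:]], axis=1)     # states s_1 .. s_{T-1}
        A = np.linalg.lstsq(S, Y, rcond=None)[0].T
        # z-step: with A fixed, the residuals are linear in z, so the
        # optimal slack series is again a least-squares solution M z ~ c.
        M = np.zeros((2 * (T - 1), T))
        c = np.zeros(2 * (T - 1))
        for t in range(T - 1):
            M[2 * t, t] = -A[0, 1]               # x_{t+1} - a00 x_t - a01 z_t
            c[2 * t] = A[0, 0] * x[t] - x[t + 1]
            M[2 * t + 1, t] = -A[1, 1]           # z_{t+1} - a10 x_t - a11 z_t
            M[2 * t + 1, t + 1] = 1.0
            c[2 * t + 1] = A[1, 0] * x[t]
        z = np.linalg.lstsq(M, c, rcond=None)[0]
    return A, z

def forecast(A, x_last, z_last, steps):
    """Roll the estimated system forward and return the observed coordinate."""
    s = np.array([x_last, z_last])
    preds = []
    for _ in range(steps):
        s = A @ s
        preds.append(s[0])
    return np.array(preds)

# Synthetic 2-D linear system; only the first coordinate is observed.
theta = 0.5
A_true = 0.95 * np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
s = np.zeros((40, 2))
s[0] = [1.0, 0.5]
for t in range(39):
    s[t + 1] = A_true @ s[t]
x = s[:, 0]

A_hat, z_hat = fit_ars(x[:30])
pred = forecast(A_hat, x[29], z_hat[-1], 5)
print(np.max(np.abs(pred - x[30:35])))  # forecast error on the observed coordinate
```

Because the observed coordinate of a 2-D linear system satisfies a second-order recurrence, the slack series is only identifiable up to a similarity transform; the forecast of the observed coordinate, however, is well defined, which mirrors the reconstruction result stated in the abstract.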
Related papers
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse temporal real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z)
- Anamnesic Neural Differential Equations with Orthogonal Polynomial Projections [6.345523830122166]
We propose PolyODE, a formulation that enforces long-range memory and preserves a global representation of the underlying dynamical system.
Our construction is backed by favourable theoretical guarantees and we demonstrate that it outperforms previous works in the reconstruction of past and future data.
arXiv Detail & Related papers (2023-03-03T10:49:09Z)
- Deep learning delay coordinate dynamics for chaotic attractors from partial observable data [0.0]
We utilize deep artificial neural networks to learn discrete-time maps and continuous-time flows of the partial state.
We demonstrate the capacity of deep ANNs to predict chaotic behavior from a scalar observation on a manifold of dimension three via the Lorenz system.
arXiv Detail & Related papers (2022-11-20T19:25:02Z)
- Time Series Forecasting with Ensembled Stochastic Differential Equations Driven by Lévy Noise [2.3076895420652965]
We use a collection of SDEs equipped with neural networks to predict long-term trend of noisy time series.
Our contributions are twofold: first, we use the phase space reconstruction method to extract the intrinsic dimension of the time series data; second, we explore SDEs driven by α-stable Lévy motion to model the time series data and solve the problem through neural network approximation.
arXiv Detail & Related papers (2021-11-25T16:49:01Z)
- Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES).
MAES is a mixture of time-series experts that leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
arXiv Detail & Related papers (2021-02-23T05:23:35Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.