Supporting Optimal Phase Space Reconstructions Using Neural Network
Architecture for Time Series Modeling
- URL: http://arxiv.org/abs/2006.11381v1
- Date: Fri, 19 Jun 2020 21:04:47 GMT
- Title: Supporting Optimal Phase Space Reconstructions Using Neural Network
Architecture for Time Series Modeling
- Authors: Lucas Pagliosa, Alexandru Telea, Rodrigo Mello
- Abstract summary: We propose an artificial neural network with a forgetting mechanism to implicitly learn the phase space's properties.
Our approach is competitive with or better than most state-of-the-art strategies.
- Score: 68.8204255655161
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The reconstruction of phase spaces is an essential step to analyze time
series according to Dynamical System concepts. A regression performed on such
spaces unveils the relationships among system states from which we can derive
their generating rules, that is, the most probable set of functions responsible
for generating observations over time. In this sense, most approaches rely on
Takens' embedding theorem to unfold the phase space, which requires the
embedding dimension and the time delay. Moreover, although several methods have
been proposed to empirically estimate those parameters, they still face
limitations due to their lack of consistency and robustness, which has
motivated this paper. As an alternative, we here propose an artificial neural
network with a forgetting mechanism to implicitly learn the phase space's
properties, whatever they are. Such a network is trained on forecasting errors
and, after converging, its architecture is used to estimate the embedding
parameters. Experimental results confirm that our approach is competitive with
or better than most state-of-the-art strategies while revealing the temporal
relationship among time-series observations.
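The delay-coordinate embedding that Takens' theorem refers to is straightforward to sketch. The snippet below is a minimal, illustrative Python construction of the reconstructed phase space from a scalar series, not the authors' network; the embedding dimension m=3 and time delay tau=5 in the example are arbitrary assumptions for illustration.

```python
import numpy as np

def delay_embedding(x, m, tau):
    """Takens-style delay embedding of a scalar time series.

    Each reconstructed state is (x[t], x[t + tau], ..., x[t + (m - 1) * tau]),
    so the output has shape (len(x) - (m - 1) * tau, m).
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    if n <= 0:
        raise ValueError("series too short for the chosen m and tau")
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

# Example: embed a noisy sine wave with assumed parameters m=3, tau=5.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.01 * np.random.randn(t.size)
states = delay_embedding(series, m=3, tau=5)
print(states.shape)  # (1990, 3)
```

In practice m and tau are usually chosen with heuristics such as false nearest neighbors and the first minimum of the auto mutual information; the paper's contribution is to let a network with a forgetting mechanism expose these parameters through its converged architecture instead.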
Related papers
- Cross Space and Time: A Spatio-Temporal Unitized Model for Traffic Flow Forecasting [16.782154479264126]
Predicting spatio-temporal traffic flow presents challenges due to complex interactions between spatial and temporal factors.
Existing approaches address these dimensions in isolation, neglecting their critical interdependencies.
In this paper, we introduce the Adaptive Spatio-Temporal Unitized Cell (ASTUC), a unified framework designed to capture both spatial and temporal dependencies.
arXiv Detail & Related papers (2024-11-14T07:34:31Z) - Expand and Compress: Exploring Tuning Principles for Continual Spatio-Temporal Graph Forecasting [17.530885640317372]
We propose a novel prompt tuning-based continual forecasting method.
Specifically, we integrate the base spatio-temporal graph neural network with a continuous prompt pool stored in memory.
This method ensures that the model sequentially learns from the widespread spatio-temporal data stream to accomplish tasks for the corresponding periods.
arXiv Detail & Related papers (2024-10-16T14:12:11Z) - Causal Temporal Representation Learning with Nonstationary Sparse Transition [22.6420431022419]
Causal Temporal Representation Learning (Ctrl) methods aim to identify the temporal causal dynamics of complex nonstationary temporal sequences.
This work adopts a sparse transition assumption, aligned with intuitive human understanding, and presents identifiability results from a theoretical perspective.
We introduce a novel framework, Causal Temporal Representation Learning with Nonstationary Sparse Transition (CtrlNS), designed to leverage the constraints on transition sparsity.
arXiv Detail & Related papers (2024-09-05T00:38:27Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting (LTSF).
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z) - Revisiting the Temporal Modeling in Spatio-Temporal Predictive Learning
under A Unified View [73.73667848619343]
We introduce USTEP (Unified Spatio-TEmporal Predictive learning), an innovative framework that reconciles recurrent-based and recurrent-free methods by integrating both micro-temporal and macro-temporal scales.
arXiv Detail & Related papers (2023-10-09T16:17:42Z) - Graph-Survival: A Survival Analysis Framework for Machine Learning on
Temporal Networks [14.430635608400982]
We propose a framework for designing generative models for continuous-time temporal networks.
We propose a fitting method for models within this framework, and an algorithm for simulating new temporal networks having desired properties.
arXiv Detail & Related papers (2022-03-14T16:40:57Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Causal Modeling with Stochastic Confounders [11.881081802491183]
This work extends causal inference to settings with stochastic confounders.
We propose a new approach to variational estimation for causal inference based on a representer theorem with a random input space.
arXiv Detail & Related papers (2020-04-24T00:34:44Z) - Spatial-Temporal Transformer Networks for Traffic Flow Forecasting [74.76852538940746]
We propose a novel paradigm of Spatial-Temporal Transformer Networks (STTNs) to improve the accuracy of long-term traffic forecasting.
Specifically, we present a new variant of graph neural networks, named spatial transformer, by dynamically modeling directed spatial dependencies.
The proposed model enables fast and scalable training over long-range spatial-temporal dependencies.
arXiv Detail & Related papers (2020-01-09T10:21:04Z)