Message Propagation Through Time: An Algorithm for Sequence Dependency
Retention in Time Series Modeling
- URL: http://arxiv.org/abs/2309.16882v1
- Date: Thu, 28 Sep 2023 22:38:18 GMT
- Title: Message Propagation Through Time: An Algorithm for Sequence Dependency
Retention in Time Series Modeling
- Authors: Shaoming Xu, Ankush Khandelwal, Arvind Renganathan, Vipin Kumar
- Abstract summary: This paper proposes the Message Propagation Through Time (MPTT) algorithm for time series modeling.
MPTT incorporates long temporal dependencies while remaining faster to train than stateful solutions.
Experimental results demonstrate that MPTT outperforms seven strategies on four climate datasets.
- Score: 14.49997340857179
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series modeling, a crucial area in science, often encounters challenges
when training Machine Learning (ML) models like Recurrent Neural Networks
(RNNs) using the conventional mini-batch training strategy that assumes
independent and identically distributed (IID) samples and initializes RNNs with
zero hidden states. The IID assumption ignores temporal dependencies among
samples, resulting in poor performance. This paper proposes the Message
Propagation Through Time (MPTT) algorithm to effectively incorporate long
temporal dependencies while preserving faster training times relative to the
stateful solutions. MPTT utilizes two memory modules to asynchronously manage
initial hidden states for RNNs, fostering seamless information exchange between
samples and allowing diverse mini-batches throughout epochs. MPTT further
implements three policies to filter outdated and preserve essential information
in the hidden states to generate informative initial hidden states for RNNs,
facilitating robust training. Experimental results demonstrate that MPTT
outperforms seven strategies on four climate datasets with varying levels of
temporal dependencies.
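The full MPTT algorithm uses two asynchronous memory modules and three state-filtering policies; the following is only a minimal sketch of the core idea, initializing RNN hidden states from a memory keyed by sample identity instead of zeros. The single-memory design, the exponential-smoothing policy, and all names here are illustrative assumptions, not the authors' exact method.

```python
# Illustrative sketch of memory-based hidden-state initialization for
# mini-batch RNN training. Names and the exponential-smoothing "policy"
# are assumptions for illustration, not the exact MPTT design.
import torch
import torch.nn as nn

class HiddenStateMemory:
    """Maps a sample id to the most recent final hidden state of its
    predecessor segment, so mini-batches need not be chronological."""
    def __init__(self, hidden_size: int, alpha: float = 0.9):
        self.store = {}          # sample id -> (1, hidden_size) tensor
        self.hidden_size = hidden_size
        self.alpha = alpha       # smoothing factor: filters outdated info

    def read(self, ids):
        # Zero init only for samples never seen before.
        rows = [self.store.get(i, torch.zeros(1, self.hidden_size)) for i in ids]
        return torch.cat(rows, dim=0).unsqueeze(0)  # (1, batch, hidden)

    def write(self, ids, h_final):
        # Blend new states with stored ones (one possible "policy").
        h_final = h_final.detach()
        for k, i in enumerate(ids):
            new = h_final[0, k:k + 1]
            old = self.store.get(i)
            self.store[i] = new if old is None else self.alpha * new + (1 - self.alpha) * old

hidden_size = 32
rnn = nn.GRU(input_size=8, hidden_size=hidden_size, batch_first=True)
memory = HiddenStateMemory(hidden_size)
opt = torch.optim.Adam(rnn.parameters(), lr=1e-3)

# One training step on a toy mini-batch of 4 segments of length 16.
ids = ["site_a", "site_b", "site_c", "site_d"]
x = torch.randn(4, 16, 8)
h0 = memory.read(ids)          # informative (non-zero) initial states
out, h_n = rnn(x, h0)
loss = out.pow(2).mean()       # placeholder loss
opt.zero_grad(); loss.backward(); opt.step()
memory.write(ids, h_n)         # state exchange for later mini-batches
```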
Related papers
- TCGPN: Temporal-Correlation Graph Pre-trained Network for Stock Forecasting [1.864621482724548]
We propose a novel approach called the Temporal-Correlation Graph Pre-trained Network (TCGPN) to address the limitations of existing stock-forecasting methods.
TCGPN utilizes a temporal-correlation fusion encoder to obtain a mixed representation, together with a pre-training method built on carefully designed temporal and correlation pre-training tasks.
Experiments are conducted on real stock market data sets CSI300 and CSI500 that exhibit minimal periodicity.
arXiv Detail & Related papers (2024-07-26T05:27:26Z)
- State Sequences Prediction via Fourier Transform for Representation Learning [111.82376793413746]
We propose State Sequences Prediction via Fourier Transform (SPF), a novel method for learning expressive representations efficiently.
We theoretically analyze the existence of structural information in state sequences, which is closely related to policy performance and signal regularity.
Experiments demonstrate that the proposed method outperforms several state-of-the-art algorithms in terms of both sample efficiency and performance.
arXiv Detail & Related papers (2023-10-24T14:47:02Z)
- Fully-Connected Spatial-Temporal Graph for Multivariate Time-Series Data [50.84488941336865]
We propose a novel method called Fully-Connected Spatial-Temporal Graph Neural Network (FC-STGNN).
For graph construction, we design a decay graph to connect sensors across all timestamps based on their temporal distances.
For graph convolution, we devise FC graph convolution with a moving-pooling GNN layer to capture the spatial-temporal (ST) dependencies and learn effective representations.
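The summary does not specify the decay function; below is a minimal sketch of one plausible construction, in which edges between (sensor, timestamp) nodes are weighted by an exponential decay of their temporal distance. The exponential form and the `rate` parameter are assumptions, not necessarily the paper's design.

```python
# Hedged sketch of a "decay graph": fully connect N sensors across T
# timestamps, weighting edges by temporal distance. The exponential
# decay form and the `rate` parameter are illustrative assumptions.
import numpy as np

def decay_adjacency(num_sensors: int, num_steps: int, rate: float = 0.5) -> np.ndarray:
    t = np.repeat(np.arange(num_steps), num_sensors)  # timestamp of each (sensor, time) node
    dist = np.abs(t[:, None] - t[None, :])            # pairwise temporal distance
    return np.exp(-rate * dist)                       # closer in time -> stronger edge

A = decay_adjacency(num_sensors=3, num_steps=4)
print(A.shape)  # (12, 12): one fully connected spatio-temporal graph
```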
arXiv Detail & Related papers (2023-09-11T08:44:07Z)
- Time-Parameterized Convolutional Neural Networks for Irregularly Sampled Time Series [26.77596449192451]
Irregularly sampled time series are ubiquitous in several application domains, leading to sparse, not fully-observed and non-aligned observations.
Standard recurrent neural networks (RNNs) and convolutional neural networks (CNNs) assume regular spacing between observation times, which poses significant challenges for irregular time series modeling.
We parameterize convolutional layers with kernels that explicitly account for irregular observation times.
arXiv Detail & Related papers (2023-08-06T21:10:30Z)
- STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
STING (Self-attention based Time-series Imputation Networks using GAN) is proposed.
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z)
- Large Scale Time-Series Representation Learning via Simultaneous Low and High Frequency Feature Bootstrapping [7.0064929761691745]
We propose a non-contrastive self-supervised learning approach that efficiently captures low- and high-frequency time-varying features.
Our method takes raw time series data as input and creates two different augmented views for two branches of the model.
To demonstrate the robustness of our model we performed extensive experiments and ablation studies on five real-world time-series datasets.
arXiv Detail & Related papers (2022-04-24T14:39:47Z)
- Task-Synchronized Recurrent Neural Networks [0.0]
When handling irregularly sampled data, Recurrent Neural Networks (RNNs) traditionally either ignore the irregular sampling, feed the time differences as additional inputs, or resample the data.
We propose an elegant, straightforward alternative approach in which the RNN is in effect resampled in time to match the timing of the data or the task at hand.
We confirm empirically that our models can effectively compensate for the time-non-uniformity of the data and demonstrate that they compare favorably to data resampling, classical RNN methods, and alternative RNN models.
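The exact Task-Synchronized formulation is not given in this summary; one common way to let a recurrent cell adapt to irregular time gaps is to decay the hidden state by the elapsed time before each update, as in the hedged sketch below. The exponential decay with a learnable rate is an illustrative stand-in, not necessarily the authors' mechanism.

```python
# Hedged sketch of time-aware recurrence: decay the hidden state by the
# elapsed time before each GRUCell update. Exponential decay with a
# learnable rate is a common device, not necessarily this paper's design.
import torch
import torch.nn as nn

class TimeAwareGRU(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.log_rate = nn.Parameter(torch.zeros(hidden_size))  # learnable decay rate

    def forward(self, x, dt):
        # x: (batch, seq, input_size); dt: (batch, seq) elapsed time per step
        h = x.new_zeros(x.size(0), self.cell.hidden_size)
        outputs = []
        for k in range(x.size(1)):
            decay = torch.exp(-torch.exp(self.log_rate) * dt[:, k:k + 1])
            h = self.cell(x[:, k], decay * h)  # state fades over long gaps
            outputs.append(h)
        return torch.stack(outputs, dim=1)

model = TimeAwareGRU(input_size=4, hidden_size=16)
x = torch.randn(2, 10, 4)
dt = torch.rand(2, 10)              # irregular gaps between observations
print(model(x, dt).shape)           # torch.Size([2, 10, 16])
```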
arXiv Detail & Related papers (2022-04-11T15:27:40Z)
- Recurrence-in-Recurrence Networks for Video Deblurring [58.49075799159015]
State-of-the-art video deblurring methods often adopt recurrent neural networks to model the temporal dependency between the frames.
In this paper, we propose a recurrence-in-recurrence network architecture to cope with the limitations of short-range memory.
arXiv Detail & Related papers (2022-03-12T11:58:13Z)
- Online learning of windmill time series using Long Short-term Cognitive Networks [58.675240242609064]
The amount of data generated on windmill farms makes online learning the most viable strategy to follow.
We use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings.
Our approach achieved the lowest forecasting errors compared with a simple RNN, a Long Short-term Memory network, a Gated Recurrent Unit, and a Hidden Markov Model.
arXiv Detail & Related papers (2021-07-01T13:13:24Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.