PRNet: A Periodic Residual Learning Network for Crowd Flow Forecasting
- URL: http://arxiv.org/abs/2112.06132v1
- Date: Wed, 8 Dec 2021 12:04:27 GMT
- Title: PRNet: A Periodic Residual Learning Network for Crowd Flow Forecasting
- Authors: Chengxin Wang, Yuxuan Liang and Gary Tan
- Abstract summary: We devise a novel periodic residual learning network (PRNet) for better modeling the periodicity in crowd flow data.
PRNet frames crowd flow forecasting as a periodic residual learning problem by modeling the deviation between the input (the previous time period) and the output (the future time period).
Experimental results on two real-world datasets demonstrate that PRNet outperforms the state-of-the-art methods in terms of both accuracy and robustness.
- Score: 8.50942649992681
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Crowd flow forecasting, e.g., predicting the crowds entering or leaving
certain regions, is of great importance to real-world urban applications. One
of the key properties of crowd flow data is periodicity: a pattern that occurs
at regular time intervals, such as a weekly pattern. To capture such
periodicity, existing studies either explicitly model it based on the periodic
hidden states or implicitly learn it by feeding all periodic segments into
neural networks. In this paper, we devise a novel periodic residual learning
network (PRNet) for better modeling the periodicity in crowd flow data.
Unlike existing methods, PRNet frames crowd flow forecasting as a periodic
residual learning problem by modeling the deviation between the input (the
previous time period) and the output (the future time period). Compared with
predicting highly dynamic crowd flows directly, learning such stationary
deviations is much easier, which facilitates model training. Moreover, the
learned deviation enables the network to produce, at each time interval, the
residual between future conditions and the corresponding weekly observations,
and therefore yields substantially better predictions. We
further propose a lightweight Spatial-Channel Enhanced Encoder to build more
powerful region representations, by jointly capturing global spatial
correlations and temporal dependencies. Experimental results on two real-world
datasets demonstrate that PRNet outperforms the state-of-the-art methods in
terms of both accuracy and robustness.
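The residual-learning idea in the abstract can be sketched minimally: rather than predicting future flows directly, a model outputs the deviation from the observation one period (e.g., one week) earlier, and the forecast adds that deviation back onto the periodic reference. The sketch below is illustrative only, not the authors' implementation; the period length, tensor shapes, and function name are assumptions.

```python
import numpy as np

# Assumed period: one week of 30-minute intervals (illustrative choice).
PERIOD = 7 * 48

def periodic_residual_forecast(history: np.ndarray,
                               predicted_residual: np.ndarray) -> np.ndarray:
    """Combine last period's observation with a model-predicted residual.

    history: observed flows, shape (T, regions), with T >= PERIOD
    predicted_residual: model output (the learned deviation),
        shape (horizon, regions)
    """
    horizon = predicted_residual.shape[0]
    # Observation exactly one period before each forecast step.
    periodic_ref = history[-PERIOD:-PERIOD + horizon]
    # Forecast = periodic reference + learned stationary deviation.
    return periodic_ref + predicted_residual
```

With a perfectly weekly signal and a zero residual, the forecast reduces to last week's observations, which is why the residual is the easier, more stationary target to learn.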
Related papers
- Interpretable Short-Term Load Forecasting via Multi-Scale Temporal Decomposition [3.080999981940039]
This paper proposes an interpretable deep learning method, which learns a linear combination of neural networks that each attends to an input time feature.
Case studies on the Belgian central grid load dataset show that the proposed model achieves better accuracy than frequently applied baseline models.
arXiv Detail & Related papers (2024-02-18T17:55:59Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- Prompting-based Temporal Domain Generalization [10.377683220196873]
This paper presents a novel prompting-based approach to temporal domain generalization.
Our method adapts a trained model to temporal drift by learning global prompts, domain-specific prompts, and drift-aware prompts.
Experiments on classification, regression, and time series forecasting tasks demonstrate the generality of the proposed approach.
arXiv Detail & Related papers (2023-10-03T22:40:56Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are often recorded over a short time period, leaving a large gap between the capacity of deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Wild-Time: A Benchmark of in-the-Wild Distribution Shift over Time [69.77704012415845]
Temporal shifts can considerably degrade performance of machine learning models deployed in the real world.
We benchmark 13 prior approaches, including methods in domain generalization, continual learning, self-supervised learning, and ensemble learning.
Under both evaluation strategies, we observe an average performance drop of 20% from in-distribution to out-of-distribution data.
arXiv Detail & Related papers (2022-11-25T17:07:53Z)
- Building Autocorrelation-Aware Representations for Fine-Scale Spatiotemporal Prediction [1.2862507359003323]
We present DeepLATTE, a novel deep learning architecture that incorporates theories of spatial statistics into neural networks.
DeepLATTE contains an autocorrelation-guided semi-supervised learning strategy to enforce both local autocorrelation patterns and global autocorrelation trends.
We demonstrate DeepLATTE using publicly available data for an important public health topic: air quality prediction in a complex physical environment.
arXiv Detail & Related papers (2021-12-10T03:21:19Z)
- Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
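The neighbor-aggregation idea above can be given a rough illustration: each node summarizes the influence of its network neighbors via attention over their embeddings. The sketch below uses single-head dot-product attention as a simplified stand-in; Radflow itself uses multi-head attention and recurrent, time-dependent embeddings, so the function names and shapes here are assumptions for exposition.

```python
import numpy as np

def softmax(z: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention_aggregate(node_emb: np.ndarray,
                        neighbor_embs: np.ndarray) -> np.ndarray:
    """Aggregate neighbor embeddings with single-head dot-product
    attention (simplified stand-in for Radflow's multi-head scheme).

    node_emb: (d,) embedding of the target node
    neighbor_embs: (n, d) embeddings of its n neighbors
    """
    # Scaled dot-product scores between the node and each neighbor.
    scores = neighbor_embs @ node_emb / np.sqrt(node_emb.shape[0])
    weights = softmax(scores)           # (n,) attention weights, sum to 1
    return weights @ neighbor_embs      # (d,) weighted neighbor summary
```

The attention weights can be read as per-neighbor influence strength, matching the observation that correlated temporal patterns among neighbors reflect influence.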
arXiv Detail & Related papers (2021-02-15T00:57:28Z)
- Network Classifiers Based on Social Learning [71.86764107527812]
We propose a new way of combining independently trained classifiers over space and time.
The proposed architecture is able to improve prediction performance over time with unlabeled data.
We show that this strategy results in consistent learning with high probability, and it yields a robust structure against poorly trained classifiers.
arXiv Detail & Related papers (2020-10-23T11:18:20Z)
- Improved Predictive Deep Temporal Neural Networks with Trend Filtering [22.352437268596674]
We propose a new prediction framework based on deep neural networks and trend filtering.
We reveal that the predictive performance of deep temporal neural networks improves when the training data is temporally processed with trend filtering.
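The preprocessing idea can be illustrated with a simple centered moving-average trend extractor applied to the training series before model fitting. This is a stand-in assumption for exposition, not the specific trend filter used in the paper.

```python
import numpy as np

def moving_average_trend(x: np.ndarray, window: int = 5) -> np.ndarray:
    """Extract a smooth trend from a 1-D series via a centered moving
    average (simplified stand-in for trend filtering).

    x: series of shape (T,); window: odd smoothing width (assumed).
    Returns a trend of the same length as x.
    """
    pad = window // 2
    # Edge-pad so the output keeps the input length.
    padded = np.pad(x, pad, mode="edge")
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode="valid")
```

Training a temporal network on the smoothed series (or on the series minus its trend) is the kind of temporal preprocessing the paper reports as beneficial.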
arXiv Detail & Related papers (2020-10-16T08:29:36Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Conditional Mutual information-based Contrastive Loss for Financial Time Series Forecasting [12.0855096102517]
We present a representation learning framework for financial time series forecasting.
In this paper, we propose to first learn compact representations from time series data, then use the learned representations to train a simpler model for predicting time series movements.
arXiv Detail & Related papers (2020-02-18T15:24:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.