Rainfall-Runoff Prediction at Multiple Timescales with a Single Long
Short-Term Memory Network
- URL: http://arxiv.org/abs/2010.07921v1
- Date: Thu, 15 Oct 2020 17:52:16 GMT
- Title: Rainfall-Runoff Prediction at Multiple Timescales with a Single Long
Short-Term Memory Network
- Authors: Martin Gauch, Frederik Kratzert, Daniel Klotz, Grey Nearing, Jimmy
Lin, Sepp Hochreiter
- Abstract summary: Long Short-Term Memory Networks (LSTMs) have been applied to daily discharge prediction with remarkable success.
Many practical scenarios, however, require predictions at more granular timescales.
In this study, we propose two Multi-Timescale LSTM (MTS-LSTM) architectures that jointly predict multiple timescales within one model.
We test these models on 516 basins across the continental United States and benchmark against the US National Water Model.
- Score: 41.33870234564485
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Long Short-Term Memory Networks (LSTMs) have been applied to daily discharge
prediction with remarkable success. Many practical scenarios, however, require
predictions at more granular timescales. For instance, accurate prediction of
short but extreme flood peaks can make a life-saving difference, yet such peaks
may escape the coarse temporal resolution of daily predictions. Naively
training an LSTM on hourly data, however, entails very long input sequences
that make learning hard and computationally expensive. In this study, we
propose two Multi-Timescale LSTM (MTS-LSTM) architectures that jointly predict
multiple timescales within one model, as they process long-past inputs at a
single temporal resolution and branch out into each individual timescale for
more recent input steps. We test these models on 516 basins across the
continental United States and benchmark against the US National Water Model.
Compared to naive prediction with a distinct LSTM per timescale, the
multi-timescale architectures are computationally more efficient with no loss
in accuracy. Beyond prediction quality, the multi-timescale LSTM can process
different input variables at different timescales, which is especially relevant
to operational applications where the lead time of meteorological forcings
depends on their temporal resolution.
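The branching idea described in the abstract can be sketched in plain numpy: one shared LSTM digests the long daily history, then its states are handed to an hourly branch that processes only the recent steps. Everything here is a toy illustration with random weights; the hidden sizes, input dimensions, and the linear state-transfer step are assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    # One standard LSTM cell update: input/forget/candidate/output gates.
    z = W @ x + U @ h + b
    H = h.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[:H]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))     # forget gate
    g = np.tanh(z[2*H:3*H])                 # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3*H:]))      # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def init_params(n_in, n_hid):
    # Random weights for illustration only; a real model would learn these.
    return (0.1 * rng.standard_normal((4 * n_hid, n_in)),
            0.1 * rng.standard_normal((4 * n_hid, n_hid)),
            np.zeros(4 * n_hid))

H = 8                                      # hidden size (assumed)
daily_params = init_params(5, H)           # shared coarse-resolution branch
hourly_params = init_params(5, H)          # fine-resolution branch
Wt_h = 0.1 * rng.standard_normal((H, H))   # hypothetical linear state transfer
Wt_c = 0.1 * rng.standard_normal((H, H))

daily_inputs = rng.standard_normal((365, 5))   # one year of daily forcings
hourly_inputs = rng.standard_normal((72, 5))   # only the last 72 hours, hourly

# 1) Process the long past at a single (daily) temporal resolution.
h, c = np.zeros(H), np.zeros(H)
for x in daily_inputs:
    h, c = lstm_step(x, h, c, *daily_params)

# 2) Transfer the states and branch out into the hourly timescale for the
#    recent input steps, avoiding one very long hourly sequence.
h2, c2 = Wt_h @ h, Wt_c @ c
for x in hourly_inputs:
    h2, c2 = lstm_step(x, h2, c2, *hourly_params)

print(h.shape, h2.shape)   # daily-branch and hourly-branch final states
```

The key computational saving is visible in the loop lengths: 365 daily steps plus 72 hourly steps, rather than the 365 × 24 hourly steps a naive single-timescale LSTM would need.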
Related papers
- MixLinear: Extreme Low Resource Multivariate Time Series Forecasting with 0.1K Parameters [6.733646592789575]
Long-term Time Series Forecasting (LTSF) involves predicting long-term values by analyzing a large amount of historical time-series data to identify patterns and trends.
Transformer-based models offer high forecasting accuracy, but they are often too compute-intensive to be deployed on devices with hardware constraints.
We propose MixLinear, an ultra-lightweight time series forecasting model specifically designed for resource-constrained devices.
arXiv Detail & Related papers (2024-10-02T23:04:57Z)
- Time Distributed Deep Learning models for Purely Exogenous Forecasting. Application to Water Table Depth Prediction using Weather Image Time Series [1.4436965372953483]
We propose two different Deep Learning models to predict the water table depth in the Grana-Maira (Piemonte, IT).
To deal with the image time series, both models are made of a first Time Distributed Convolutional Neural Network (TDC) which encodes the image available at each time step into a vectorial representation.
The two models focus on different learnable information: TDC-LSTM focuses more on lowering the bias, while TDC-UnPWaveNet focuses more on the temporal dynamics, maximising correlation and KGE.
arXiv Detail & Related papers (2024-09-20T07:25:54Z)
- MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and conditional discriminator to optimize prediction results at the coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z)
- Recurrent Neural Networks with more flexible memory: better predictions than rough volatility [0.0]
We compare the ability of vanilla and extended long short-term memory networks to predict asset price volatility.
We show that the model with the smallest validation loss systematically outperforms rough volatility predictions by about 20% when trained and tested on a dataset with multiple time series.
arXiv Detail & Related papers (2023-08-04T14:24:57Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are often recorded over short periods, which leaves a large gap between the capacity of deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Long-Term Missing Value Imputation for Time Series Data Using Deep Neural Networks [1.2019888796331233]
We present an approach that uses a deep learning model, in particular, a MultiLayer Perceptron (MLP) for estimating the missing values of a variable.
We focus on filling a long continuous gap rather than filling individual randomly missing observations.
Our approach enables the use of datasets that have a large gap in one variable, which is common in many long-term environmental monitoring observations.
arXiv Detail & Related papers (2022-02-25T00:29:30Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- LTN: Long-Term Network for Long-Term Motion Prediction [0.0]
We present a two-stage framework for long-term trajectory prediction, named Long-Term Network (LTN).
We first generate a set of proposed trajectories with a Conditional Variational Autoencoder (CVAE), then classify them with binary labels and output the trajectories with the highest scores.
The results show that our method outperforms multiple state-of-the-art approaches in long-term trajectory prediction in terms of accuracy.
arXiv Detail & Related papers (2020-10-15T17:59:09Z)
- Multi-timescale Representation Learning in LSTM Language Models [69.98840820213937]
Language models must capture statistical dependencies between words at timescales ranging from very short to very long.
We derived a theory for how the memory gating mechanism in long short-term memory language models can capture power law decay.
Experiments showed that LSTM language models trained on natural English text learn to approximate this theoretical distribution.
arXiv Detail & Related papers (2020-09-27T02:13:38Z)
- Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
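The power-law memory claim in "Multi-timescale Representation Learning in LSTM Language Models" above can be illustrated with a small numpy sketch: a single LSTM unit with a constant forget gate f has a memory trace that decays exponentially as f^t, but averaging over a spread of forget-gate values across units yields an approximately power-law trace. The uniform spread of gate values and the time horizon below are assumptions chosen purely for illustration.

```python
import numpy as np

# Each unit's memory trace decays as f^t for a fixed forget gate f.
# Averaging over many units with different f values mixes timescales.
t = np.arange(1, 200)
forget_gates = np.linspace(0.5, 0.999, 200)   # assumed spread of gate values
decay = np.mean(forget_gates[:, None] ** t[None, :], axis=0)

# On a log-log plot a power law is a straight line; fit its slope.
log_t, log_d = np.log(t), np.log(decay)
slope = np.polyfit(log_t, log_d, 1)[0]
print(round(slope, 2))   # negative slope, roughly linear in log-log space
```

No single exponential f^t is straight on a log-log plot, so a good linear log-log fit of the averaged trace is the signature of (approximate) power-law decay from mixed timescales.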
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.