Capabilities of Deep Learning Models on Learning Physical Relationships:
Case of Rainfall-Runoff Modeling with LSTM
- URL: http://arxiv.org/abs/2106.07963v1
- Date: Tue, 15 Jun 2021 08:36:16 GMT
- Title: Capabilities of Deep Learning Models on Learning Physical Relationships:
Case of Rainfall-Runoff Modeling with LSTM
- Authors: Kazuki Yokoo, Kei Ishida, Ali Ercan, Tongbi Tu, Takeyoshi Nagasato,
Masato Kiyama, and Motoki Amagasaki
- Abstract summary: This study investigates the relationships which deep learning methods can identify between the input and output data.
Daily precipitation and mean air temperature were used as model input to estimate daily flow discharge.
The results of this study indicated that a deep learning method may not properly learn the explicit physical relationships between input and target variables.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study investigates the relationships which deep learning methods can
identify between the input and output data. As a case study, rainfall-runoff
modeling in a snow-dominated watershed by means of a long short-term memory
(LSTM) network is selected. Daily precipitation and mean air temperature
were used as model input to estimate daily flow discharge. After model training
and verification, two experimental simulations were conducted with hypothetical
inputs instead of observed meteorological data to clarify the response of the
trained model to the inputs. The first numerical experiment showed that even
without input precipitation, the trained model generated flow discharge,
particularly winter low flow and high flow during the snow-melting period. The
effects of warmer and colder conditions on the flow discharge were also
replicated by the trained model without precipitation. Additionally, the model
reflected only 17-39% of the total precipitation mass during the snow
accumulation period in the total annual flow discharge, revealing a strong lack
of water mass conservation. The results of this study indicated that a deep
learning method may not properly learn the explicit physical relationships
between input and target variables, even while maintaining strong
goodness-of-fit results.
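The experimental protocol described in the abstract can be sketched as follows. This is a minimal illustration only: the toy LSTM below is randomly initialized (not the authors' trained model), the synthetic forcing data are invented, and all names (`TinyLSTM`, `langevin`-style loops, the diagnostic ratio) are assumptions for demonstration, not the paper's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """Minimal LSTM cell with a scalar output head (illustrative, untrained)."""
    def __init__(self, n_in=2, n_hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the input, forget, cell, and output gates.
        self.W = rng.standard_normal((4 * n_hidden, n_in + n_hidden)) * 0.3
        self.b = np.zeros(4 * n_hidden)
        self.w_out = rng.standard_normal(n_hidden) * 0.3
        self.n_hidden = n_hidden

    def run(self, x_seq):
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        out = []
        for x_t in x_seq:
            z = self.W @ np.concatenate([x_t, h]) + self.b
            i, f, g, o = np.split(z, 4)
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
            out.append(self.w_out @ h)
        return np.array(out)

# Synthetic one-year daily forcing: precipitation [mm/day], mean air temp [degC].
rng = np.random.default_rng(1)
days = 365
precip = np.maximum(rng.normal(2.0, 3.0, days), 0.0)
temp = 10.0 - 15.0 * np.cos(2 * np.pi * np.arange(days) / days) + rng.normal(0, 2, days)
forcing = np.stack([precip, temp], axis=1)

model = TinyLSTM()
q_simulated = model.run(forcing)

# First numerical experiment from the paper: replace observed precipitation
# with zeros and inspect whether the model still generates discharge.
forcing_dry = forcing.copy()
forcing_dry[:, 0] = 0.0
q_zero_precip = model.run(forcing_dry)

# Mass-balance-style diagnostic: fraction of input precipitation mass reflected
# in simulated annual discharge (the paper reports 17-39% for precipitation
# during the snow-accumulation period).
ratio = np.sum(np.maximum(q_simulated, 0.0)) / np.sum(precip)
print(f"discharge/precipitation mass ratio: {ratio:.3f}")
```

With a trained network in place of `TinyLSTM`, the same two steps (zeroed precipitation input, mass-ratio diagnostic) reproduce the structure of the paper's experiments.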
Related papers
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Provable Statistical Rates for Consistency Diffusion Models [87.28777947976573]
Despite the state-of-the-art performance, diffusion models are known for their slow sample generation due to the extensive number of steps involved.
This paper contributes towards the first statistical theory for consistency models, formulating their training as a distribution discrepancy minimization problem.
arXiv Detail & Related papers (2024-06-23T20:34:18Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Data Attribution for Diffusion Models: Timestep-induced Bias in Influence Estimation [53.27596811146316]
Diffusion models operate over a sequence of timesteps, rather than the instantaneous input-output relationships assumed in earlier attribution settings.
We present Diffusion-TracIn, which incorporates these temporal dynamics, and observe that samples' loss gradient norms are highly dependent on timestep.
We introduce Diffusion-ReTrac as a re-normalized adaptation that enables the retrieval of training samples more targeted to the test sample of interest.
arXiv Detail & Related papers (2024-01-17T07:58:18Z)
- Physically Explainable Deep Learning for Convective Initiation Nowcasting Using GOES-16 Satellite Observations [0.1874930567916036]
Convection initiation (CI) nowcasting remains a challenging problem for both numerical weather prediction models and existing nowcasting algorithms.
In this study, object-based probabilistic deep learning models are developed to predict CI based on multichannel infrared GOES-R satellite observations.
arXiv Detail & Related papers (2023-10-24T17:18:44Z)
- DiffESM: Conditional Emulation of Earth System Models with Diffusion Models [2.1989764549743476]
A key application of Earth System Models (ESMs) is studying extreme weather events, such as heat waves or dry spells.
We show that diffusion models can effectively emulate the trends of ESMs under previously unseen climate scenarios.
arXiv Detail & Related papers (2023-04-23T17:12:33Z)
- An evaluation of deep learning models for predicting water depth evolution in urban floods [59.31940764426359]
We compare different deep learning models for prediction of water depth at high spatial resolution.
Deep learning models are trained to reproduce the data simulated by the CADDIES cellular-automata flood model.
Our results show that the deep learning models generally yield lower errors than the other methods.
arXiv Detail & Related papers (2023-02-20T16:08:54Z)
- A Denoising Diffusion Model for Fluid Field Prediction [0.0]
We propose FluidDiff, a novel denoising diffusion generative model for predicting nonlinear fluid fields.
By performing a diffusion process, the model is able to learn a complex representation of the high-dimensional dynamic system.
Langevin sampling is used to generate predictions for the flow state under specified initial conditions.
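Langevin sampling, mentioned in the FluidDiff summary above, can be sketched generically. The closed-form score function below (`lambda x: -x`, the score of a standard Gaussian) is an illustrative stand-in for the learned score network a diffusion model would supply; step size and step count are arbitrary assumptions.

```python
import numpy as np

def langevin_sample(score, x0, step=1e-2, n_steps=2000, seed=0):
    """Unadjusted Langevin dynamics: x <- x + step*score(x) + sqrt(2*step)*noise."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Score of a standard Gaussian: grad log p(x) = -x. Starting far from the mode,
# the chain drifts toward the target distribution N(0, I).
samples = np.stack([langevin_sample(lambda x: -x, np.full(2, 5.0), seed=s)
                    for s in range(200)])
print(samples.mean(axis=0))
```

In a fluid-prediction setting, `score` would be the trained network conditioned on the specified initial conditions, and `x` the discretized flow state.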
arXiv Detail & Related papers (2023-01-27T11:30:40Z)
- Deep learning for improved global precipitation in numerical weather prediction systems [1.721029532201972]
We use the UNET architecture of a deep convolutional neural network with residual learning as a proof of concept to learn global data-driven models of precipitation.
The results are compared with the operational dynamical model used by the India Meteorological Department.
This study is a proof of concept showing that a residual-learning-based UNET can unravel the physical relationships linking the inputs to target precipitation.
arXiv Detail & Related papers (2021-06-20T05:10:42Z)
- Hybrid Physics and Deep Learning Model for Interpretable Vehicle State Prediction [75.1213178617367]
We propose a hybrid approach combining deep learning and physical motion models.
We achieve interpretability by restricting the output range of the deep neural network as part of the hybrid model.
The results show that our hybrid model can improve model interpretability with no decrease in accuracy compared to existing deep learning approaches.
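Restricting a network's output range, as the hybrid-model summary above describes, is commonly implemented by squashing the raw output into physically plausible bounds; the scaled-sigmoid mapping and the example bounds here are generic illustrative assumptions, not the paper's exact mechanism.

```python
import numpy as np

def bounded_output(raw, lo, hi):
    """Map an unbounded network output into [lo, hi] via a scaled sigmoid,
    so a learned correction to the physical model stays physically plausible."""
    return lo + (hi - lo) / (1.0 + np.exp(-raw))

# E.g., constrain a learned acceleration correction to +/- 2 m/s^2:
raw = np.array([-10.0, 0.0, 10.0])
corr = bounded_output(raw, -2.0, 2.0)
print(corr)  # values saturate near the bounds, 0.0 at raw = 0
```

Because the correction is provably bounded, the physics model dominates the prediction and the network's contribution remains easy to inspect, which is one common route to the interpretability the summary mentions.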
arXiv Detail & Related papers (2021-03-11T15:21:08Z)
- TRU-NET: A Deep Learning Approach to High Resolution Prediction of Rainfall [21.399707529966474]
We present TRU-NET, an encoder-decoder model featuring a novel 2D cross attention mechanism between contiguous convolutional-recurrent layers.
We use a conditional-continuous loss function to capture the zero-skewed, extreme-event patterns of rainfall.
Experiments show that our model consistently attains lower RMSE and MAE scores than a DL model prevalent in short-term precipitation prediction.
arXiv Detail & Related papers (2020-08-20T17:27:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.