An Analysis of Temporal Dropout in Earth Observation Time Series for Regression Tasks
- URL: http://arxiv.org/abs/2504.06915v1
- Date: Wed, 09 Apr 2025 14:23:04 GMT
- Title: An Analysis of Temporal Dropout in Earth Observation Time Series for Regression Tasks
- Authors: Miro Miranda, Francisco Mena, Andreas Dengel
- Abstract summary: We introduce Monte Carlo Temporal Dropout (MC-TD), a method that explicitly accounts for input-level uncertainty by randomly dropping time-steps during inference. We extend this approach with Monte Carlo Concrete Temporal Dropout (MC-ConcTD), a method that learns the optimal dropout distribution directly. Experiments on three EO time-series datasets demonstrate that MC-ConcTD improves predictive performance and uncertainty calibration compared to existing approaches.
- Score: 4.707950656037167
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Missing instances in time series data pose a significant challenge to deep learning models, particularly in regression tasks. In the Earth Observation field, satellite failure or cloud occlusion frequently results in missing time-steps, introducing uncertainty into the predicted output and causing a decline in predictive performance. While many studies address missing time-steps through data augmentation to improve model robustness, the uncertainty arising at the input level is commonly overlooked. To address this gap, we introduce Monte Carlo Temporal Dropout (MC-TD), a method that explicitly accounts for input-level uncertainty by randomly dropping time-steps during inference using a predefined dropout ratio, thereby simulating the effect of missing data. To bypass the need for costly searches for the optimal dropout ratio, we extend this approach with Monte Carlo Concrete Temporal Dropout (MC-ConcTD), a method that learns the optimal dropout distribution directly. Both MC-TD and MC-ConcTD are applied during inference, leveraging Monte Carlo sampling for uncertainty quantification. Experiments on three EO time-series datasets demonstrate that MC-ConcTD improves predictive performance and uncertainty calibration compared to existing approaches. Additionally, we highlight the advantages of adaptive dropout tuning over manual selection, making uncertainty quantification more robust and accessible for EO applications.
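The MC-TD idea described in the abstract — repeated stochastic forward passes with random time-steps dropped, aggregated into a predictive mean and uncertainty — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name, the zero-masking of dropped steps, and the toy `model` callable are all assumptions.

```python
import numpy as np

def mc_temporal_dropout(model, x, drop_ratio=0.3, n_samples=50, rng=None):
    """Monte Carlo Temporal Dropout (sketch).

    model      : callable mapping a (T, F) array to a scalar prediction
    x          : (T, F) time series; rows are time-steps
    drop_ratio : probability of dropping each time-step at inference
    Runs n_samples stochastic forward passes, each with a random subset
    of time-steps zeroed out to simulate missing observations, and
    returns the predictive mean and standard deviation.
    """
    rng = np.random.default_rng(rng)
    T = x.shape[0]
    preds = []
    for _ in range(n_samples):
        keep = rng.random(T) >= drop_ratio
        if not keep.any():                 # always keep at least one step
            keep[rng.integers(T)] = True
        x_masked = np.where(keep[:, None], x, 0.0)
        preds.append(model(x_masked))
    preds = np.asarray(preds)
    return preds.mean(), preds.std()
```

The standard deviation across passes serves as the input-level uncertainty estimate; MC-ConcTD would additionally learn `drop_ratio` via a Concrete (continuous) relaxation rather than fixing it by hand.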
Related papers
- Error-quantified Conformal Inference for Time Series [40.438171912992864]
Uncertainty quantification in time series prediction is challenging due to the temporal dependence and distribution shift on sequential data.
We propose Error-quantified Conformal Inference (ECI), obtained by smoothing the quantile loss function.
ECI can achieve valid miscoverage control and output tighter prediction sets than other baselines.
arXiv Detail & Related papers (2025-02-02T15:02:36Z) - STTS-EAD: Improving Spatio-Temporal Learning Based Time Series Prediction via [7.247017092359663]
We propose STTS-EAD, an end-to-end method that seamlessly integrates anomaly detection into the training process of time series forecasting. STTS-EAD leverages spatio-temporal information for forecasting and anomaly detection, with the two parts alternately executed and optimized for each other. Our experiments show that the proposed method can effectively process anomalies detected in the training stage to improve forecasting performance in the inference stage, significantly outperforming baselines.
arXiv Detail & Related papers (2025-01-14T03:26:05Z) - STAA: Spatio-Temporal Alignment Attention for Short-Term Precipitation Forecasting [9.177158814568887]
A short-term precipitation forecasting model based on spatio-temporal alignment, with SATA as the alignment module and STAU as the alignment feature extractor.
Based on satellite and ERA5 data, our model achieves a 12.61% improvement in RMSE compared with state-of-the-art methods.
arXiv Detail & Related papers (2024-09-06T10:28:52Z) - Loss Shaping Constraints for Long-Term Time Series Forecasting [79.3533114027664]
We present a Constrained Learning approach for long-term time series forecasting that respects a user-defined upper bound on the loss at each time-step.
We propose a practical Primal-Dual algorithm to tackle it, and demonstrate that it exhibits competitive average performance on time series benchmarks while shaping the errors across the predicted window.
arXiv Detail & Related papers (2024-02-14T18:20:44Z) - ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z) - Score Matching-based Pseudolikelihood Estimation of Neural Marked
Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching estimator for learning marked spatio-temporal point processes (STPPs) with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked STPPs through score matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z) - Better Batch for Deep Probabilistic Time Series Forecasting [15.31488551912888]
We propose an innovative training method that incorporates error autocorrelation to enhance probabilistic forecasting accuracy.
Our method constructs a mini-batch as a collection of $D$ consecutive time series segments for model training.
It explicitly learns a time-varying covariance matrix over each mini-batch, encoding error correlation among adjacent time steps.
arXiv Detail & Related papers (2023-05-26T15:36:59Z) - Monte Carlo EM for Deep Time Series Anomaly Detection [6.312089019297173]
Time series data are often corrupted by outliers or other kinds of anomalies.
Recent approaches to anomaly detection and forecasting assume that the proportion of anomalies in the training data is small enough to ignore.
We present a technique for augmenting existing time series models so that they explicitly account for anomalies in the training data.
arXiv Detail & Related papers (2021-12-29T07:52:36Z) - Neighborhood Spatial Aggregation MC Dropout for Efficient
Uncertainty-aware Semantic Segmentation in Point Clouds [8.98036662506975]
Uncertainty-aware semantic segmentation of point clouds involves both predictive uncertainty estimation and uncertainty-guided model optimization.
The widely used MC dropout estimates the predictive distribution by computing the standard deviation over samples from multiple forward passes.
A framework embedded with NSA-MC dropout, a variant of MC dropout, is proposed to establish distributions in just one forward pass.
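The standard MC-dropout recipe this entry builds on — keeping dropout active at test time and summarizing repeated stochastic forward passes — can be sketched as below. This is a generic illustration, not the paper's NSA-MC variant; the helper names and the toy linear predictor are assumptions.

```python
import numpy as np

def mc_dropout_predict(forward, x, n_passes=30, rng=None):
    """Standard MC dropout at inference (sketch, not the NSA-MC variant).

    forward : stochastic callable; dropout stays *active* at test time,
              so repeated calls on the same input differ.
    Returns the mean and standard deviation across n_passes, the usual
    MC-dropout predictive estimate and uncertainty.
    """
    rng = np.random.default_rng(rng)
    samples = np.stack([forward(x, rng) for _ in range(n_passes)])
    return samples.mean(axis=0), samples.std(axis=0)

def dropout_linear(w, p=0.5):
    """Toy stochastic forward pass: a linear map with dropout on inputs."""
    def forward(x, rng):
        mask = rng.random(x.shape) >= p          # drop each input w.p. p
        return (np.where(mask, x, 0.0) / (1.0 - p)) @ w
    return forward
```

The NSA-MC variant replaces the multiple forward passes with neighborhood spatial aggregation so that a distribution is obtained in a single pass, which the sketch above does not attempt.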
arXiv Detail & Related papers (2021-12-05T02:22:32Z) - Imputation-Free Learning from Incomplete Observations [73.15386629370111]
We introduce the Importance-Guided Stochastic Gradient Descent (IGSGD) method to train models on inputs containing missing values, without imputation.
We employ reinforcement learning (RL) to adjust the gradients used to train the models via back-propagation.
Our imputation-free predictions outperform the traditional two-step imputation-based predictions using state-of-the-art imputation methods.
arXiv Detail & Related papers (2021-07-05T12:44:39Z) - Learning Interpretable Deep State Space Model for Probabilistic Time
Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values based on the series' history.
We propose a deep state space model for probabilistic time series forecasting in which the non-linear emission and transition models are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Targeted stochastic gradient Markov chain Monte Carlo for hidden Markov models with rare latent states [48.705095800341944]
Markov chain Monte Carlo (MCMC) algorithms for hidden Markov models often rely on the forward-backward sampler.
This makes them computationally slow as the length of the time series increases, motivating the development of sub-sampling-based approaches.
We propose a targeted sub-sampling approach that over-samples observations corresponding to rare latent states when calculating the gradient of parameters associated with them.
arXiv Detail & Related papers (2018-10-31T17:44:20Z)