Hybrid SARIMA LSTM Model for Local Weather Forecasting: A Residual Learning Approach for Data Driven Meteorological Prediction
- URL: http://arxiv.org/abs/2601.07951v1
- Date: Mon, 12 Jan 2026 19:34:51 GMT
- Title: Hybrid SARIMA LSTM Model for Local Weather Forecasting: A Residual Learning Approach for Data Driven Meteorological Prediction
- Authors: Shreyas Rajeev, Karthik Mudenahalli Ashoka, Amit Mallappa Tiparaddi,
- Abstract summary: Accurately forecasting long-term atmospheric variables remains a defining challenge in meteorological science. We propose a Hybrid SARIMA-LSTM architecture that employs a residual-learning strategy to decompose temperature into a predictable climate component and a nonlinear weather component.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurately forecasting long-term atmospheric variables remains a defining challenge in meteorological science due to the chaotic nature of atmospheric systems. Temperature data represents a complex superposition of deterministic cyclical climate forces and stochastic, short-term fluctuations. While planetary mechanics drive predictable seasonal periodicities, rapid meteorological changes such as thermal variations, pressure anomalies, and humidity shifts introduce nonlinear volatilities that defy simple extrapolation. Historically, the Seasonal Autoregressive Integrated Moving Average (SARIMA) model has been the standard for modeling historical weather data, prized for capturing linear seasonal trends. However, SARIMA operates under strict assumptions of stationarity and fails to capture abrupt, nonlinear transitions. This leads to systematic residual errors, manifesting as the under-prediction of sudden spikes or the over-smoothing of declines. Conversely, Deep Learning paradigms, specifically Long Short-Term Memory (LSTM) networks, demonstrate exceptional efficacy in handling intricate time-series data. By utilizing memory gates, LSTMs learn complex nonlinear dependencies. Yet, LSTMs face instability in open-loop forecasting; without ground-truth feedback, minor deviations compound recursively, causing divergence. To resolve these limitations, we propose a Hybrid SARIMA-LSTM architecture. This framework employs a residual-learning strategy to decompose temperature into a predictable climate component and a nonlinear weather component. The SARIMA unit models the robust, long-term seasonal trend, while the LSTM is trained exclusively on the residuals, the nonlinear errors SARIMA fails to capture. By fusing statistical stability with neural plasticity, this hybrid approach minimizes error propagation and enhances long-horizon accuracy.
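The residual-learning decomposition described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: a seasonal-phase-mean model stands in for the SARIMA unit, and a one-lag autoregression stands in for the LSTM residual learner; all function names are hypothetical.

```python
import numpy as np

def seasonal_fit(y, period):
    """Stand-in for the SARIMA unit: mean value at each seasonal phase."""
    return np.array([y[p::period].mean() for p in range(period)])

def ar1_corrector(residuals, horizon):
    """Stand-in for the LSTM residual learner: one-lag autoregression."""
    denom = np.dot(residuals[:-1], residuals[:-1])
    phi = 0.0 if denom == 0 else np.dot(residuals[:-1], residuals[1:]) / denom
    out, last = np.empty(horizon), residuals[-1]
    for h in range(horizon):
        last = phi * last            # recursive open-loop rollout
        out[h] = last
    return out

def hybrid_forecast(y, period, horizon, corrector):
    """Seasonal forecast plus a learned correction fit on the seasonal residuals."""
    phase_means = seasonal_fit(y, period)
    residuals = y - phase_means[np.arange(len(y)) % period]
    seasonal_out = phase_means[(len(y) + np.arange(horizon)) % period]
    return seasonal_out + corrector(residuals, horizon)
```

In the paper's framework, `seasonal_fit` would be replaced by a fitted SARIMA model and `corrector` by an LSTM trained on the SARIMA residual series; the additive fusion of the two forecast components is the part this sketch preserves.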
Related papers
- From Observations to States: Latent Time Series Forecasting [65.98504021691666]
We propose Latent Time Series Forecasting (LatentTSF), a novel paradigm that shifts TSF from observation regression to latent state prediction. Specifically, LatentTSF employs an AutoEncoder to project observations at each time step into a higher-dimensional latent state space. Our proposed latent objectives implicitly maximize mutual information between predicted latent states and ground-truth states and observations.
arXiv Detail & Related papers (2026-01-30T20:39:44Z) - Interpretable Time Series Autoregression for Periodicity Quantification [18.6300875919604]
Time series autoregression (AR) is a classical tool for modeling auto-correlations and periodic structures in real-world systems. We revisit this model by introducing sparse autoregression (SAR), where $\ell$-norm constraints are used to isolate dominant periodicities. We validate our framework on large-scale mobility and climate time series.
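As a rough illustration of the sparse-AR idea in this entry, one can emulate an $\ell_0$-style constraint by fitting an ordinary AR model by least squares, hard-thresholding all but the $k$ largest coefficients, and refitting on the surviving lags; the nonzero lags then flag dominant periodicities. This is a sketch under my own assumptions, not the paper's algorithm, and `sparse_ar` is a hypothetical name.

```python
import numpy as np

def sparse_ar(y, max_lag, k):
    """Dense AR(max_lag) least-squares fit, then keep the k largest
    coefficients (a crude l0-style proxy) and refit on those lags only."""
    # Lagged design matrix: column j holds y shifted by lag j + 1.
    X = np.column_stack([y[max_lag - l : len(y) - l]
                         for l in range(1, max_lag + 1)])
    t = y[max_lag:]
    dense, *_ = np.linalg.lstsq(X, t, rcond=None)
    keep = np.argsort(np.abs(dense))[-k:]       # indices of dominant lags
    coef = np.zeros(max_lag)
    refit, *_ = np.linalg.lstsq(X[:, keep], t, rcond=None)
    coef[keep] = refit
    return coef                                 # coef[j] weights lag j + 1
```

On a purely periodic series, the surviving coefficient lands on a lag equal to the period, which is the "periodicity quantification" reading of a sparse AR fit.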
arXiv Detail & Related papers (2025-06-28T14:17:11Z) - Advanced long-term earth system forecasting by learning the small-scale nature [74.19833913539053]
We present Triton, an AI framework designed to address this fundamental challenge. Inspired by the grid refinement used to explicitly resolve small scales in numerical models, Triton employs a hierarchical architecture processing information across multiple resolutions to mitigate spectral bias. We demonstrate Triton's superior performance on challenging forecast tasks, achieving stable year-long global temperature forecasts, skillful Kuroshio eddy predictions up to 120 days, and high-fidelity turbulence simulations.
arXiv Detail & Related papers (2025-05-26T02:49:00Z) - CirT: Global Subseasonal-to-Seasonal Forecasting with Geometry-inspired Transformer [47.65152457550307]
We propose the geometric-inspired Circular Transformer (CirT) to model the cyclic characteristic of the graticule. Experiments on the Earth Reanalysis 5 (ERA5) reanalysis dataset demonstrate our model yields a significant improvement over the advanced data-driven models.
arXiv Detail & Related papers (2025-02-27T04:26:23Z) - Deep Learning for Weather Forecasting: A CNN-LSTM Hybrid Model for Predicting Historical Temperature Data [7.559331742876793]
This study introduces a hybrid model combining Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks to predict historical temperature data.
CNNs are utilized for spatial feature extraction, while LSTMs handle temporal dependencies, resulting in significantly improved prediction accuracy and stability.
arXiv Detail & Related papers (2024-10-19T03:38:53Z) - ClimODE: Climate and Weather Forecasting with Physics-informed Neural ODEs [14.095897879222676]
We present ClimODE, a continuous-time process that implements a key principle of statistical mechanics.
ClimODE models precise weather evolution with value-conserving dynamics, learning global weather transport as a neural flow.
Our approach outperforms existing data-driven methods in global, regional forecasting with an order of magnitude smaller parameterization.
arXiv Detail & Related papers (2024-04-15T06:38:21Z) - ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z) - Parsimony or Capability? Decomposition Delivers Both in Long-term Time Series Forecasting [46.63798583414426]
Long-term time series forecasting (LTSF) represents a critical frontier in time series analysis.
Our study demonstrates, through both analytical and empirical evidence, that decomposition is key to containing excessive model inflation.
Remarkably, by tailoring decomposition to the intrinsic dynamics of time series data, our proposed model outperforms existing benchmarks.
arXiv Detail & Related papers (2024-01-22T13:15:40Z) - Comparing Data-Driven and Mechanistic Models for Predicting Phenology in Deciduous Broadleaf Forests [47.285748922842444]
We train a deep neural network to predict a phenological index from meteorological time series.
We find that this approach outperforms traditional process-based models.
arXiv Detail & Related papers (2024-01-08T15:29:23Z) - Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
arXiv Detail & Related papers (2023-11-30T08:22:08Z) - Residual Corrective Diffusion Modeling for Km-scale Atmospheric Downscaling [58.456404022536425]
State of the art for physical hazard prediction from weather and climate requires expensive km-scale numerical simulations driven by coarser resolution global inputs.
Here, a generative diffusion architecture is explored for downscaling such global inputs to km-scale, as a cost-effective machine learning alternative.
The model is trained to predict 2km data from a regional weather model over Taiwan, conditioned on a 25km global reanalysis.
arXiv Detail & Related papers (2023-09-24T19:57:22Z) - Challenges of learning multi-scale dynamics with AI weather models: Implications for stability and one solution [0.0]
Current AI-based weather models provide accurate short-term forecasts but become unstable when time-integrated beyond a few weeks or a few months. The cause of the instabilities is unknown, and the methods used to improve their stability horizons are ad-hoc and lack rigorous theory. We develop long-term physically-consistent data-driven models for the climate system and demonstrate accurate short-term forecasts.
arXiv Detail & Related papers (2023-04-14T09:49:11Z) - Short-term daily precipitation forecasting with seasonally-integrated
autoencoder [0.0]
We propose a seasonally-integrated autoencoder (SSAE) consisting of two long short-term memory (LSTM) autoencoders.
Our experimental results show that not only does the SSAE outperform various time series models regardless of the climate type, but it also has low output variance compared to other deep learning models.
arXiv Detail & Related papers (2021-01-23T14:19:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.