Neural and Time-Series Approaches for Pricing Weather Derivatives: Performance and Regime Adaptation Using Satellite Data
- URL: http://arxiv.org/abs/2411.12013v2
- Date: Fri, 02 May 2025 19:07:04 GMT
- Title: Neural and Time-Series Approaches for Pricing Weather Derivatives: Performance and Regime Adaptation Using Satellite Data
- Authors: Marco Hening Tallarico, Pablo Olivares
- Abstract summary: This paper studies pricing of weather-derivative (WD) contracts on temperature and precipitation. We benchmark a harmonic-regression/ARMA model against a feed-forward neural network (NN), finding that the NN reduces out-of-sample mean-squared error (MSE). For precipitation, we employ a compound Poisson--Gamma framework: shape and scale parameters are estimated via maximum likelihood estimation (MLE) and via a convolutional neural network (CNN) trained on 30-day rainfall sequences spanning multiple seasons.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies pricing of weather-derivative (WD) contracts on temperature and precipitation. For temperature-linked strangles in Toronto and Chicago, we benchmark a harmonic-regression/ARMA model against a feed-forward neural network (NN), finding that the NN reduces out-of-sample mean-squared error (MSE) and materially shifts December fair values relative to both the time-series model and the industry-standard Historic Burn Approach (HBA). For precipitation, we employ a compound Poisson--Gamma framework: shape and scale parameters are estimated via maximum likelihood estimation (MLE) and via a convolutional neural network (CNN) trained on 30-day rainfall sequences spanning multiple seasons. The CNN adaptively learns season-specific $(\alpha,\beta)$ mappings, thereby capturing heterogeneity across regimes that static i.i.d.\ fits miss. At valuation, we assume days are i.i.d.\ $\Gamma(\hat{\alpha},\hat{\beta})$ within each regime and apply a mean-count approximation (replacing the Poisson count by its mean, $n\hat{\lambda}$) to derive closed-form strangle prices. Exploratory analysis of 1981--2023 NASA POWER data confirms pronounced seasonal heterogeneity in $(\alpha,\beta)$ between summer and winter, demonstrating that static global fits are inadequate. Back-testing on Toronto and Chicago grids shows that our regime-adaptive CNN yields competitive valuations and underscores how model choice can shift strangle prices. Payoffs are evaluated analytically when possible and by simulation elsewhere, enabling a like-for-like comparison of forecasting and valuation methods.
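The mean-count valuation step described in the abstract can be sketched as follows: if the Poisson wet-day count over an $n$-day period is replaced by its mean $n\hat{\lambda}$, the cumulative rainfall becomes a single Gamma variable with shape $n\hat{\lambda}\hat{\alpha}$, which admits closed-form strangle prices via the Gamma partial-expectation identity $\mathbb{E}[S\,\mathbf{1}\{S>K\}] = a\theta\,(1 - F(K; a{+}1, \theta))$. The sketch below is illustrative only: the strikes, $\hat{\lambda}$, $\hat{\alpha}$, $\hat{\beta}$ values are hypothetical, $\hat{\beta}$ is taken to be the Gamma scale parameter, and this is not the paper's code.

```python
import numpy as np
from scipy.stats import gamma


def strangle_price_mean_count(alpha, theta, lam, n_days, k_put, k_call):
    """Closed-form strangle price under the mean-count approximation:
    the Poisson wet-day count is replaced by its mean n*lam, so cumulative
    rainfall S ~ Gamma(shape=n*lam*alpha, scale=theta)."""
    a = n_days * lam * alpha          # effective Gamma shape of the total
    mean_s = a * theta                # E[S]
    # Put leg: E[(K1 - S)+] = K1*F(K1; a) - a*theta*F(K1; a+1)
    put = k_put * gamma.cdf(k_put, a, scale=theta) \
        - mean_s * gamma.cdf(k_put, a + 1, scale=theta)
    # Call leg: E[(S - K2)+] = a*theta*(1 - F(K2; a+1)) - K2*(1 - F(K2; a))
    call = mean_s * gamma.sf(k_call, a + 1, scale=theta) \
        - k_call * gamma.sf(k_call, a, scale=theta)
    return put + call


def strangle_price_mc(alpha, theta, lam, n_days, k_put, k_call,
                      n_paths=200_000, seed=0):
    """Monte Carlo price under the full compound Poisson-Gamma model,
    for comparison against the mean-count closed form."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(n_days * lam, size=n_paths)
    # Sum of N i.i.d. Gamma(alpha, theta) draws is Gamma(N*alpha, theta);
    # guard the N = 0 paths (no wet days means zero rainfall).
    shapes = counts * alpha
    safe = np.where(shapes > 0.0, shapes, 1.0)
    totals = np.where(shapes > 0.0, rng.gamma(safe, theta), 0.0)
    payoff = np.maximum(k_put - totals, 0.0) + np.maximum(totals - k_call, 0.0)
    return payoff.mean()
```

With illustrative winter-regime parameters (e.g. `alpha=0.5`, `theta=10.0`, `lam=0.4`, `n_days=31`), the two prices can be compared directly; the heavier tails of the full compound model typically make the Monte Carlo strangle somewhat more expensive than the mean-count closed form.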
Related papers
- Localized Weather Prediction Using Kolmogorov-Arnold Network-Based Models and Deep RNNs [0.0]
This study benchmarks deep recurrent neural networks such as $\texttt{LSTM}$, $\texttt{GRU}$, $\texttt{BiLSTM}$, $\texttt{BiGRU}$, and Kolmogorov-Arnold-based models ($\texttt{KAN}$ and $\texttt{TKAN}$) for daily forecasting of temperature, precipitation, and pressure in two tropical cities. We introduce two customized variants of $\texttt{TKAN}$ that replace its original $\texttt{SiLU}$ activation function with $\texttt{GeLU}$ and
arXiv Detail & Related papers (2025-05-27T18:01:57Z) - Forecasting Cryptocurrency Prices using Contextual ES-adRNN with Exogenous Variables [3.0108936184913295]
We introduce a new approach to multivariate forecasting of cryptocurrency prices using a hybrid contextual model combining exponential smoothing (ES) and a recurrent neural network (RNN).
The model generates both point daily forecasts and predictive intervals for one-day, one-week and four-week horizons.
We apply our model to forecast prices of 15 cryptocurrencies based on 17 input variables and compare its performance with that of comparative models, including both statistical and ML ones.
arXiv Detail & Related papers (2025-04-11T20:00:03Z) - Towards Location-Specific Precipitation Projections Using Deep Neural Networks [0.0]
This study presents a paradigm shift by leveraging Deep Neural Networks (DNNs) to surpass traditional methods like Kriging for station-specific precipitation approximation.
We propose two innovative NN architectures: one utilizing precipitation, elevation, and location, and another incorporating additional meteorological parameters like humidity, temperature, and wind speed.
arXiv Detail & Related papers (2025-03-18T10:12:17Z) - Coarse Graining with Neural Operators for Simulating Chaotic Systems [78.64101336150419]
Predicting the long-term behavior of chaotic systems is crucial for various applications such as climate modeling. An alternative approach to such a fully resolved simulation is using a coarse grid and then correcting its errors through a temporal model. We propose an alternative end-to-end learning approach using a physics-informed neural operator (PINO) that overcomes this limitation.
arXiv Detail & Related papers (2024-08-09T17:05:45Z) - Generating Fine-Grained Causality in Climate Time Series Data for Forecasting and Anomaly Detection [67.40407388422514]
We design a conceptual fine-grained causal model named TBN Granger Causality.
Second, we propose an end-to-end deep generative model called TacSas, which discovers TBN Granger Causality in a generative manner.
We test TacSas on climate benchmark ERA5 for climate forecasting and the extreme weather benchmark of NOAA for extreme weather alerts.
arXiv Detail & Related papers (2024-08-08T06:47:21Z) - Interval Forecasts for Gas Prices in the Face of Structural Breaks -- Statistical Models vs. Neural Networks [0.0]
We investigate whether modern machine learning methods such as neural networks are more resilient against such changes.
We see that, during the shock period, most models underestimate the variance while overestimating the variance in the after-shock period.
Interestingly, the widely used long short-term memory network is outperformed by its competitors.
arXiv Detail & Related papers (2024-07-23T11:34:13Z) - Encoding Seasonal Climate Predictions for Demand Forecasting with Modular Neural Network [0.8378605337114742]
We propose a novel framework that encodes seasonal climate predictions to provide robust and reliable time-series forecasting for supply chain functions.
Our experiments indicate learning such representations to model seasonal climate forecast results in an error reduction of approximately 13% to 17% across multiple real-world data sets.
arXiv Detail & Related papers (2023-09-05T13:58:59Z) - Extreme heatwave sampling and prediction with analog Markov chain and comparisons with deep learning [0.0]
We present a data-driven emulator, weather generator (SWG), suitable for estimating probabilities of heatwaves in France and Scandinavia.
We train the emulator on an intermediate complexity climate model run and show that it is capable of predicting conditional probabilities (forecasting) of heatwaves out of sample.
The probabilistic prediction achieved with SWG is compared with the one achieved with a Convolutional Neural Network (CNN).
arXiv Detail & Related papers (2023-07-18T08:25:14Z) - Hedonic Prices and Quality Adjusted Price Indices Powered by AI [4.125713429211907]
We develop empirical hedonic models that process large amounts of unstructured product data.
We produce accurate hedonic price estimates and derived indices.
We construct the AI-based hedonic Fisher price index, chained at the year-over-year frequency.
arXiv Detail & Related papers (2023-04-28T18:37:59Z) - Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are a deep learning model that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z) - ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z) - Pangu-Weather: A 3D High-Resolution Model for Fast and Accurate Global Weather Forecast [91.9372563527801]
We present Pangu-Weather, a deep learning based system for fast and accurate global weather forecast.
For the first time, an AI-based method outperforms state-of-the-art numerical weather prediction (NWP) methods in terms of accuracy.
Pangu-Weather supports a wide range of downstream forecast scenarios, including extreme weather forecast and large-member ensemble forecast in real-time.
arXiv Detail & Related papers (2022-11-03T17:19:43Z) - Intelligent Spatial Interpolation-based Frost Prediction Methodology using Artificial Neural Networks with Limited Local Data [3.3607307817827032]
The aim of this article is to eliminate the dependency on on-site historical data and sensors for frost prediction methods.
The models use climate data from existing weather stations, digital elevation models surveys, and normalized difference vegetation index data to estimate a target site's next hour minimum temperature.
arXiv Detail & Related papers (2022-04-15T21:14:07Z) - Forecasting large-scale circulation regimes using deformable convolutional neural networks and global spatiotemporal climate data [86.1450118623908]
We investigate a supervised machine learning approach based on deformable convolutional neural networks (deCNNs)
We forecast the North Atlantic-European weather regimes during extended boreal winter for 1 to 15 days into the future.
Due to its wider field of view, we also observe deCNN achieving considerably better performance than regular convolutional neural networks at lead times beyond 5-6 days.
arXiv Detail & Related papers (2022-02-10T11:37:00Z) - Datamodels: Predicting Predictions from Training Data [86.66720175866415]
We present a conceptual framework, datamodeling, for analyzing the behavior of a model class in terms of the training data.
We show that even simple linear datamodels can successfully predict model outputs.
arXiv Detail & Related papers (2022-02-01T18:15:24Z) - Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - On projection methods for functional time series forecasting [0.0]
Two nonparametric methods are presented for forecasting functional time series (FTS).
We address both one-step-ahead forecasting and dynamic updating.
The methods are applied to simulated data, daily electricity demand, and NOx emissions.
arXiv Detail & Related papers (2021-05-10T14:24:38Z) - Forecasting The JSE Top 40 Using Long Short-Term Memory Networks [1.6114012813668934]
This paper uses a long-short term memory network to perform financial time series forecasting on the return data of the JSE Top 40 index.
The paper concludes that the long short-term memory network outperforms the seasonal autoregressive integrated moving average model.
arXiv Detail & Related papers (2021-04-20T09:39:38Z) - Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
arXiv Detail & Related papers (2021-02-15T00:57:28Z) - Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z) - Deep learning for gravitational-wave data analysis: A resampling white-box approach [62.997667081978825]
We apply Convolutional Neural Networks (CNNs) to detect gravitational wave (GW) signals of compact binary coalescences, using single-interferometer data from LIGO detectors.
CNNs were quite precise to detect noise but not sensitive enough to recall GW signals, meaning that CNNs are better for noise reduction than generation of GW triggers.
arXiv Detail & Related papers (2020-09-09T03:28:57Z) - A generative adversarial network approach to (ensemble) weather prediction [91.3755431537592]
We use a conditional deep convolutional generative adversarial network to predict the geopotential height of the 500 hPa pressure level, the two-meter temperature and the total precipitation for the next 24 hours over Europe.
The proposed models are trained on 4 years of ERA5 reanalysis data from 2015-2018 with the goal to predict the associated meteorological fields in 2019.
arXiv Detail & Related papers (2020-06-13T20:53:17Z) - Stock Price Prediction Using Convolutional Neural Networks on a Multivariate Timeseries [0.0]
We build various predictive models using machine learning approaches, and then use those models to predict the Close value of NIFTY 50 for the year 2019.
For predicting the NIFTY index movement patterns, we use a number of classification methods, while for forecasting the actual Close values of NIFTY index, various regression models are built.
We exploit the power of CNN in forecasting the future NIFTY index values using three approaches which differ in number of variables used in forecasting.
arXiv Detail & Related papers (2020-01-10T03:27:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.