A Physics-Constrained Neural Differential Equation Framework for Data-Driven Snowpack Simulation
- URL: http://arxiv.org/abs/2412.06819v2
- Date: Sat, 15 Mar 2025 20:11:16 GMT
- Title: A Physics-Constrained Neural Differential Equation Framework for Data-Driven Snowpack Simulation
- Authors: Andrew Charbonneau, Katherine Deck, Tapio Schneider
- Abstract summary: This paper presents a physics-constrained neural differential equation framework for parameterization. It predicts daily snow depth with under 9% median error and Nash Sutcliffe Efficiencies over 0.94 across a wide variety of snow climates. The structure of the approach guarantees the satisfaction of physical constraints, enables these constraints during model training, and allows modeling at different temporal resolutions without additional retraining of the parameterization.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a physics-constrained neural differential equation framework for parameterization, and employs it to model the time evolution of seasonal snow depth given hydrometeorological forcings. When trained on data from multiple SNOTEL sites, the parameterization predicts daily snow depth with under 9% median error and Nash Sutcliffe Efficiencies over 0.94 across a wide variety of snow climates. The parameterization also generalizes to new sites not seen during training, which is not often true for calibrated snow models. Requiring the parameterization to predict snow water equivalent in addition to snow depth only increases error to ~12%. The structure of the approach guarantees the satisfaction of physical constraints, enables these constraints during model training, and allows modeling at different temporal resolutions without additional retraining of the parameterization. These benefits hold potential in climate modeling, and could extend to other dynamical systems with physical constraints.
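To make the structure described in the abstract concrete, the following is a minimal sketch of a physics-constrained neural differential equation for snow depth. It is not the authors' implementation: the forcing variables, network architecture, explicit-Euler integration, and the specific non-negativity constraint are assumptions chosen only to illustrate how a learned tendency model can satisfy a physical bound by construction and be integrated at a different time step without retraining.

```python
# Minimal illustrative sketch of a physics-constrained neural ODE for snow depth.
# NOT the paper's code: the forcings, network size, and constraint form below are
# assumptions used only to show the overall structure.
import torch
import torch.nn as nn


class SnowDepthODE(nn.Module):
    """Neural network parameterizing dz/dt, with z = snow depth (m)."""

    def __init__(self, n_forcings: int = 4, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_forcings + 1, hidden),  # inputs: forcings + current depth
            nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z: torch.Tensor, forcing: torch.Tensor) -> torch.Tensor:
        # Raw tendency predicted by the network (m per unit time).
        dzdt = self.net(torch.cat([z, forcing], dim=-1))
        # Physical constraint built into the model structure: the negative tendency
        # is capped by the depth available to melt in one unit of time, so snow
        # depth cannot be driven below zero. (Assumed constraint form.)
        return torch.maximum(dzdt, -z)


def integrate(model: SnowDepthODE, z0: torch.Tensor, forcings: torch.Tensor, dt: float):
    """Explicit-Euler rollout. Because the network predicts a tendency rather than
    the state itself, dt can be changed at inference time without retraining."""
    z, trajectory = z0, [z0]
    for f in forcings:                 # forcings: (n_steps, n_forcings)
        z = z + dt * model(z, f.unsqueeze(0))
        z = torch.clamp(z, min=0.0)    # guard against overshoot at large dt
        trajectory.append(z)
    return torch.cat(trajectory)


# Example: a 10-step rollout with placeholder forcing data.
model = SnowDepthODE()
z0 = torch.tensor([[0.5]])                       # initial depth, metres
forcings = torch.randn(10, 4)                    # e.g. temperature, precipitation, ...
depths = integrate(model, z0, forcings, dt=1.0)  # daily resolution
```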
Related papers
- Quantum Neural Networks for Cloud Cover Parameterizations in Climate Models [0.08271752505511926]
Machine learning models trained on short high-resolution climate simulations are promising candidates to replace conventional parameterizations.
QNNs differ from their classical counterparts, and their potentially high expressivity turns them into promising tools for accurate data-driven schemes.
We show that the overall performance of the investigated QNNs is comparable to that of classical NNs of similar size, with the same number of trainable parameters.
arXiv Detail & Related papers (2025-02-14T13:02:49Z)
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Parametric model reduction of mean-field and stochastic systems via higher-order action matching [1.1509084774278489]
We learn models of population dynamics of physical systems that feature gradient and mean-field effects.
We show that our approach accurately predicts population dynamics over a wide range of parameters and outperforms state-of-the-art diffusion-based and flow-based modeling.
arXiv Detail & Related papers (2024-10-15T19:05:28Z)
- Towards Physically Consistent Deep Learning For Climate Model Parameterizations [46.07009109585047]
Parameterizations are a major source of systematic errors and large uncertainties in climate projections.
Deep learning (DL)-based parameterizations, trained on data from computationally expensive short, high-resolution simulations, have shown great promise for improving climate models.
We propose an efficient supervised learning framework for DL-based parameterizations that leads to physically consistent models.
arXiv Detail & Related papers (2024-06-06T10:02:49Z)
- Generalizing Weather Forecast to Fine-grained Temporal Scales via Physics-AI Hybrid Modeling [55.13352174687475]
This paper proposes a physics-AI hybrid model (i.e., WeatherGFT) which generalizes weather forecasts to finer-grained temporal scales beyond the training dataset.
Specifically, we employ a carefully designed PDE kernel to simulate physical evolution on a small time scale.
We also introduce a lead time-aware training framework to promote the generalization of the model at different lead times.
arXiv Detail & Related papers (2024-05-22T16:21:02Z)
- Differentially Private Gradient Flow based on the Sliced Wasserstein Distance [59.1056830438845]
We introduce a novel differentially private generative modeling approach based on a gradient flow in the space of probability measures.
Experiments show that our proposed model can generate higher-fidelity data at a low privacy budget.
arXiv Detail & Related papers (2023-12-13T15:47:30Z)
- A Three-regime Model of Network Pruning [47.92525418773768]
We use temperature-like and load-like parameters to model the impact of neural network (NN) training hyperparameters on pruning performance.
A key empirical result we identify is a sharp transition phenomenon: depending on the value of a load-like parameter in the pruned model, increasing the value of a temperature-like parameter in the pre-pruned model may either enhance or impair subsequent pruning performance.
Our model reveals that the dichotomous effect of high temperature is associated with transitions between distinct types of global structures in the post-pruned model.
arXiv Detail & Related papers (2023-05-28T08:09:25Z)
- Physics-constrained deep learning postprocessing of temperature and humidity [0.0]
We propose to achieve physical consistency in deep learning-based postprocessing models.
We find that constraining a neural network to enforce thermodynamic state equations yields physically-consistent predictions.
arXiv Detail & Related papers (2022-12-07T09:31:25Z)
- Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z)
- Combining data assimilation and machine learning to estimate parameters of a convective-scale model [0.0]
Errors in the representation of clouds in convection-permitting numerical weather prediction models can arise from several sources.
In this work, we look at the problem of parameter estimation through an artificial intelligence lens by training two types of artificial neural networks.
arXiv Detail & Related papers (2021-09-07T09:17:29Z)