Tipping Point Forecasting in Non-Stationary Dynamics on Function Spaces
- URL: http://arxiv.org/abs/2308.08794v1
- Date: Thu, 17 Aug 2023 05:42:27 GMT
- Title: Tipping Point Forecasting in Non-Stationary Dynamics on Function Spaces
- Authors: Miguel Liu-Schiaffini, Clare E. Singer, Nikola Kovachki, Tapio
Schneider, Kamyar Azizzadenesheli, Anima Anandkumar
- Abstract summary: Tipping points are abrupt, drastic, and often irreversible changes in the evolution of non-stationary dynamical systems.
We learn the evolution of such non-stationary systems using a novel recurrent neural operator (RNO), which learns mappings between function spaces.
We propose a conformal prediction framework to forecast tipping points by monitoring deviations from physics constraints.
- Score: 78.08947381962658
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tipping points are abrupt, drastic, and often irreversible changes in the
evolution of non-stationary and chaotic dynamical systems. For instance,
increased greenhouse gas concentrations are predicted to lead to drastic
decreases in low cloud cover, referred to as a climatological tipping point. In
this paper, we learn the evolution of such non-stationary dynamical systems
using a novel recurrent neural operator (RNO), which learns mappings between
function spaces. After training RNO on only the pre-tipping dynamics, we employ
it to detect future tipping points using an uncertainty-based approach. In
particular, we propose a conformal prediction framework to forecast tipping
points by monitoring deviations from physics constraints (such as conserved
quantities and partial differential equations), enabling forecasting of these
abrupt changes along with a rigorous measure of uncertainty. We illustrate our
proposed methodology on non-stationary ordinary and partial differential
equations, such as the Lorenz-63 and Kuramoto-Sivashinsky equations. We also
apply our methods to forecast a climate tipping point in stratocumulus cloud
cover. In our experiments, we demonstrate that even partial or approximate
physics constraints can be used to accurately forecast future tipping points.
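As a rough illustration of the constraint-monitoring idea described in the abstract (not the authors' implementation), the sketch below calibrates a threshold on physics-residual scores from pre-tipping data and raises an alarm once later residuals exceed the conformal quantile. The `residual` function, the toy "conserved" quantity, and all parameters are hypothetical stand-ins.

```python
import numpy as np

def conformal_threshold(calib_scores, alpha=0.05):
    """(1 - alpha) empirical quantile of calibration nonconformity
    scores, with the standard finite-sample correction."""
    n = len(calib_scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))  # rank of corrected quantile
    return np.sort(calib_scores)[min(k, n) - 1]

def residual(state):
    """Hypothetical physics-constraint residual: deviation of a
    'conserved' quantity (here, the state mean) from zero."""
    return abs(np.mean(state))

rng = np.random.default_rng(0)

# Calibration: residuals of noisy pre-tipping states (mean ~ 0).
calib = np.array([residual(rng.normal(0.0, 0.1, size=64)) for _ in range(200)])
tau = conformal_threshold(calib, alpha=0.05)

# Monitoring: a slow drift in the mean mimics leaving the pre-tipping regime.
drift = [residual(rng.normal(0.02 * t, 0.1, size=64)) for t in range(100)]
alarm = next((t for t, r in enumerate(drift) if r > tau), None)
print("threshold:", tau, "first alarm at step:", alarm)
```

With exchangeable pre-tipping scores, exceedances above `tau` occur at rate at most `alpha`, so a sustained run of alarms signals a genuine departure from the learned regime rather than noise.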
Related papers
- Restoring Kibble-Zurek Scaling and Defect Freezing in Non-Hermitian Systems under Biorthogonal Framework [1.9460072625303615]
We develop a theoretical framework based on time-dependent biorthogonal quantum formalism.
We study the nonadiabatic dynamics of a linearly driven non-Hermitian system.
arXiv Detail & Related papers (2024-10-31T05:01:00Z)
- ClimODE: Climate and Weather Forecasting with Physics-informed Neural ODEs [14.095897879222676]
We present ClimODE, a continuous-time process that implements a key principle of statistical mechanics.
ClimODE models precise weather evolution with value-conserving dynamics, learning global weather transport as a neural flow.
Our approach outperforms existing data-driven methods in global and regional forecasting with an order-of-magnitude smaller parameterization.
arXiv Detail & Related papers (2024-04-15T06:38:21Z)
- Self-Supervised Class-Agnostic Motion Prediction with Spatial and Temporal Consistency Regularizations [53.797896854533384]
Class-agnostic motion prediction methods directly predict the motion of the entire point cloud.
While most existing methods rely on fully-supervised learning, the manual labeling of point cloud data is laborious and time-consuming.
We introduce three simple spatial and temporal regularization losses, which facilitate the self-supervised training process effectively.
arXiv Detail & Related papers (2024-03-20T02:58:45Z)
- Uncertainty Quantification for Forward and Inverse Problems of PDEs via Latent Global Evolution [110.99891169486366]
We propose a method that integrates efficient and precise uncertainty quantification into a deep learning-based surrogate model.
Our method endows deep learning-based surrogate models with robust and efficient uncertainty quantification capabilities for both forward and inverse problems.
Our method excels at propagating uncertainty over extended auto-regressive rollouts, making it suitable for scenarios involving long-term predictions.
arXiv Detail & Related papers (2024-02-13T11:22:59Z)
- DiffCast: A Unified Framework via Residual Diffusion for Precipitation Nowcasting [20.657502066923023]
Precipitation nowcasting is the task of predicting radar echo sequences from current observations, serving both meteorological science and smart-city applications.
Previous studies address the problem from the perspective of either deterministic or probabilistic modeling.
We propose to decompose and model chaotic evolving precipitation systems as global deterministic motion plus local variations, via a residual mechanism.
arXiv Detail & Related papers (2023-12-11T11:26:32Z)
- Self-Supervised Pre-Training for Precipitation Post-Processor [1.5553847214012175]
We propose a deep learning-based precipitation post-processor for numerical weather prediction (NWP) models.
Our experiments on precipitation correction for regional NWP datasets show that the proposed method outperforms other approaches.
arXiv Detail & Related papers (2023-10-31T05:13:10Z)
- Uncovering the Missing Pattern: Unified Framework Towards Trajectory Imputation and Prediction [60.60223171143206]
Trajectory prediction is a crucial undertaking in understanding entity movement or human behavior from observed sequences.
Current methods often assume that the observed sequences are complete while ignoring the potential for missing values.
This paper presents a unified framework, the Graph-based Conditional Variational Recurrent Neural Network (GC-VRNN), which can perform trajectory imputation and prediction simultaneously.
arXiv Detail & Related papers (2023-03-28T14:27:27Z)
- On Convergence of Training Loss Without Reaching Stationary Points [62.41370821014218]
We show that neural network weight variables do not converge to stationary points where the gradient of the loss function vanishes.
We propose a new perspective based on the ergodic theory of dynamical systems.
arXiv Detail & Related papers (2021-10-12T18:12:23Z)
- Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent [3.0079490585515343]
Stochastic gradient descent (SGD) is relatively well understood in the vanishing learning rate regime.
We propose to study the basic properties of SGD and its variants in the non-vanishing learning rate regime.
arXiv Detail & Related papers (2020-12-07T12:31:43Z)
- Learnable Uncertainty under Laplace Approximations [65.24701908364383]
We develop a formalism to explicitly "train" the uncertainty in a way decoupled from the prediction itself.
We show that such units can be trained via an uncertainty-aware objective, improving standard Laplace approximations' performance.
arXiv Detail & Related papers (2020-10-06T13:43:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.