Fully Differentiable Lagrangian Convolutional Neural Network for
Continuity-Consistent Physics-Informed Precipitation Nowcasting
- URL: http://arxiv.org/abs/2402.10747v1
- Date: Fri, 16 Feb 2024 15:13:30 GMT
- Title: Fully Differentiable Lagrangian Convolutional Neural Network for
Continuity-Consistent Physics-Informed Precipitation Nowcasting
- Authors: Peter Pavlík, Martin Výboh, Anna Bou Ezzeddine, Viera Rozinajová
- Abstract summary: We present a convolutional neural network model for precipitation nowcasting that combines data-driven learning with physics-informed domain knowledge.
We propose LUPIN, a Lagrangian Double U-Net for Physics-Informed Nowcasting, that draws from existing extrapolation-based nowcasting methods.
Based on our evaluation, LUPIN matches and exceeds the performance of the chosen benchmark, opening the door for other Lagrangian machine learning models.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a convolutional neural network model for precipitation
nowcasting that combines data-driven learning with physics-informed domain
knowledge. We propose LUPIN, a Lagrangian Double U-Net for Physics-Informed
Nowcasting, that draws from existing extrapolation-based nowcasting methods and
implements the Lagrangian coordinate system transformation of the data in a
fully differentiable and GPU-accelerated manner to allow for real-time
end-to-end training and inference. Based on our evaluation, LUPIN matches and
exceeds the performance of the chosen benchmark, opening the door for other
Lagrangian machine learning models.
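The core operation behind the Lagrangian coordinate transformation is a backward semi-Lagrangian warp of the radar field along a motion field. LUPIN implements this differentiably on the GPU; a minimal NumPy sketch of the warp itself (the function name, grid setup, and clipped boundary handling are illustrative, not taken from the paper) looks like:

```python
import numpy as np

def warp_backward(field, u, v):
    """Backward semi-Lagrangian warp: sample `field` at the departure
    points (x - u, y - v) with bilinear interpolation, so rain cells
    appear advected along the motion field (u, v) in pixels/step."""
    h, w = field.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Departure coordinates, clipped to stay inside the grid.
    xs = np.clip(xx - u, 0, w - 1.0)
    ys = np.clip(yy - v, 0, h - 1.0)
    x0 = np.floor(xs).astype(int); y0 = np.floor(ys).astype(int)
    x1 = np.minimum(x0 + 1, w - 1); y1 = np.minimum(y0 + 1, h - 1)
    fx = xs - x0; fy = ys - y0
    # Bilinear blend of the four neighbouring grid values.
    top = field[y0, x0] * (1 - fx) + field[y0, x1] * fx
    bot = field[y1, x0] * (1 - fx) + field[y1, x1] * fx
    return top * (1 - fy) + bot * fy

# A uniform motion of 1 pixel to the right shifts a unit pulse by 1 pixel.
f = np.zeros((5, 5)); f[2, 2] = 1.0
shifted = warp_backward(f, u=np.ones((5, 5)), v=np.zeros((5, 5)))
```

Because the warp is built from interpolation and arithmetic, it is differentiable with respect to both the field and the motion vectors, which is what enables end-to-end training.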
Related papers
- Data-Driven Dynamic Friction Models based on Recurrent Neural Networks
Recurrent Neural Networks (RNNs) based on the Gated Recurrent Unit (GRU) architecture learn the complex dynamics of rate-and-state friction laws from synthetic data.
The GRU-based RNNs effectively learn to predict changes in the friction coefficient resulting from velocity jumps.
arXiv Detail & Related papers (2024-02-21T22:11:01Z)
- Mechanistic Neural Networks for Scientific Machine Learning
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Generalized Lagrangian Neural Networks
We introduce Generalized Lagrangian Neural Networks, an extension of Lagrangian Neural Networks (LNNs).
By leveraging the foundational importance of the Lagrangian within Lagrange's equations, we formulate the model based on the generalized Lagrange's equation.
This modification not only enhances prediction accuracy but also guarantees Lagrangian representation in non-conservative systems.
arXiv Detail & Related papers (2024-01-08T08:26:40Z)
- Human Trajectory Prediction via Neural Social Physics
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z)
- Lagrangian Density Space-Time Deep Neural Network Topology
We propose a "Lagrangian Density Space-Time Deep Neural Network" (LDDNN) topology.
It supports unsupervised training and learns to predict the dynamics of phenomena governed by underlying physical laws.
This article will discuss statistical physics interpretation of neural networks in the Lagrangian and Hamiltonian domains.
arXiv Detail & Related papers (2022-06-30T03:29:35Z)
- Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate Model Predictive Trajectory Tracking
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z)
- Neural Galerkin Schemes with Active Learning for High-Dimensional Evolution Equations
This work proposes Neural Galerkin schemes based on deep learning that generate training data with active learning for numerically solving high-dimensional partial differential equations.
Neural Galerkin schemes build on the Dirac-Frenkel variational principle to train networks by minimizing the residual sequentially over time.
Our finding is that the active form of gathering training data of the proposed Neural Galerkin schemes is key for numerically realizing the expressive power of networks in high dimensions.
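The sequential residual minimization described above can be illustrated on a toy problem; the ansatz, PDE, and least-squares time-stepper below are stand-ins chosen for clarity, not the paper's actual setup:

```python
import numpy as np

# Neural Galerkin sketch: evolve the parameters theta of an ansatz
# u(x; theta) by solving the least-squares problem
#   J(theta) @ theta_dot = f(u)   at collocation points x,
# which is the Dirac-Frenkel residual minimization in time.
# Here the "network" is u(x; a) = a * sin(x) and the PDE is the heat
# equation u_t = u_xx, whose exact solution has a(t) = a(0) * exp(-t).

x = np.linspace(0.0, np.pi, 64)          # collocation points
a, dt, steps = 1.0, 1e-3, 1000

for _ in range(steps):
    J = np.sin(x)[:, None]               # du/da at the collocation points
    f = -a * np.sin(x)                   # PDE right-hand side u_xx
    a_dot, *_ = np.linalg.lstsq(J, f, rcond=None)
    a += dt * float(a_dot[0])            # explicit Euler step in time

# After integrating to t = 1, a should be close to exp(-1), about 0.368.
```

Active learning, in the schemes the paper proposes, would additionally adapt where the collocation points are placed over time; the fixed grid above is the simplest possible choice.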
arXiv Detail & Related papers (2022-03-02T19:09:52Z)
- Fast-Convergent Federated Learning
Federated learning is a promising solution for distributing machine learning tasks through modern networks of mobile devices.
We propose a fast-convergent federated learning algorithm, called FOLB, which performs intelligent sampling of devices in each round of model training.
arXiv Detail & Related papers (2020-07-26T14:37:51Z)
- Liquid Time-constant Networks
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
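A single liquid time-constant cell couples a fixed time constant with an input-dependent gate; a fused semi-implicit update of such a cell can be sketched as follows (the weight shapes, sigmoid gate, and all numeric values are illustrative assumptions, not trained parameters):

```python
import numpy as np

def ltc_step(x, inp, dt, tau, W, b, A):
    """One fused semi-implicit update of a liquid time-constant cell:
    dx/dt = -[1/tau + f(x, I)] * x + f(x, I) * A, with f a sigmoid gate.
    The weights W, b and the bias vector A are illustrative stand-ins."""
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, inp]) + b)))
    # The semi-implicit (fused) Euler step keeps the state bounded.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

rng = np.random.default_rng(0)
n, m = 4, 2                              # hidden and input dimensions
W = rng.standard_normal((n, n + m)); b = np.zeros(n); A = np.ones(n)
x = np.zeros(n)
for t in range(100):
    x = ltc_step(x, np.sin([0.1 * t, 0.2 * t]), dt=0.1, tau=1.0,
                 W=W, b=b, A=A)
```

With a nonnegative initial state and A = 1, every update stays inside [0, 1], which illustrates the stable, bounded behavior the abstract claims.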
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Learning to Simulate Complex Physics with Graph Networks
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message-passing.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
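A single message-passing step of this kind can be sketched with random matrices standing in for the learned MLPs; all names, dimensions, and the radius-based connectivity below are illustrative assumptions:

```python
import numpy as np

def message_passing_step(pos, vel, radius, W_msg, W_upd):
    """One GNS-style interaction step: particles within `radius`
    exchange messages computed from relative positions and states,
    and each node sums its incoming messages to update its state."""
    n = pos.shape[0]
    agg = np.zeros((n, W_msg.shape[0]))
    for i in range(n):
        for j in range(n):
            rel = pos[j] - pos[i]
            if i != j and np.linalg.norm(rel) < radius:
                # Edge message from the relative offset and sender state.
                agg[i] += np.tanh(W_msg @ np.concatenate([rel, vel[j]]))
    # Node update from the old state and the aggregated messages.
    return np.array([np.tanh(W_upd @ np.concatenate([vel[i], agg[i]]))
                     for i in range(n)])

rng = np.random.default_rng(1)
W_msg = rng.standard_normal((2, 4))      # stand-in for the edge MLP
W_upd = rng.standard_normal((2, 4))      # stand-in for the node MLP
pos = np.array([[0.0, 0.0], [0.5, 0.0], [10.0, 10.0]])
vel = np.zeros((3, 2))
new_vel = message_passing_step(pos, vel, radius=1.0, W_msg=W_msg, W_upd=W_upd)
```

Because the connectivity is rebuilt from particle positions at every step, the same learned weights apply to any number of particles, which is the property behind the generalization result the abstract reports.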
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
- A deep learning framework for solution and discovery in solid mechanics
We present the application of a class of deep learning models known as Physics-Informed Neural Networks (PINNs) to learning and discovery in solid mechanics.
We explain how to incorporate the momentum balance and elasticity relations into PINN, and explore in detail the application to linear elasticity.
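The loss composition that makes a network "physics-informed" can be sketched for a 1D elastic bar, a deliberately simplified stand-in for the paper's elasticity problems; the constants, the quadratic stand-in for a network prediction, and the finite-difference derivative are all illustrative:

```python
import numpy as np

# PINN loss sketch for a 1D elastic bar: the model output u(x) must both
# fit measured displacements and satisfy momentum balance E*u'' + f = 0.
E, f = 1.0, 1.0                          # illustrative modulus, body force
x = np.linspace(0.0, 1.0, 101)
u = (f / E) * x * (1 - x) / 2            # stand-in for a network's output
u_meas = u.copy()                        # pretend measurements match it

dx = x[1] - x[0]
u_xx = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2  # interior 2nd derivative
loss_physics = np.mean((E * u_xx + f) ** 2)     # momentum-balance residual
loss_data = np.mean((u - u_meas) ** 2)          # data-fitting term
loss = loss_data + loss_physics
```

In an actual PINN the derivative would come from automatic differentiation of the network rather than finite differences, and both loss terms would be minimized jointly over the network weights.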
arXiv Detail & Related papers (2020-02-14T08:24:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.