Towards Optimally Weighted Physics-Informed Neural Networks in Ocean
Modelling
- URL: http://arxiv.org/abs/2106.08747v1
- Date: Wed, 16 Jun 2021 12:48:13 GMT
- Title: Towards Optimally Weighted Physics-Informed Neural Networks in Ocean
Modelling
- Authors: Taco de Wolff (CIRIC), Hugo Carrillo (CIRIC), Luis Martí (CIRIC),
Nayat Sanchez-Pi (CIRIC)
- Abstract summary: State-of-the-art techniques are required to develop models that can capture the complexity of ocean currents and temperature flows.
This work explores the benefits of using physics-informed neural networks (PINNs) for solving partial differential equations related to ocean modeling.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The carbon pump of the world's ocean plays a vital role in the biosphere and
climate of the earth, urging improved understanding of the functions and
influences of the ocean for climate change analyses. State-of-the-art
techniques are required to develop models that can capture the complexity of
ocean currents and temperature flows. This work explores the benefits of using
physics-informed neural networks (PINNs) for solving partial differential
equations related to ocean modeling, such as the Burgers, wave, and
advection-diffusion equations. We explore the trade-offs of using data vs.
physical models in PINNs for solving partial differential equations. PINNs
account for the deviation from physical laws in order to improve learning and
generalization. We observed how the relative weight between the data and the
physical model in the loss function influences training results, finding that
small data sets benefit more from the added physics information.
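In a PINN, the training loss combines a data-misfit term with the residual of the governing PDE, and the central question above is how the relative weight between those two terms affects training. The sketch below is a minimal PyTorch illustration, not the authors' implementation: the scalar weight `lambda_data`, the network size, the advection-diffusion coefficients, and the random sampling are all placeholder assumptions.

```python
import torch
import torch.nn as nn

# Assumed problem constants and weighting; illustrative values only.
c, nu = 1.0, 0.01          # advection speed and diffusivity (assumptions)
lambda_data = 0.5          # relative weight between data and physics terms

# Small fully connected network mapping (x, t) -> u(x, t).
net = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)

def pde_residual(xt):
    """Residual u_t + c*u_x - nu*u_xx of the advection-diffusion equation."""
    xt = xt.requires_grad_(True)
    u = net(xt)
    du = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = du[:, 0:1], du[:, 1:2]
    u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x),
                               create_graph=True)[0][:, 0:1]
    return u_t + c * u_x - nu * u_xx

def pinn_loss(xt_data, u_data, xt_colloc):
    """Convex combination of the data misfit and the PDE residual."""
    loss_data = ((net(xt_data) - u_data) ** 2).mean()
    loss_pde = (pde_residual(xt_colloc) ** 2).mean()
    return lambda_data * loss_data + (1.0 - lambda_data) * loss_pde

# One illustrative optimisation step with placeholder random data.
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
xt_data, u_data = torch.rand(32, 2), torch.rand(32, 1)   # observed (x, t) -> u
xt_colloc = torch.rand(256, 2)                           # collocation points
optimizer.zero_grad()
loss = pinn_loss(xt_data, u_data, xt_colloc)
loss.backward()
optimizer.step()
```

Sweeping `lambda_data` between 0 and 1 reproduces the trade-off studied in the paper: shifting weight toward the physics term lets the PDE residual supply most of the training signal, the regime the abstract identifies as most helpful when observational data are scarce.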
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on the PDE datasets evaluated.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Physics-Informed Deep Learning of Rate-and-State Fault Friction [0.0]
We develop a multi-network PINN for both the forward problem and for direct inversion of nonlinear fault friction parameters.
We present the computational PINN framework for strike-slip faults in 1D and 2D subject to rate-and-state friction.
We find that the network for the parameter inversion at the fault performs much better than the network for material displacements to which it is coupled.
arXiv Detail & Related papers (2023-12-14T23:53:25Z) - Surrogate Neural Networks to Estimate Parametric Sensitivity of Ocean
Models [2.956865819041394]
Ocean processes affect phenomena such as hurricanes and droughts.
For an idealized ocean model, we generated perturbed parameter ensemble data and trained surrogate neural network models.
The neural surrogates accurately predicted the one-step forward dynamics, for which we then computed the parametric sensitivity.
arXiv Detail & Related papers (2023-11-10T16:37:43Z) - Physics-Informed Machine Learning of Argon Gas-Driven Melt Pool Dynamics [0.0]
Melt pool dynamics in metal additive manufacturing (AM) is critical to process stability, microstructure formation, and final properties of the printed materials.
This paper provides a physics-informed machine learning (PIML) method by integrating neural networks with the governing physical laws to predict the melt pool dynamics.
The data efficiency of the PINN model is attributed to the soft penalty obtained by incorporating the governing partial differential equations (PDEs), initial conditions, and boundary conditions into the model.
arXiv Detail & Related papers (2023-07-23T12:12:44Z) - Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z) - Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z) - Physics-informed deep-learning applications to experimental fluid
mechanics [2.992602379681373]
High-resolution reconstruction of flow-field data from low-resolution and noisy measurements is of interest in experimental fluid mechanics.
Deep-learning approaches have been shown to be suitable for such super-resolution tasks.
In this study, we apply physics-informed neural networks (PINNs) for super-resolution of flow-field data in time and space.
arXiv Detail & Related papers (2022-03-29T09:58:30Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial information and recurrent neural networks to model the temporal evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Physics-Integrated Variational Autoencoders for Robust and Interpretable
Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z) - Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z) - Turbulence Enrichment using Physics-informed Generative Adversarial
Networks [0.0]
We develop methods for generative enrichment of turbulence.
We incorporate a physics-informed learning approach through a modification to the loss function.
We show that physics-informed learning also significantly improves the model's ability to generate data that satisfies the governing physical equations.
arXiv Detail & Related papers (2020-03-04T06:14:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.