Can physical information aid the generalization ability of Neural
Networks for hydraulic modeling?
- URL: http://arxiv.org/abs/2403.08589v1
- Date: Wed, 13 Mar 2024 14:51:16 GMT
- Title: Can physical information aid the generalization ability of Neural
Networks for hydraulic modeling?
- Authors: Gianmarco Guglielmo, Andrea Montessori, Jean-Michel Tucny, Michele La
Rocca, Pietro Prestininzi
- Abstract summary: The application of Neural Networks to river hydraulics is still in its infancy, and the field suffers from data scarcity.
We propose to mitigate this problem by introducing physical information into the training phase.
We show that incorporating such soft physical information can improve predictive capabilities.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The application of Neural Networks to river hydraulics is still in
its infancy, and the field suffers from data scarcity, a challenge for machine
learning techniques. Consequently, many purely data-driven Neural Networks have
proved to lack predictive capabilities. In this work, we propose to mitigate
this problem by introducing physical information into the training phase. The
idea is
borrowed from Physics-Informed Neural Networks which have been recently
proposed in other contexts. Physics-Informed Neural Networks embed physical
information in the form of the residual of the Partial Differential Equations
(PDEs) governing the phenomenon and, as such, are conceived as neural solvers,
i.e. an alternative to traditional numerical solvers. Such an approach is
seldom suitable for environmental hydraulics, where epistemic uncertainties are
large and computing residuals of PDEs presents difficulties similar to those
faced by classical numerical methods. Instead, we envisage the employment of
Neural
Networks as neural operators, featuring physical constraints formulated without
resorting to PDEs. The proposed novel methodology shares similarities with data
augmentation and regularization. We show that incorporating such soft physical
information can improve predictive capabilities.
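The "soft physical information" described above can be illustrated with a minimal sketch: instead of embedding a PDE residual, the training loss adds a penalty for violating a qualitative hydraulic fact, for example that discharge should not decrease as water depth increases. The function name, the monotonicity constraint, and the toy numbers below are illustrative assumptions, not the paper's actual formulation.

```python
def soft_physics_loss(depth, q_pred, q_obs, weight=1.0):
    """Composite loss: data misfit plus a soft, PDE-free physical penalty.

    Illustrative sketch only: the penalty encodes a qualitative hydraulic
    constraint (discharge non-decreasing in water depth) rather than a PDE
    residual, acting much like a regularization term.
    """
    # Data misfit: mean squared error against observed discharge.
    data_loss = sum((p - o) ** 2 for p, o in zip(q_pred, q_obs)) / len(q_obs)

    # Sort predictions by increasing depth, then penalize any decrease
    # in predicted discharge (a monotonicity violation).
    order = sorted(range(len(depth)), key=lambda i: depth[i])
    q_sorted = [q_pred[i] for i in order]
    violations = [max(q_sorted[i] - q_sorted[i + 1], 0.0)
                  for i in range(len(q_sorted) - 1)]
    physics_penalty = sum(v ** 2 for v in violations) / max(len(violations), 1)

    return data_loss + weight * physics_penalty

# A monotone prediction incurs no physics penalty...
depth = [0.5, 1.0, 1.5, 2.0]
q_obs = [1.0, 2.0, 3.0, 4.0]
loss_mono = soft_physics_loss(depth, [1.0, 2.0, 3.0, 4.0], q_obs)
# ...while a non-monotone prediction is penalized on top of its data misfit.
loss_bad = soft_physics_loss(depth, [1.0, 3.0, 2.0, 4.0], q_obs)
```

In a training loop, this composite loss would replace the purely data-driven objective, steering the network toward physically plausible predictions even where observations are sparse.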
Related papers
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
- Multi-fidelity physics constrained neural networks for dynamical systems [16.6396704642848]
We propose the Multi-Scale Physics-Constrained Neural Network (MSPCNN)
MSPCNN offers a novel methodology for incorporating data with different levels of fidelity into a unified latent space.
Unlike conventional methods, MSPCNN employs multi-fidelity data to train the predictive model.
arXiv Detail & Related papers (2024-02-03T05:05:26Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Bayesian Physics-Informed Neural Networks for real-world nonlinear dynamical systems [0.0]
We integrate data, physics, and uncertainties by combining neural networks, physics-informed modeling, and Bayesian inference.
Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both.
We anticipate that the underlying concepts and trends generalize to more complex disease conditions.
arXiv Detail & Related papers (2022-05-12T19:04:31Z)
- Neural Implicit Representations for Physical Parameter Inference from a Single Video [49.766574469284485]
We propose to combine neural implicit representations for appearance modeling with neural ordinary differential equations (ODEs) for modelling physical phenomena.
Our proposed model combines several unique advantages: (i) Contrary to existing approaches that require large training datasets, we are able to identify physical parameters from only a single video.
The use of neural implicit representations enables the processing of high-resolution videos and the synthesis of photo-realistic images.
arXiv Detail & Related papers (2022-04-29T11:55:35Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Physics-informed ConvNet: Learning Physical Field from a Shallow Neural Network [0.180476943513092]
Modelling and forecasting multi-physical systems remain a challenge due to unavoidable data scarcity and noise.
A new framework, the physics-informed convolutional network (PICN), is proposed from a CNN perspective.
PICN may become an alternative neural network solver in physics-informed machine learning.
arXiv Detail & Related papers (2022-01-26T14:35:58Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
- Understanding and mitigating gradient pathologies in physics-informed neural networks [2.1485350418225244]
This work focuses on the effectiveness of physics-informed neural networks in predicting outcomes of physical systems and discovering hidden physics from noisy data.
We present a learning rate annealing algorithm that utilizes gradient statistics during model training to balance the interplay between different terms in composite loss functions.
We also propose a novel neural network architecture that is more resilient to such gradient pathologies.
arXiv Detail & Related papers (2020-01-13T21:23:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.