Physics-Guided, Physics-Informed, and Physics-Encoded Neural Networks in
Scientific Computing
- URL: http://arxiv.org/abs/2211.07377v1
- Date: Mon, 14 Nov 2022 15:44:07 GMT
- Title: Physics-Guided, Physics-Informed, and Physics-Encoded Neural Networks in
Scientific Computing
- Authors: Salah A Faroughi, Nikhil Pawar, Celio Fernandes, Subasish Das, Nima K.
Kalantari, Seyed Kourosh Mahjour
- Abstract summary: Recent breakthroughs in computing power have made it feasible to use machine learning and deep learning to advance scientific computing.
Due to their intrinsic architecture, conventional neural networks cannot be successfully trained and scoped when data are sparse.
Neural networks nonetheless offer a strong foundation to digest physics-driven or knowledge-based constraints.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent breakthroughs in computing power have made it feasible to use machine
learning and deep learning to advance scientific computing in many fields, such
as fluid mechanics, solid mechanics, materials science, etc. Neural networks,
in particular, play a central role in this hybridization. Due to their
intrinsic architecture, conventional neural networks cannot be successfully
trained and scoped when data are sparse, a scenario common in many
scientific fields. Nonetheless, neural networks offer a strong foundation to
digest physics-driven or knowledge-based constraints during training.
Generally speaking, there are three distinct neural network frameworks to
enforce underlying physics: (i) physics-guided neural networks (PgNN), (ii)
physics-informed neural networks (PiNN) and (iii) physics-encoded neural
networks (PeNN). These approaches offer unique advantages to accelerate the
modeling of complex multiscale multi-physics phenomena. They also come with
unique drawbacks and suffer from unresolved limitations (e.g., stability,
convergence, and generalization) that call for further research. This study
aims to present an in-depth review of the three neural network frameworks
(i.e., PgNN, PiNN, and PeNN) used in scientific computing research. The
state-of-the-art architectures and their applications are reviewed; limitations
are discussed; and future research opportunities in terms of improving
algorithms, considering causalities, expanding applications, and coupling
scientific and deep learning solvers are presented. This critical review
provides a solid starting point for researchers and engineers to comprehend how
to integrate different layers of physics into neural networks.
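As a concrete illustration of how a PiNN enforces underlying physics during training, the minimal sketch below fits a network to the toy ODE du/dx = -u with u(0) = 1 by penalizing the equation residual at random collocation points. This is a hedged sketch only: the PyTorch backend, the network size, and the toy equation are illustrative assumptions, not taken from the paper.

```python
# Minimal PiNN sketch (assumptions: PyTorch; toy ODE du/dx = -u,
# u(0) = 1, whose exact solution is exp(-x)).
import torch

torch.manual_seed(0)

# Small fully connected network mapping x -> u(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    # Collocation points on [0, 1] where the ODE residual is enforced.
    x = torch.rand(128, 1, requires_grad=True)
    u = net(x)
    du_dx = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    residual = du_dx + u                     # physics residual: du/dx + u = 0
    loss_phys = (residual ** 2).mean()

    # Boundary condition u(0) = 1 enforced as a soft penalty.
    x0 = torch.zeros(1, 1)
    loss_bc = (net(x0) - 1.0).pow(2).mean()

    loss = loss_phys + loss_bc
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, net(x) approximates exp(-x) on [0, 1].
```

A PgNN, by contrast, would train on data alone with physics used only to guide architecture or inputs, while a PeNN hard-codes the constraint into the network structure itself rather than into the loss.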
Related papers
- From Graphs to Qubits: A Critical Review of Quantum Graph Neural Networks [56.51893966016221]
Quantum Graph Neural Networks (QGNNs) represent a novel fusion of quantum computing and Graph Neural Networks (GNNs).
This paper critically reviews the state-of-the-art in QGNNs, exploring various architectures.
We discuss their applications across diverse fields such as high-energy physics, molecular chemistry, finance and earth sciences, highlighting the potential for quantum advantage.
arXiv Detail & Related papers (2024-08-12T22:53:14Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- A Review of Neuroscience-Inspired Machine Learning [58.72729525961739]
Bio-plausible credit assignment is compatible with practically any learning condition and is energy-efficient.
In this paper, we survey several vital algorithms that model bio-plausible rules of credit assignment in artificial neural networks.
We conclude by discussing the future challenges that will need to be addressed in order to make such algorithms more useful in practical applications.
arXiv Detail & Related papers (2024-02-16T18:05:09Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Neuromorphic Artificial Intelligence Systems [58.1806704582023]
Modern AI systems, based on von Neumann architecture and classical neural networks, have a number of fundamental limitations in comparison with the brain.
This article discusses such limitations and the ways they can be mitigated.
It presents an overview of currently available neuromorphic AI projects in which these limitations are overcome.
arXiv Detail & Related papers (2022-05-25T20:16:05Z)
- Physics-informed ConvNet: Learning Physical Field from a Shallow Neural Network [0.180476943513092]
Modelling and forecasting multi-physical systems remain a challenge due to unavoidable data scarcity and noise.
A new framework, named the physics-informed convolutional network (PICN), is proposed from a CNN perspective.
PICN may become an alternative neural network solver in physics-informed machine learning.
arXiv Detail & Related papers (2022-01-26T14:35:58Z)
- Deep physical neural networks enabled by a backpropagation algorithm for arbitrary physical systems [3.7785805908699803]
We propose a radical alternative for implementing deep neural network models: Physical Neural Networks.
We introduce a hybrid physical-digital algorithm called Physics-Aware Training to efficiently train sequences of controllable physical systems to act as deep neural networks.
arXiv Detail & Related papers (2021-04-27T18:00:02Z)
- Explainable artificial intelligence for mechanics: physics-informing neural networks for constitutive models [0.0]
In mechanics, the new and active field of physics-informed neural networks attempts to mitigate the black-box character of conventional deep learning by designing deep neural networks on the basis of mechanical knowledge.
We propose a first step towards a physics-informing approach, which explains neural networks trained on mechanical data a posteriori.
Therein, principal component analysis decorrelates the distributed representations in the cell states of RNNs and allows comparison with known fundamental functions.
arXiv Detail & Related papers (2021-04-20T18:38:52Z)
- A deep learning theory for neural networks grounded in physics [2.132096006921048]
We argue that building large, fast and efficient neural networks on neuromorphic architectures requires rethinking the algorithms to implement and train them.
Our framework applies to a very broad class of models, namely systems whose state or dynamics are described by variational equations.
arXiv Detail & Related papers (2021-03-18T02:12:48Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
- Understanding and mitigating gradient pathologies in physics-informed neural networks [2.1485350418225244]
This work focuses on the effectiveness of physics-informed neural networks in predicting outcomes of physical systems and discovering hidden physics from noisy data.
We present a learning rate annealing algorithm that utilizes gradient statistics during model training to balance the interplay between different terms in composite loss functions.
We also propose a novel neural network architecture that is more resilient to such gradient pathologies.
arXiv Detail & Related papers (2020-01-13T21:23:49Z)
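To make the loss-balancing idea above concrete, here is a hedged sketch of weighting a boundary loss by gradient-magnitude statistics, in the spirit of a learning rate annealing scheme for composite PiNN losses. The statistic (mean absolute gradient), the moving-average rate `alpha`, and all function names are illustrative assumptions and may differ from the authors' exact rule.

```python
# Hedged sketch of gradient-statistics loss balancing (assumptions:
# PyTorch; the mean-absolute-gradient statistic and alpha = 0.1 are
# illustrative choices, not necessarily the paper's exact update rule).
import torch

def mean_abs_grad(loss, params):
    # Mean absolute gradient of `loss` w.r.t. the model parameters;
    # retain_graph keeps the graph alive for the final backward pass.
    grads = torch.autograd.grad(loss, params, retain_graph=True,
                                allow_unused=True)
    flat = torch.cat([g.abs().flatten() for g in grads if g is not None])
    return flat.mean()

def balance(loss_residual, loss_boundary, params, lam, alpha=0.1):
    # Rescale the boundary term so its gradient magnitude matches the
    # PDE-residual term's, smoothing the weight with a moving average.
    ratio = (mean_abs_grad(loss_residual, params)
             / (mean_abs_grad(loss_boundary, params) + 1e-8)).item()
    lam = (1.0 - alpha) * lam + alpha * ratio
    return loss_residual + lam * loss_boundary, lam
```

In a PiNN training loop such as the one sketched earlier, `balance` would replace the fixed sum `loss_phys + loss_bc`, with `lam` carried across iterations so the boundary term is neither drowned out nor dominant.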