Scalable algorithms for physics-informed neural and graph networks
- URL: http://arxiv.org/abs/2205.08332v1
- Date: Mon, 16 May 2022 15:46:11 GMT
- Title: Scalable algorithms for physics-informed neural and graph networks
- Authors: Khemraj Shukla, Mengjia Xu, Nathaniel Trask and George Em Karniadakis
- Abstract summary: Physics-informed machine learning (PIML) has emerged as a promising new approach for simulating complex physical and biological systems.
In PIML, we can train neural networks with additional information obtained by employing the physical laws and evaluating them at random points in the space-time domain.
We review some of the prevailing trends in embedding physics into machine learning, using physics-informed neural networks (PINNs) based primarily on feed-forward neural networks and automatic differentiation.
- Score: 0.6882042556551611
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-informed machine learning (PIML) has emerged as a promising new
approach for simulating complex physical and biological systems that are
governed by complex multiscale processes for which some data are also
available. In some instances, the objective is to discover part of the hidden
physics from the available data, and PIML has been shown to be particularly
effective for such problems for which conventional methods may fail. Unlike
commercial machine learning where training of deep neural networks requires big
data, in PIML big data are not available. Instead, we can train such networks
from additional information obtained by employing the physical laws and
evaluating them at random points in the space-time domain. Such
physics-informed machine learning integrates multimodality and multifidelity
data with mathematical models, and implements them using neural networks or
graph networks. Here, we review some of the prevailing trends in embedding
physics into machine learning, using physics-informed neural networks (PINNs)
based primarily on feed-forward neural networks and automatic differentiation.
For more complex systems or systems of systems and unstructured data, graph
neural networks (GNNs) present some distinct advantages, and here we review how
physics-informed learning can be accomplished with GNNs based on graph exterior
calculus to construct differential operators; we refer to these architectures
as physics-informed graph networks (PIGNs). We present representative examples
for both forward and inverse problems and discuss what advances are needed to
scale up PINNs, PIGNs and more broadly GNNs for large-scale engineering
problems.
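
To make the PINN recipe in the abstract concrete, below is a minimal sketch using PyTorch automatic differentiation (our illustrative choice; the paper surveys such implementations rather than prescribing this code). The 1D problem u''(x) = -pi^2 sin(pi x) with zero boundary values is an assumed toy example, not one of the paper's benchmarks: the network is trained so that the PDE residual, formed by automatic differentiation, vanishes at random collocation points, exactly the "additional information from physical laws" described above.

```python
import math
import torch

torch.manual_seed(0)

# Small feed-forward network u_theta(x) approximating the PDE solution.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    """Residual of u''(x) + pi^2 sin(pi x) = 0, via automatic differentiation."""
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u + math.pi**2 * torch.sin(math.pi * x)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
xb = torch.tensor([[0.0], [1.0]])            # boundary points, where u = 0

for _ in range(5000):
    xc = torch.rand(128, 1)                  # random collocation points in (0, 1)
    loss = pde_residual(xc).pow(2).mean() + net(xb).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# The exact solution is u(x) = sin(pi x); report the max error of the surrogate.
xt = torch.linspace(0, 1, 11).unsqueeze(1)
print((net(xt) - torch.sin(math.pi * xt)).abs().max().item())
```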
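On the graph side, the following NumPy sketch illustrates the graph-exterior-calculus construction the abstract alludes to (an illustrative assumption, not the paper's PIGN code): the discrete exterior derivative acting on node functions is the signed node-edge incidence matrix, and composing it with its transpose yields a graph Laplacian, the kind of discrete differential operator a PIGN can embed.

```python
import numpy as np

# Path graph on 5 nodes: oriented edges (0,1), (1,2), (2,3), (3,4).
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
n_nodes, n_edges = 5, len(edges)

d0 = np.zeros((n_edges, n_nodes))        # discrete exterior derivative on 0-forms
for k, (i, j) in enumerate(edges):
    d0[k, i], d0[k, j] = -1.0, 1.0       # oriented edge i -> j

W = np.eye(n_edges)                      # unit edge weights (a stand-in for a Hodge star)
L = d0.T @ W @ d0                        # graph Laplacian: a discrete -div(grad u)

u = np.array([0.0, 1.0, 4.0, 9.0, 16.0]) # u = x^2 sampled on nodes x = 0..4
print(L @ u)                             # interior entries approximate -u'' = -2
```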
Related papers
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - Physics-Guided, Physics-Informed, and Physics-Encoded Neural Networks in
Scientific Computing [0.0]
Recent breakthroughs in computing power have made it feasible to use machine learning and deep learning to advance scientific computing.
Due to their intrinsic architecture, conventional neural networks cannot be successfully trained and generalized when data are sparse.
Neural networks nevertheless offer a strong foundation for incorporating physics-driven or knowledge-based constraints.
arXiv Detail & Related papers (2022-11-14T15:44:07Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - Physics-informed ConvNet: Learning Physical Field from a Shallow Neural
Network [0.180476943513092]
Modelling and forecasting multiphysics systems remain a challenge due to unavoidable data scarcity and noise.
A new framework named the physics-informed convolutional network (PICN) is proposed from a CNN perspective.
PICN may become an alternative neural network solver in physics-informed machine learning.
arXiv Detail & Related papers (2022-01-26T14:35:58Z) - Characterizing possible failure modes in physics-informed neural
networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z) - A neural anisotropic view of underspecification in deep learning [60.119023683371736]
We show that the way neural networks handle the underspecification of problems is highly dependent on the data representation.
Our results highlight that understanding the architectural inductive bias in deep learning is fundamental to address the fairness, robustness, and generalization of these systems.
arXiv Detail & Related papers (2021-04-29T14:31:09Z) - A deep learning theory for neural networks grounded in physics [2.132096006921048]
We argue that building large, fast and efficient neural networks on neuromorphic architectures requires rethinking the algorithms to implement and train them.
Our framework applies to a very broad class of models, namely systems whose state or dynamics are described by variational equations.
arXiv Detail & Related papers (2021-03-18T02:12:48Z) - Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z) - Malicious Network Traffic Detection via Deep Learning: An Information
Theoretic View [0.0]
We study how homeomorphisms affect the learned representation of a malware traffic dataset.
Our results suggest that although the details of learned representations and the specific coordinate system defined over the manifold of all parameters differ slightly, the functional approximations are the same.
arXiv Detail & Related papers (2020-09-16T15:37:44Z) - Transfer learning based multi-fidelity physics informed deep neural
network [0.0]
In many practical problems, the governing differential equation is either not known or known only in an approximate sense.
This paper presents a novel multi-fidelity physics-informed deep neural network (MF-PIDNN).
MF-PIDNN blends physics-informed and data-driven deep learning techniques by using the concept of transfer learning (a minimal sketch of this idea follows after this list).
arXiv Detail & Related papers (2020-05-19T13:57:48Z) - Deep Learning for Ultra-Reliable and Low-Latency Communications in 6G
Networks [84.2155885234293]
We first summarize how to apply data-driven supervised deep learning and deep reinforcement learning in URLLC.
To address the open problems identified there, we develop a multi-level architecture that enables device intelligence, edge intelligence, and cloud intelligence for URLLC.
arXiv Detail & Related papers (2020-02-22T14:38:11Z)
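
As noted in the MF-PIDNN entry above, here is a hedged sketch of the transfer-learning idea it summarizes: pretrain a network with a physics-informed loss built from an approximate (low-fidelity) governing equation, then fine-tune only the last layer on a few high-fidelity observations. The specific equations, architecture, and data below are illustrative assumptions, not the paper's actual setup.

```python
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def residual_lofi(t):
    """Residual of the approximate (low-fidelity) physics u'(t) = -u(t)."""
    t = t.requires_grad_(True)
    u = net(t)
    du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    return du + u

# Stage 1: physics-informed pretraining on the approximate equation,
# with the initial condition u(0) = 1.
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
t0 = torch.zeros(1, 1)
for _ in range(3000):
    tc = 2.0 * torch.rand(128, 1)            # collocation points in [0, 2]
    loss = residual_lofi(tc).pow(2).mean() + (net(t0) - 1.0).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Stage 2: transfer learning -- freeze all but the last layer and fit a few
# high-fidelity observations of the "true" system (here u' = -1.2 u, sampled
# from its exact solution exp(-1.2 t) as a stand-in for scarce real data).
for p in net[:-1].parameters():
    p.requires_grad_(False)
t_hf = torch.tensor([[0.0], [0.5], [1.0], [1.5], [2.0]])
u_hf = torch.exp(-1.2 * t_hf)
opt2 = torch.optim.Adam(net[-1].parameters(), lr=1e-3)
for _ in range(2000):
    loss = (net(t_hf) - u_hf).pow(2).mean()
    opt2.zero_grad()
    loss.backward()
    opt2.step()

print((net(t_hf) - u_hf).abs().max().item())  # fit on the sparse high-fidelity data
```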
This list is automatically generated from the titles and abstracts of the papers on this site.