Physics-Informed Neural Nets-based Control
- URL: http://arxiv.org/abs/2104.02556v1
- Date: Tue, 6 Apr 2021 14:55:23 GMT
- Title: Physics-Informed Neural Nets-based Control
- Authors: Eric Aislan Antonelo, Eduardo Camponogara, Laio Oriel Seman, Eduardo
Rehbein de Souza, Jean P. Jordanou, Jomi F. Hubner
- Abstract summary: This work presents a new framework called Physics-Informed Neural Nets-based Control (PINC)
PINC is amenable to control problems and able to simulate for longer-range time horizons that are not fixed beforehand.
We showcase our method in the control of two nonlinear dynamic systems.
- Score: 5.252190504926357
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) impose known physical laws on the
learning of deep neural networks, ensuring they respect the physics of the
process while reducing the demand for labeled data. For systems represented by
Ordinary Differential Equations (ODEs), the conventional PINN has a continuous
time input variable and outputs the solution of the corresponding ODE. In their
original form, PINNs neither admit control inputs nor can they simulate over
long-range intervals without serious degradation of their predictions. In this
context, this work presents a new framework called Physics-Informed Neural
Nets-based Control (PINC), which proposes a novel PINN-based architecture that
is amenable to control problems and able to simulate for longer-range time
horizons that are not fixed beforehand. First, the network is augmented with
new inputs to account for the initial state of the system and the control
action. Then, the response over the complete time horizon is split such that
each smaller interval constitutes a solution of the ODE conditioned on the
fixed values of initial state and control action. The complete response is
formed by setting the initial state of the next interval to the terminal state
of the previous one. The new methodology enables the optimal control of dynamic
systems, making it feasible to integrate a priori knowledge from experts and data
collected from plants in control applications. We showcase our method in the
control of two nonlinear dynamic systems: the Van der Pol oscillator and the
four-tank system.
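The interval-chaining scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `pinc_net` is a hypothetical stand-in for a trained PINC network (here replaced by forward-Euler integration of a Van der Pol oscillator so the sketch runs), and the function and variable names are assumptions.

```python
import numpy as np

def pinc_net(t, y0, u):
    """Stand-in for a trained PINC network: maps (t, y0, u) to the predicted
    state y(t), where t is time within one interval, y0 is the interval's
    initial state, and u is the control action held constant over the interval.
    Here we use forward-Euler on a toy Van der Pol oscillator for runnability."""
    y = np.array(y0, dtype=float)
    steps = 50
    dt = t / steps
    for _ in range(steps):
        x1, x2 = y
        y = y + dt * np.array([x2, (1.0 - x1**2) * x2 - x1 + u])
    return y

def pinc_rollout(y0, controls, T):
    """Chain fixed-length intervals of duration T: the terminal state of one
    interval becomes the initial state of the next, so the simulation horizon
    is not fixed beforehand."""
    states = [np.array(y0, dtype=float)]
    for u in controls:                  # one control action per interval
        states.append(pinc_net(T, states[-1], u))
    return states

# Roll out three intervals from the initial state [1, 0].
traj = pinc_rollout([1.0, 0.0], controls=[0.0, 0.5, -0.5], T=0.1)
print(len(traj))  # initial state plus one state per interval
```

The key design point is that the network never extrapolates beyond one short interval; long-range simulation emerges only from the chaining.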
Related papers
- Physics-Informed Echo State Networks for Modeling Controllable Dynamical Systems [0.0]
Physics-Informed ESNs (PI-ESNs) were initially proposed to model chaotic dynamical systems without external inputs.
PI-ESNs can regularize an ESN model with external inputs previously trained on just a few datapoints, reducing its overfitting and improving its generalization error.
arXiv Detail & Related papers (2024-09-27T21:06:24Z) - GradINN: Gradient Informed Neural Network [2.287415292857564]
We propose a methodology inspired by Physics Informed Neural Networks (PINNs)
GradINNs leverage prior beliefs about a system's gradient to constrain the predicted function's gradient across all input dimensions.
We demonstrate the advantages of GradINNs, particularly in low-data regimes, on diverse problems spanning non time-dependent systems.
arXiv Detail & Related papers (2024-09-03T14:03:29Z) - Domain-decoupled Physics-informed Neural Networks with Closed-form Gradients for Fast Model Learning of Dynamical Systems [2.8730926763860687]
Physics-informed neural networks (PINNs) are trained using physical equations and can incorporate unmodeled effects by learning from data.
We introduce the domain-decoupled physics-informed neural network (DD-PINN) to address current limitations of PINC in handling large and complex nonlinear dynamical systems.
arXiv Detail & Related papers (2024-08-27T10:54:51Z) - Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Fluid Batching: Exit-Aware Preemptive Serving of Early-Exit Neural
Networks on Edge NPUs [74.83613252825754]
"Smart ecosystems" are being formed in which sensing happens concurrently rather than in standalone devices.
This is shifting the on-device inference paradigm towards deploying neural processing units (NPUs) at the edge.
We propose a novel early-exit scheduling that allows preemption at run time to account for the dynamicity introduced by the arrival and exiting processes.
arXiv Detail & Related papers (2022-09-27T15:04:01Z) - Physics-informed Neural Networks-based Model Predictive Control for
Multi-link Manipulators [0.0]
We discuss nonlinear model predictive control (NMPC) for multi-body dynamics via physics-informed machine learning methods.
We present the idea of enhancing PINNs by adding control actions and initial conditions as additional network inputs.
We present our results using our PINN-based MPC to solve a tracking problem for a complex mechanical system.
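The receding-horizon use of a PINN described in this entry can be sketched as follows. All names and the dynamics are illustrative assumptions: `pinn_predict` stands in for a trained PINN with the initial state and control action as extra inputs, and an exhaustive search over candidate actions stands in for the gradient-based NMPC solver.

```python
import numpy as np

def pinn_predict(y0, u, T):
    """Stand-in for a PINN prediction model taking the initial state y0 and
    control action u as additional inputs (here: Euler integration of a toy
    stable linear system dy/dt = -0.5*y + u)."""
    y = np.array(y0, dtype=float)
    steps = 20
    dt = T / steps
    for _ in range(steps):
        y = y + dt * (-0.5 * y + u)
    return y

def mpc_step(y0, y_ref, candidates, T):
    """One receding-horizon step: choose the control action whose predicted
    terminal state is closest to the reference."""
    costs = [np.sum((pinn_predict(y0, u, T) - y_ref) ** 2) for u in candidates]
    return candidates[int(np.argmin(costs))]

y = np.array([0.0])
u = mpc_step(y, y_ref=np.array([1.0]), candidates=[-1.0, 0.0, 1.0], T=0.5)
print(u)  # the candidate that drives the state toward the reference
```

In a real controller, only this first action is applied to the plant; the state is then re-measured and the optimization repeated at the next sampling instant.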
arXiv Detail & Related papers (2021-09-22T15:31:24Z) - Characterizing possible failure modes in physics-informed neural
networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z) - Which Neural Network to Choose for Post-Fault Localization, Dynamic
State Estimation and Optimal Measurement Placement in Power Systems? [4.416484585765027]
We consider a power transmission system monitored with Phasor Measurement Units (PMUs) placed at significant, but not all, nodes of the system.
We first design a comprehensive sequence of Neural Networks (NNs) locating the faulty line.
Second, we build a sequence of advanced Power-System-Dynamics-Informed and Neural-ODE based Machine Learning schemes trained, given pre-fault state, to predict the post-fault state.
Third, continuing with the first (fault localization) setting, we design an NN-based algorithm that discovers optimal PMU placement.
arXiv Detail & Related papers (2021-04-07T13:35:55Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.