AutoPINN: When AutoML Meets Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2212.04058v1
- Date: Thu, 8 Dec 2022 03:44:08 GMT
- Title: AutoPINN: When AutoML Meets Physics-Informed Neural Networks
- Authors: Xinle Wu, Dalin Zhang, Miao Zhang, Chenjuan Guo, Shuai Zhao, Yi Zhang,
Huai Wang, Bin Yang
- Abstract summary: PINNs enable the estimation of critical parameters, which are unobservable via physical tools, through observable variables.
Existing PINNs are often manually designed, which is time-consuming and may lead to suboptimal performance.
We propose a framework that enables the automated design of PINNs by combining AutoML and PINNs.
- Score: 30.798918516407376
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-Informed Neural Networks (PINNs) have recently been proposed to solve
scientific and engineering problems, where physical laws are introduced into
neural networks as prior knowledge. With the embedded physical laws, PINNs
enable the estimation of critical parameters, which are unobservable via
physical tools, through observable variables. For example, Power Electronic
Converters (PECs) are essential building blocks for the green energy
transition. PINNs have been applied to estimate the capacitance, which is
unobservable during PEC operations, using current and voltage, which can be
observed easily during operations. The estimated capacitance facilitates
self-diagnostics of PECs. Existing PINNs are often manually designed, which is
time-consuming and may lead to suboptimal performance due to a large number of
design choices for neural network architectures and hyperparameters. In
addition, PINNs are often deployed on different physical devices, e.g., PECs,
with limited and varying resources. A different PINN model must therefore be
designed for each resource budget, making manual design even more challenging.
To contend with these challenges, we propose
Automated Physics-Informed Neural Networks (AutoPINN), a framework that enables
the automated design of PINNs by combining AutoML and PINNs. Specifically, we
first tailor a search space that allows finding high-accuracy PINNs for PEC
internal parameter estimation. We then propose a resource-aware search strategy
to explore the search space to find the best PINN model under different
resource constraints. We experimentally demonstrate that AutoPINN is able to
find more accurate PINN models than human-designed, state-of-the-art PINN
models using fewer resources.
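The abstract's PEC example, recovering an unobservable capacitance from observable voltage and current through an embedded physical law, can be illustrated with a deliberately stripped-down sketch. The neural network is omitted here: the unknown capacitance C is fit directly by minimizing the residual of the capacitor law i = C dv/dt on synthetic data. All component values, the initial guess, and the learning rate are hypothetical and chosen only for illustration.

```python
import numpy as np

# Hypothetical RC-discharge data: the capacitance C_true is "unobservable",
# while voltage v(t) and current i(t) are observable, as in the PEC example.
R, C_true, V0 = 100.0, 1e-3, 5.0          # ohms, farads, volts (made up)
t = np.linspace(0.0, 0.5, 200)
v = V0 * np.exp(-t / (R * C_true))        # observable voltage
i = -v / R                                # observable discharge current

# Physics residual from the capacitor law i = C * dv/dt.
dv_dt = np.gradient(v, t)

def physics_loss(C):
    return np.mean((i - C * dv_dt) ** 2)

# Gradient descent on the single unknown parameter C.
C, lr = 5e-4, 1e-3                        # initial guess and step size
for _ in range(100):
    grad = 2.0 * np.mean((C * dv_dt - i) * dv_dt)
    C -= lr * grad

print(f"estimated C = {C:.3e}")           # should recover C_true ~ 1e-3
```

A full PINN would replace the closed-form v(t) with a neural network trained jointly on a data-fit loss and this physics-residual loss; the principle of recovering the hidden parameter through the embedded law is the same.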
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the time and space complexities from cubic and quadratic in the sequence length, respectively, to linear, yielding the Scalable MNN (S-MNN).
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Correcting model misspecification in physics-informed neural networks (PINNs) [2.07180164747172]
We present a general approach to correct the misspecified physical models in PINNs for discovering governing equations.
We employ other deep neural networks (DNNs) to model the discrepancy between the imperfect models and the observational data.
We envision that the proposed approach will extend the applications of PINNs for discovering governing equations in problems where the physico-chemical or biological processes are not well understood.
arXiv Detail & Related papers (2023-10-16T19:25:52Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Separable PINN: Mitigating the Curse of Dimensionality in Physics-Informed Neural Networks [6.439575695132489]
Physics-informed neural networks (PINNs) have emerged as new data-driven PDE solvers for both forward and inverse problems.
We demonstrate that the computations in automatic differentiation (AD) can be significantly reduced by leveraging forward-mode AD when training PINNs.
We propose a network architecture, called separable PINN (SPINN), which can facilitate forward-mode AD for more efficient computation.
arXiv Detail & Related papers (2022-11-16T08:46:52Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Physics-Informed Neural Networks (PINNs) for Parameterized PDEs: A Metalearning Approach [13.590496719224987]
Physics-informed neural networks (PINNs) are a means of discretizing partial differential equations (PDEs).
We present a survey of model-agnostic metalearning, and then discuss our model-aware metalearning applied to PINNs.
arXiv Detail & Related papers (2021-10-26T02:29:10Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Constructing Deep Neural Networks with a Priori Knowledge of Wireless Tasks [37.060397377445504]
Two kinds of permutation-invariant properties that exist widely in wireless tasks can be harnessed to reduce the number of model parameters.
We find a special architecture of DNNs whose input-output relationships satisfy these properties, called permutation invariant DNN (PINN).
We take predictive resource allocation and interference coordination as examples to show how PINNs can be employed to learn the optimal policy with unsupervised and supervised learning.
arXiv Detail & Related papers (2020-01-29T08:54:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.