Investigation of Compressor Cascade Flow Using Physics-Informed Neural Networks with Adaptive Learning Strategy
- URL: http://arxiv.org/abs/2308.04501v2
- Date: Mon, 18 Sep 2023 15:17:23 GMT
- Title: Investigation of Compressor Cascade Flow Using Physics-Informed Neural Networks with Adaptive Learning Strategy
- Authors: Zhihui Li, Francesco Montomoli, Sanjiv Sharma
- Abstract summary: In this study, we utilize the emerging Physics-Informed Neural Networks (PINNs) approach for the first time to predict the flow field of a compressor cascade.
PINNs successfully reconstruct the flow field of the compressor cascade solely based on partial velocity vectors and near-wall pressure information.
This research provides evidence that PINNs can offer turbomachinery designers an additional and promising option alongside the current dominant CFD methods.
- Score: 3.7683769965680067
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In this study, we utilize the emerging Physics-Informed Neural Networks
(PINNs) approach for the first time to predict the flow field of a compressor
cascade. Unlike conventional training methods, a new adaptive learning
strategy, which mitigates gradient imbalance by incorporating adaptive
weights in conjunction with a dynamically adjusted learning rate, is used during
the training process to improve the convergence of PINNs. The performance of
PINNs is assessed here by solving both the forward and inverse problems. In the
forward problem, by encapsulating the physical relations among relevant
variables, PINNs demonstrate their effectiveness in accurately forecasting the
compressor's flow field. PINNs also show obvious advantages over the
traditional CFD approaches, particularly in scenarios lacking complete boundary
conditions, as is often the case in inverse engineering problems. PINNs
successfully reconstruct the flow field of the compressor cascade solely based
on partial velocity vectors and near-wall pressure information. Furthermore,
PINNs show robust performance under various levels of aleatory uncertainty
stemming from the labeled data. This research provides evidence that
PINNs can offer turbomachinery designers an additional and promising option
alongside the current dominant CFD methods.
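The adaptive learning strategy described above combines per-term loss weights with a dynamically adjusted learning rate. A minimal NumPy sketch of the idea follows; the gradient-norm balancing rule and exponential decay schedule are illustrative assumptions, not the authors' exact scheme:

```python
import numpy as np

def update_loss_weights(grad_norms, weights, alpha=0.9):
    """Gradient-norm balancing for a composite PINN loss (hypothetical sketch).

    Each term's weight is nudged toward max_norm / its_norm via an exponential
    moving average, so terms whose gradients are small get upweighted.
    """
    grad_norms = np.asarray(grad_norms, dtype=float)
    target = grad_norms.max() / np.maximum(grad_norms, 1e-12)
    return alpha * np.asarray(weights, dtype=float) + (1.0 - alpha) * target

def decayed_lr(lr0, step, decay_rate=0.9, decay_steps=1000):
    """Exponential learning-rate decay, one common 'dynamic adjustment'."""
    return lr0 * decay_rate ** (step / decay_steps)

# Toy example: the PDE-residual gradient dominates the boundary/data terms,
# so the two weaker terms receive larger weights after one update.
norms = [10.0, 1.0, 0.5]   # residual, boundary, data gradient norms
w = update_loss_weights(norms, np.ones(3))
```

In a real PINN training loop, the gradient norms would be measured per loss term via automatic differentiation before each weight update.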
Related papers
- ProPINN: Demystifying Propagation Failures in Physics-Informed Neural Networks [71.02216400133858]
Physics-informed neural networks (PINNs) have earned high expectations in solving partial differential equations (PDEs).
Previous research observed the propagation failure phenomenon of PINNs.
This paper provides the first formal and in-depth study of propagation failure and its root cause.
arXiv Detail & Related papers (2025-02-02T13:56:38Z)
- Deep Neural Networks Tend To Extrapolate Predictably [51.303814412294514]
Neural network predictions tend to be unpredictable and overconfident when faced with out-of-distribution (OOD) inputs.
We observe that neural network predictions often tend towards a constant value as input data becomes increasingly OOD.
We show how one can leverage our insights in practice to enable risk-sensitive decision-making in the presence of OOD inputs.
arXiv Detail & Related papers (2023-10-02T03:25:32Z)
- Residual-based attention and connection to information bottleneck theory in PINNs [0.393259574660092]
Physics-informed neural networks (PINNs) have seen a surge of interest in recent years.
We propose an efficient, gradient-less weighting scheme for PINNs that accelerates the convergence of dynamic or static systems.
arXiv Detail & Related papers (2023-07-01T16:29:55Z)
- RANS-PINN based Simulation Surrogates for Predicting Turbulent Flows [3.1861308132183384]
We introduce RANS-PINN, a modified PINN framework, to predict flow fields in high Reynolds number turbulent flow regimes.
To account for the additional complexity introduced by turbulence, RANS-PINN employs a 2-equation eddy viscosity model based on a Reynolds-averaged Navier-Stokes (RANS) formulation.
arXiv Detail & Related papers (2023-06-09T16:55:49Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs are trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ implicit gradient descent (ISGD) method to train PINNs for improving the stability of training process.
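The implicit update at the heart of ISGD can be illustrated with a fixed-point solve on a toy loss; this is a sketch under simplifying assumptions, not the paper's full method:

```python
import numpy as np

def isgd_step(theta, grad_fn, lr, n_inner=30):
    """One implicit SGD step: solve theta_new = theta - lr * grad(theta_new)
    by fixed-point iteration (converges when lr times the gradient's
    Lipschitz constant is below 1)."""
    theta_new = np.asarray(theta, dtype=float).copy()
    for _ in range(n_inner):
        theta_new = theta - lr * grad_fn(theta_new)
    return theta_new

# Toy quadratic loss L(t) = 0.5 * t^2 with grad(t) = t: the implicit update
# has the closed form theta / (1 + lr), which stays stable at learning rates
# where plain (explicit) gradient descent would oscillate.
theta = isgd_step(np.array([1.0]), lambda t: t, lr=0.5)
```

The added stability is the motivation for using implicit updates on stiff, multi-scale PINN losses.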
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- On the Generalization of PINNs outside the training domain and the Hyperparameters influencing it [1.3927943269211593]
PINNs are Neural Network architectures trained to emulate solutions of differential equations without the necessity of solution data.
We perform an empirical analysis of the behavior of PINN predictions outside their training domain.
We assess whether the algorithmic setup of PINNs can influence their potential for generalization and showcase the respective effect on the prediction.
arXiv Detail & Related papers (2023-02-15T09:51:56Z)
- Transfer Learning with Physics-Informed Neural Networks for Efficient Simulation of Branched Flows [1.1470070927586016]
Physics-Informed Neural Networks (PINNs) offer a promising approach to solving differential equations.
We adopt a recently developed transfer learning approach for PINNs and introduce a multi-head model.
We show that our methods provide significant computational speedups in comparison to standard PINNs trained from scratch.
arXiv Detail & Related papers (2022-11-01T01:50:00Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing together the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which employs Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
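An AdaBoost-flavoured point weighting for PINN collocation points can be sketched as follows; the exponential residual weighting is a hypothetical illustration, not the paper's exact rule:

```python
import numpy as np

def point_weights(residuals, temperature=1.0):
    """Residual-based point weighting (hypothetical sketch): collocation
    points with larger PDE residuals get exponentially larger, normalized
    weights, focusing training on poorly fit regions."""
    w = np.exp(np.abs(np.asarray(residuals, dtype=float)) / temperature)
    return w / w.sum()

# The third point has a much larger residual and dominates the weighted loss.
w = point_weights([0.1, 0.1, 2.0])
```

The weighted PDE residual loss would then be `np.sum(w * residuals**2)` instead of a uniform mean.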
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- PSO-PINN: Physics-Informed Neural Networks Trained with Particle Swarm Optimization [0.0]
We propose the use of a hybrid particle swarm optimization and gradient descent approach to train PINNs.
The resulting PSO-PINN algorithm mitigates the undesired behaviors of PINNs trained with standard gradient descent.
Experimental results show that PSO-PINN consistently outperforms a baseline PINN trained with Adam gradient descent.
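A hybrid PSO/gradient-descent step of the kind described can be sketched on a toy quadratic loss; the parameter values and the way the gradient term enters the velocity update are illustrative assumptions, not the published PSO-PINN algorithm:

```python
import numpy as np

def pso_gd_step(x, v, pbest, gbest, grad_fn, lr=0.02,
                inertia=0.7, c1=1.5, c2=1.5, rng=None):
    """One hybrid particle update: the usual PSO velocity rule plus a small
    gradient-descent pull (a sketch of the hybrid idea)."""
    rng = np.random.default_rng() if rng is None else rng
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = (inertia * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
         - lr * grad_fn(x))
    return x + v, v

# Swarm of 8 particles minimizing the toy loss f(x) = ||x||^2 (grad = 2x).
rng = np.random.default_rng(0)
f = lambda X: np.sum(X**2, axis=-1)
X = rng.standard_normal((8, 2))
V = np.zeros_like(X)
pbest, pbest_f = X.copy(), f(X)
for _ in range(200):
    gbest = pbest[np.argmin(pbest_f)]
    for i in range(len(X)):
        X[i], V[i] = pso_gd_step(X[i], V[i], pbest[i], gbest,
                                 lambda x: 2 * x, rng=rng)
    fx = f(X)
    better = fx < pbest_f
    pbest[better], pbest_f[better] = X[better], fx[better]
```

In the PINN setting, each particle would hold a full set of network weights and the loss would be the composite physics-plus-data objective.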
arXiv Detail & Related papers (2022-02-04T02:21:31Z)
- An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
arXiv Detail & Related papers (2020-06-19T22:05:19Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)