A discrete physics-informed training for projection-based reduced order models with neural networks
- URL: http://arxiv.org/abs/2504.13875v1
- Date: Mon, 31 Mar 2025 23:46:39 GMT
- Title: A discrete physics-informed training for projection-based reduced order models with neural networks
- Authors: N. Sibuet, S. Ares de Parga, J. R. Bravo, R. Rossi
- Abstract summary: This paper presents a physics-informed training framework for projection-based Reduced Order Models (ROMs). We extend the PROM-ANN architecture by complementing snapshot-based training with a FEM-based, discrete physics-informed residual loss. The modified PROM-ANN outperforms POD by orders of magnitude in snapshot reconstruction accuracy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a physics-informed training framework for projection-based Reduced Order Models (ROMs). We extend the PROM-ANN architecture by complementing snapshot-based training with a FEM-based, discrete physics-informed residual loss, bridging the gap between traditional projection-based ROMs and physics-informed neural networks (PINNs). Unlike conventional PINNs that rely on analytical PDEs, our approach leverages FEM residuals to guide the learning of the ROM approximation manifold. Key contributions include: (1) a parameter-agnostic, discrete residual loss applicable to non-linear problems, (2) an architectural modification to PROM-ANN improving accuracy for fast-decaying singular values, and (3) an empirical study of the proposed physics-informed training process for ROMs. The method is demonstrated on a non-linear hyperelasticity problem, simulating a rubber cantilever under multi-axial loads. The main accomplishment of the proposed residual-based loss is its applicability to non-linear problems by interfacing with FEM software while maintaining reasonable training times. The modified PROM-ANN outperforms POD by orders of magnitude in snapshot reconstruction accuracy, while the original formulation is not able to learn a proper mapping for this use case. Finally, applying physics-informed training to PROM-ANN modestly narrows the gap between data reconstruction and ROM accuracy; however, it highlights the untapped potential of the proposed residual-driven optimization for future ROM development. This work underscores the critical role of FEM residuals in ROM construction and calls for further exploration of architectures beyond PROM-ANN.
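The core idea, training an ANN-augmented approximation manifold with both a snapshot loss and a discrete FEM-assembled residual loss, can be sketched as follows. This is a minimal PyTorch illustration rather than the authors' implementation: the POD bases `phi` and `phi_bar`, the network sizes, and in particular the `fem.residual(u, mu)` interface to the FEM code are assumptions introduced for the example.

```python
import torch
import torch.nn as nn

class PROMANNDecoder(nn.Module):
    """Maps reduced coordinates q to a full-order state u = Phi q + Phi_bar g(q)."""
    def __init__(self, phi, phi_bar, hidden=64):
        super().__init__()
        self.register_buffer("phi", phi)          # (n_dof, r) primary POD modes
        self.register_buffer("phi_bar", phi_bar)  # (n_dof, r_bar) secondary POD modes
        r, r_bar = phi.shape[1], phi_bar.shape[1]
        self.net = nn.Sequential(
            nn.Linear(r, hidden), nn.Tanh(),
            nn.Linear(hidden, r_bar),
        )

    def forward(self, q):                         # q: (batch, r)
        return q @ self.phi.T + self.net(q) @ self.phi_bar.T

def training_loss(decoder, q, u_snap, fem, mu, w_res=1e-3):
    """Snapshot-reconstruction loss plus a discrete, FEM-assembled residual penalty."""
    u = decoder(q)
    loss_data = torch.mean((u - u_snap) ** 2)
    # fem.residual is a hypothetical differentiable wrapper around the FEM assembly
    # returning R(u; mu) for each sample; it stands in for the FEM interface.
    res = fem.residual(u, mu)
    loss_phys = torch.mean(res ** 2)
    return loss_data + w_res * loss_phys
```

For the physics term to contribute gradients, the FEM residual evaluation has to be exposed in a differentiable way (or provide Jacobian-vector products), which is the role of the FEM interface mentioned in the abstract.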
Related papers
- Fast Training of Recurrent Neural Networks with Stationary State Feedbacks [48.22082789438538]
Recurrent neural networks (RNNs) have recently demonstrated strong performance and faster inference than Transformers.
We propose a novel method that replaces backpropagation through time (BPTT) with a fixed gradient feedback mechanism.
arXiv Detail & Related papers (2025-03-29T14:45:52Z) - Stable Weight Updating: A Key to Reliable PDE Solutions Using Deep Learning [0.0]
This paper introduces novel residual-based architectures designed to enhance stability and accuracy in physics-informed neural networks (PINNs).
The architectures augment traditional neural networks by incorporating residual connections, which facilitate smoother weight updates and improve backpropagation efficiency.
The Squared Residual Network, in particular, exhibits robust performance, achieving enhanced stability and accuracy compared to conventional neural networks.
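As a rough illustration of the residual-connection idea described in this summary, the block below adds a skip connection around each hidden layer of a PINN backbone. The paper's specific variants (for example the Squared Residual Network) differ in detail, so treat this as a generic sketch with illustrative sizes.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, width):
        super().__init__()
        self.lin1 = nn.Linear(width, width)
        self.lin2 = nn.Linear(width, width)
        self.act = nn.Tanh()

    def forward(self, x):
        # Skip connection: the block learns a correction on top of the identity map,
        # which tends to smooth weight updates during backpropagation.
        return x + self.lin2(self.act(self.lin1(x)))

class ResidualPINN(nn.Module):
    def __init__(self, in_dim=2, width=64, depth=4, out_dim=1):
        super().__init__()
        self.inp = nn.Linear(in_dim, width)
        self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(depth)])
        self.out = nn.Linear(width, out_dim)

    def forward(self, x):
        return self.out(self.blocks(torch.tanh(self.inp(x))))
```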
arXiv Detail & Related papers (2024-07-10T05:20:43Z) - PTPI-DL-ROMs: pre-trained physics-informed deep learning-based reduced order models for nonlinear parametrized PDEs [0.6827423171182154]
In this paper, we consider a major extension of POD-DL-ROMs by making them physics-informed.
We first complement POD-DL-ROMs with a trunk net architecture, endowing them with the ability to compute the problem's solution at every point in the spatial domain.
In particular, we take advantage of the few available data to develop a low-cost pre-training procedure.
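A trunk net in this sense pairs coefficients predicted from the parameters with spatial basis functions evaluated at arbitrary points, so the surrogate can be queried anywhere in the domain. The sketch below shows only that mechanism; the dimensions, layer sizes, and the way it would be combined with a POD-DL-ROM are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class TrunkBranchROM(nn.Module):
    def __init__(self, mu_dim=3, x_dim=2, n_modes=20, width=64):
        super().__init__()
        # Branch: parameters mu -> modal coefficients; trunk: point x -> basis values.
        self.branch = nn.Sequential(
            nn.Linear(mu_dim, width), nn.Tanh(), nn.Linear(width, n_modes))
        self.trunk = nn.Sequential(
            nn.Linear(x_dim, width), nn.Tanh(), nn.Linear(width, n_modes))

    def forward(self, mu, x):
        # mu: (batch, mu_dim), x: (n_points, x_dim)
        coeffs = self.branch(mu)   # (batch, n_modes)
        basis = self.trunk(x)      # (n_points, n_modes)
        return coeffs @ basis.T    # (batch, n_points): u(x; mu) at every queried point
```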
arXiv Detail & Related papers (2024-05-14T12:46:12Z) - Ensemble Kalman Filtering Meets Gaussian Process SSM for Non-Mean-Field and Online Inference [47.460898983429374]
We introduce an ensemble Kalman filter (EnKF) into the non-mean-field (NMF) variational inference framework to approximate the posterior distribution of the latent states.
This novel marriage between EnKF and GPSSM not only eliminates the need for extensive parameterization in learning variational distributions, but also enables an interpretable, closed-form approximation of the evidence lower bound (ELBO).
We demonstrate that the resulting EnKF-aided online algorithm embodies a principled objective function by ensuring data-fitting accuracy while incorporating model regularizations to mitigate overfitting.
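For reference, the basic building block named here, a stochastic ensemble Kalman filter analysis step, looks like the NumPy sketch below. The paper embeds such an update inside a Gaussian-process state-space model and a non-mean-field variational scheme, which this toy code does not attempt to reproduce.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One stochastic EnKF analysis step.
    X: (n_state, n_ens) forecast ensemble, y: (n_obs,) observation,
    H: (n_obs, n_state) observation operator, R: (n_obs, n_obs) obs covariance."""
    n_state, n_ens = X.shape
    x_mean = X.mean(axis=1, keepdims=True)
    A = X - x_mean                                   # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                        # sample covariance
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                   # Kalman gain
    # Perturb observations so the analysis ensemble keeps the correct spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
    return X + K @ (Y - H @ X)
```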
arXiv Detail & Related papers (2023-12-10T15:22:30Z) - Slow Invariant Manifolds of Singularly Perturbed Systems via Physics-Informed Machine Learning [0.0]
We present a physics-informed machine-learning (PIML) approach for the approximation of slow invariant manifolds (SIMs) of singularly perturbed systems.
We show that the proposed PIML scheme provides approximations, of equivalent or even higher accuracy, than those provided by other traditional GSPT-based methods.
A comparison of the computational costs between symbolic, automatic and numerical approximation of the required derivatives in the learning process is also provided.
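The last point, comparing ways of obtaining the derivatives needed in the learning process, can be made concrete with a small example contrasting automatic differentiation and a finite-difference approximation of a network's derivative; symbolic differentiation, the third option mentioned, is omitted. The function and step size are arbitrary choices for illustration.

```python
import torch

# Toy comparison of two derivative-evaluation options for du/dx of a network u(x).
u = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

x = torch.linspace(0.0, 1.0, 11).reshape(-1, 1).requires_grad_(True)
y = u(x)

# Automatic differentiation: exact to machine precision, one extra backward pass.
du_auto, = torch.autograd.grad(y.sum(), x)

# Central finite differences: only forward evaluations, but O(h^2) truncation error.
h = 1e-3
du_fd = (u(x + h) - u(x - h)) / (2 * h)

print(torch.max(torch.abs(du_auto - du_fd)))  # small for this smooth network
```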
arXiv Detail & Related papers (2023-09-14T14:10:22Z) - Efficient and Flexible Neural Network Training through Layer-wise Feedback Propagation [49.44309457870649]
We present Layer-wise Feedback Propagation (LFP), a novel training principle for neural network-like predictors. LFP decomposes a reward to individual neurons based on their respective contributions to solving a given task. Our method then implements a greedy approach reinforcing helpful parts of the network and weakening harmful ones.
arXiv Detail & Related papers (2023-08-23T10:48:28Z) - Physics-Driven Turbulence Image Restoration with Stochastic Refinement [80.79900297089176]
Image distortion by atmospheric turbulence is a critical problem in long-range optical imaging systems.
Fast and physics-grounded simulation tools have been introduced to help the deep-learning models adapt to real-world turbulence conditions.
This paper proposes the Physics-integrated Restoration Network (PiRN) to help the network disentangle the stochasticity from the degradation and the underlying image.
arXiv Detail & Related papers (2023-07-20T05:49:21Z) - Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
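The implicit update evaluates the gradient at the new iterate instead of the current one, which damps the oscillations explicit steps exhibit on stiff losses. The sketch below solves the implicit equation as a proximal subproblem with a few inner gradient-descent steps; it is a generic illustration of ISGD on a toy quadratic, not the paper's training scheme for PINNs.

```python
import torch

def implicit_step(loss_fn, theta, lr=0.1, inner_lr=0.01, n_inner=50):
    """One implicit step: theta_new = theta - lr * grad L(theta_new),
    solved as theta_new = argmin_z L(z) + ||z - theta||^2 / (2 * lr)."""
    z = theta.clone()
    for _ in range(n_inner):
        z = z.detach().requires_grad_(True)
        prox_obj = loss_fn(z) + ((z - theta) ** 2).sum() / (2 * lr)
        (g,) = torch.autograd.grad(prox_obj, z)
        z = z - inner_lr * g
    return z.detach()

# Stiff toy loss: an explicit step with lr = 0.1 would overshoot (update factor
# 1 - lr * 100 = -9), while the implicit step contracts towards the minimum.
loss_fn = lambda t: 50.0 * (t ** 2).sum()
theta = torch.tensor([1.0, -2.0])
print(implicit_step(loss_fn, theta))  # roughly theta / 11
```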
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
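The trick summarized here is that, for a Gaussian-smoothed model, first and second derivatives can be written as expectations of plain function evaluations via Stein's identity, so no back-propagation through the network is needed. The one-dimensional NumPy sketch below checks the two estimators against a function whose smoothed derivatives are known in closed form; the sample size and smoothing width are arbitrary.

```python
import numpy as np

# For f_sigma(x) = E[f(x + delta)], delta ~ N(0, sigma^2), Stein's identity gives
#   f_sigma'(x)  = E[ f(x + delta) * delta ] / sigma^2
#   f_sigma''(x) = E[ f(x + delta) * (delta^2 - sigma^2) ] / sigma^4
rng = np.random.default_rng(0)
f = np.sin
x, sigma, n = 0.7, 0.1, 2_000_000

delta = rng.normal(0.0, sigma, size=n)
fx = f(x + delta)

# Subtracting f(x) is a zero-mean control variate that reduces estimator variance.
d1 = np.mean((fx - f(x)) * delta) / sigma**2
d2 = np.mean((fx - f(x)) * (delta**2 - sigma**2)) / sigma**4

# The smoothed sine has closed form exp(-sigma^2/2) * sin(x), so exact derivatives
# of f_sigma are available for comparison (agreement up to Monte Carlo error).
print(d1, np.exp(-sigma**2 / 2) * np.cos(x))
print(d2, -np.exp(-sigma**2 / 2) * np.sin(x))
```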
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Deep-HyROMnet: A deep learning-based operator approximation for hyper-reduction of nonlinear parametrized PDEs [0.0]
We propose a strategy for learning nonlinear ROM operators using deep neural networks (DNNs).
The resulting hyper-reduced order model enhanced by DNNs is referred to as Deep-HyROMnet.
Numerical results show that Deep-HyROMnets are orders of magnitude faster than POD-Galerkin-DEIM ROMs, keeping the same level of accuracy.
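The gist is to replace hyper-reduction of the nonlinear term with a learned map from reduced coordinates and parameters straight to the projected nonlinear operator. The PyTorch sketch below shows such a surrogate; the dimensions and the training data (projected full-order nonlinear terms collected offline) are assumptions for illustration, not the Deep-HyROMnet architecture itself.

```python
import torch
import torch.nn as nn

class ReducedOperatorNet(nn.Module):
    """Surrogate for the reduced nonlinear term, e.g. q, mu -> V^T N(V q; mu)."""
    def __init__(self, r=10, mu_dim=3, width=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(r + mu_dim, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, r),
        )

    def forward(self, q, mu):
        return self.net(torch.cat([q, mu], dim=-1))

# Offline: fit to projected full-order nonlinear terms collected during snapshot
# generation. Online: the reduced system is assembled from this surrogate, with a
# cost that no longer depends on the full-order dimension.
```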
arXiv Detail & Related papers (2022-02-05T23:45:25Z) - PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (PINNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose the novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z) - A Theoretical Framework for Target Propagation [75.52598682467817]
We analyze target propagation (TP), a popular but not yet fully understood alternative to backpropagation (BP).
Our theory shows that TP is closely related to Gauss-Newton optimization and thus substantially differs from BP.
We provide a first solution to this problem through a novel reconstruction loss that improves feedback weight training.
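To make the terms concrete, the sketch below shows a two-layer flavour of difference target propagation: a feedback network approximately inverts the top layer, targets are mapped backwards through it, each layer is trained locally, and the feedback network is trained with a reconstruction loss. Only the standard textbook form is shown (with the output target used directly for simplicity); the paper's improved reconstruction loss is not reproduced here.

```python
import torch
import torch.nn as nn

f1, f2 = nn.Linear(10, 16), nn.Linear(16, 2)   # forward layers
g2 = nn.Linear(2, 16)                          # feedback net for the top layer

def tp_losses(x, y_target):
    with torch.no_grad():                      # forward pass, no global backprop graph
        h1 = torch.tanh(f1(x))
        h2 = f2(h1)
    rec_loss = ((g2(h2) - h1) ** 2).mean()             # trains only the feedback net g2
    t1 = (h1 + g2(y_target) - g2(h2)).detach()         # difference target for hidden layer
    layer1_loss = ((torch.tanh(f1(x)) - t1) ** 2).mean()   # local loss for f1
    layer2_loss = ((f2(h1) - y_target) ** 2).mean()         # local loss for f2
    return layer1_loss, layer2_loss, rec_loss

x, y_target = torch.randn(32, 10), torch.randn(32, 2)
print([l.item() for l in tp_losses(x, y_target)])
```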
arXiv Detail & Related papers (2020-06-25T12:07:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.