Hard-Constrained Neural Networks with Physics-Embedded Architecture for Residual Dynamics Learning and Invariant Enforcement in Cyber-Physical Systems
- URL: http://arxiv.org/abs/2511.23307v1
- Date: Fri, 28 Nov 2025 16:06:24 GMT
- Title: Hard-Constrained Neural Networks with Physics-Embedded Architecture for Residual Dynamics Learning and Invariant Enforcement in Cyber-Physical Systems
- Authors: Enzo Nicolás Spotorno, Josafat Leal Filho, Antônio Augusto Fröhlich
- Abstract summary: We formalize the Hybrid Recurrent Physics-Informed Neural Network (HRPINN), a general-purpose architecture that embeds known physics as a hard structural constraint within a recurrent integrator to learn only residual dynamics. Second, we introduce the Projected HRPINN (PHRPINN), a novel extension that integrates a predict-project mechanism to strictly enforce algebraic invariants by design. We validate HRPINN on a real-world battery prognostics DAE and evaluate PHRPINN on a suite of standard constrained benchmarks.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a framework for physics-informed learning in complex cyber-physical systems governed by differential equations with both unknown dynamics and algebraic invariants. First, we formalize the Hybrid Recurrent Physics-Informed Neural Network (HRPINN), a general-purpose architecture that embeds known physics as a hard structural constraint within a recurrent integrator to learn only residual dynamics. Second, we introduce the Projected HRPINN (PHRPINN), a novel extension that integrates a predict-project mechanism to strictly enforce algebraic invariants by design. The framework is supported by a theoretical analysis of its representational capacity. We validate HRPINN on a real-world battery prognostics DAE and evaluate PHRPINN on a suite of standard constrained benchmarks. The results demonstrate the framework's potential for achieving high accuracy and data efficiency, while also highlighting critical trade-offs between physical consistency, computational cost, and numerical stability, providing practical guidance for its deployment.
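As a rough illustration of the predict-project mechanism described in the abstract, the sketch below Euler-integrates known physics plus a learned residual, then projects each prediction onto a linear algebraic invariant. All names, shapes, the residual network, and the invariant itself are assumptions for illustration; the paper's integrator and projection may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_known(x):
    # Known physics embedded as a hard structural term
    # (linear damping, assumed here purely for illustration).
    return -0.1 * x

W1, W2 = rng.normal(size=(8, 3)), rng.normal(size=(3, 8))
def residual_net(x):
    # Tiny MLP standing in for the learned residual-dynamics term.
    return 0.01 * (W2 @ np.tanh(W1 @ x))

def project(x, A, b):
    # Orthogonal projection onto {x : A x = b} -- the "project" step
    # that restores the algebraic invariant after each prediction.
    correction = A.T @ np.linalg.solve(A @ A.T, A @ x - b)
    return x - correction

def phrpinn_step(x, A, b, dt=0.01):
    x_pred = x + dt * (f_known(x) + residual_net(x))  # predict (Euler step)
    return project(x_pred, A, b)                      # project

A = np.array([[1.0, 1.0, 1.0]])  # invariant: state components sum to 1
b = np.array([1.0])
x = project(rng.normal(size=3), A, b)
for _ in range(100):
    x = phrpinn_step(x, A, b)
print(np.abs(A @ x - b).max())  # invariant violation stays at machine precision
```

Because the projection is applied after every integration step, the invariant holds by construction rather than being penalized in the loss, which is the "by design" enforcement the abstract refers to.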
Related papers
- A Physics-Informed U-net-LSTM Network for Data-Driven Seismic Response Modeling of Structures [0.0]
Recent developments in deep learning have shown promise in reducing the computational cost of nonlinear seismic analysis of structures. We propose a novel Physics-Informed U-Net-LSTM framework that integrates physical laws with deep learning to enhance both accuracy and efficiency.
arXiv Detail & Related papers (2025-11-26T11:05:42Z)
- Unlocking Out-of-Distribution Generalization in Dynamics through Physics-Guided Augmentation [46.40087254928057]
We present SPARK, a physics-guided quantitative augmentation plugin. Experiments on diverse benchmarks demonstrate that SPARK significantly outperforms state-of-the-art baselines.
arXiv Detail & Related papers (2025-10-28T09:30:35Z)
- LNN-PINN: A Unified Physics-Only Training Framework with Liquid Residual Blocks [1.6249267147413524]
LNN-PINN is a physics-informed neural network framework that incorporates a liquid residual gating architecture. Across four benchmark problems, LNN-PINN consistently reduced RMSE and MAE under identical training conditions.
arXiv Detail & Related papers (2025-08-12T13:35:46Z)
- GausSim: Foreseeing Reality by Gaussian Simulator for Elastic Objects [55.02281855589641]
GausSim is a novel neural network-based simulator designed to capture the dynamic behaviors of real-world elastic objects represented through Gaussian kernels. We leverage continuum mechanics and treat each kernel as a Center of Mass System (CMS) that represents a continuous piece of matter. In addition, GausSim incorporates explicit physics constraints, such as mass and momentum conservation, ensuring interpretable results and robust, physically plausible simulations.
arXiv Detail & Related papers (2024-12-23T18:58:17Z)
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z)
- Physics Encoded Blocks in Residual Neural Network Architectures for Digital Twin Models [2.8720819157502344]
Physics-Informed Machine Learning has emerged as a popular approach for modeling and simulation in digital twins. This paper presents a generic approach based on a novel physics-encoded residual neural network architecture. Our method integrates differentiable physics blocks, which implement mathematical operators from physics-based models, with feed-forward learning blocks.
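The physics-encoded residual idea summarized above can be sketched as follows. The physics operator, layer sizes, and names are illustrative assumptions, not the paper's implementation: a differentiable physics block produces a baseline output, and a feed-forward learning block adds a correction, residual-style.

```python
import numpy as np

rng = np.random.default_rng(1)

def physics_block(x):
    # Differentiable operator from a physics-based model, e.g. an
    # ideal-spring restoring force (assumed here for illustration).
    k = 2.0
    return -k * x

W1, W2 = 0.1 * rng.normal(size=(16, 4)), 0.1 * rng.normal(size=(4, 16))
def learning_block(x):
    # Feed-forward block capturing effects the physics operator misses.
    return W2 @ np.tanh(W1 @ x)

def encoded_residual_block(x):
    # Residual composition: physics baseline plus learned correction.
    return physics_block(x) + learning_block(x)

x = rng.normal(size=4)
print(encoded_residual_block(x).shape)  # (4,)
```

With zero weights in the learning block the module reduces exactly to the physics model, so the learned part only has to account for the mismatch between model and data.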
arXiv Detail & Related papers (2024-11-18T11:58:20Z)
- PhyRecon: Physically Plausible Neural Scene Reconstruction [81.73129450090684]
We introduce PHYRECON, the first approach to leverage both differentiable rendering and differentiable physics simulation to learn implicit surface representations.
Central to this design is an efficient transformation between SDF-based implicit representations and explicit surface points.
Our results also exhibit superior physical stability in physical simulators, with at least a 40% improvement across all datasets.
arXiv Detail & Related papers (2024-04-25T15:06:58Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Explainable Equivariant Neural Networks for Particle Physics: PELICAN [51.02649432050852]
PELICAN is a novel permutation equivariant and Lorentz invariant aggregator network.
We present a study of the PELICAN algorithm architecture in the context of both tagging (classification) and reconstructing (regression) Lorentz-boosted top quarks.
We extend the application of PELICAN to the tasks of identifying quark-initiated vs. gluon-initiated jets, and a multi-class identification across five separate target categories of jets.
arXiv Detail & Related papers (2023-07-31T09:08:40Z)
- MINN: Learning the dynamics of differential-algebraic equations and application to battery modeling [2.1303885995425635]
We propose a novel machine learning architecture, termed model-integrated neural networks (MINN). MINN learns the physics-based dynamics of general autonomous or non-autonomous systems consisting of partial differential-algebraic equations (PDAEs). We apply the proposed neural network architecture to model the electrochemical dynamics of lithium-ion batteries.
arXiv Detail & Related papers (2023-04-27T09:11:40Z)
- Physically Consistent Neural ODEs for Learning Multi-Physics Systems [0.0]
In this paper, we leverage the framework of Irreversible port-Hamiltonian Systems (IPHS), which can describe most multi-physics systems.
We propose Physically Consistent NODEs (PC-NODEs) to learn parameters from data.
We demonstrate the effectiveness of the proposed method by learning the thermodynamics of a building from real-world measurements.
arXiv Detail & Related papers (2022-11-11T11:20:35Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.