Physically consistent model learning for reaction-diffusion systems
- URL: http://arxiv.org/abs/2512.14240v1
- Date: Tue, 16 Dec 2025 09:51:41 GMT
- Title: Physically consistent model learning for reaction-diffusion systems
- Authors: Erion Morina, Martin Holler
- Abstract summary: This paper builds on a regularization-based framework for structured model learning. We investigate how to incorporate key physical properties, such as mass conservation and quasipositivity, directly into the learning process.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper addresses the problem of learning reaction-diffusion (RD) systems from data while ensuring physical consistency and well-posedness of the learned models. Building on a regularization-based framework for structured model learning, we focus on learning parameterized reaction terms and investigate how to incorporate key physical properties, such as mass conservation and quasipositivity, directly into the learning process. Our main contributions are twofold: First, we propose techniques to systematically modify a given class of parameterized reaction terms such that the resulting terms inherently satisfy mass conservation and quasipositivity, ensuring that the learned RD systems preserve non-negativity and adhere to physical principles. These modifications also guarantee well-posedness of the resulting PDEs under additional regularity and growth conditions. Second, we extend existing theoretical results on regularization-based model learning to RD systems using these physically consistent reaction terms. Specifically, we prove that solutions to the learning problem converge to a unique, regularization-minimizing solution of a limit system even when conservation laws and quasipositivity are enforced. In addition, we provide approximation results for quasipositive functions, essential for constructing physically consistent parameterizations. These results advance the development of interpretable and reliable data-driven models for RD systems that align with fundamental physical laws.
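The paper's first contribution is a systematic modification of parameterized reaction terms so that quasipositivity (and hence non-negativity of solutions) holds by construction. As an illustrative sketch only, here is one generic way such a modification can look; the function names and the specific splitting construction are assumptions for exposition, not the modification proposed in the paper:

```python
import numpy as np

def make_quasipositive(g):
    """Wrap an arbitrary parameterized reaction term g so that the result f
    satisfies f_i(u) >= 0 whenever u_i = 0 and u >= 0 componentwise: split g
    into positive and negative parts and scale the negative part by u_i, so
    outflow from species i vanishes when species i is absent."""
    def f(u):
        gu = g(u)
        pos = np.maximum(gu, 0.0)
        neg = np.maximum(-gu, 0.0)
        return pos - u * neg
    return f

# A toy two-species term that violates quasipositivity at u = (0, 0) ...
g = lambda u: np.array([u[1] - 1.0, 1.0 - u[1]])
print(g(np.array([0.0, 0.0])))  # -> [-1.  1.]  (first component negative)

# ... while the wrapped term stays non-negative on the boundary of the
# positive orthant.
f = make_quasipositive(g)
print(f(np.array([0.0, 0.0])))  # -> [0. 1.]
```

Note that this wrapper addresses quasipositivity only: it does not preserve the componentwise sum, so mass conservation (the components of f summing to zero) would need a separate modification; the paper treats both properties jointly within the learning framework.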
Related papers
- Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge [8.269904705399474]
Recent advances in machine learning have enabled neural operators to serve as powerful surrogates for modeling the evolution of physical systems. We propose a multiphysics training framework that jointly learns from both the original PDEs and their simplified basic forms. Our framework enhances data efficiency, reduces predictive errors, and improves out-of-distribution (OOD) generalization.
arXiv Detail & Related papers (2026-02-16T20:45:10Z) - Deep Neural Networks as Iterated Function Systems and a Generalization Bound [2.7920304852537536]
We show that two important deep architectures can be viewed as, or canonically associated with, place-dependent IFS. We derive a Wasserstein bound for generative modeling that controls the collage-type approximation error between the data distribution and its image.
arXiv Detail & Related papers (2026-01-27T07:32:49Z) - Hierarchical Physics-Embedded Learning for Spatiotemporal Dynamical Systems [12.832325257647128]
We propose a hierarchical physics-embedded learning framework that advances both forward spatiotemporal prediction and the inverse discovery of physical laws. Known physical laws are directly embedded into the model's computational graph, guaranteeing physical consistency. By building the framework upon adaptive Neural Operators, we can effectively capture the non-local dependencies and high-order operators characteristic of dynamical systems.
arXiv Detail & Related papers (2025-10-29T09:18:41Z) - From Physics to Machine Learning and Back: Part II - Learning and Observational Bias in PHM [52.64097278841485]
This review examines how incorporating learning and observational biases through physics-informed modeling and data strategies can guide models toward physically consistent and reliable predictions. Fast adaptation methods, including meta-learning and few-shot learning, are reviewed alongside domain generalization techniques.
arXiv Detail & Related papers (2025-09-25T14:15:43Z) - Information-Theoretic Bounds and Task-Centric Learning Complexity for Real-World Dynamic Nonlinear Systems [0.6875312133832079]
Dynamic nonlinear systems exhibit distortions arising from coupled static and dynamic effects. This paper presents a theoretical framework grounded in structured decomposition, variance analysis, and task-centric complexity bounds.
arXiv Detail & Related papers (2025-09-08T12:08:02Z) - Meta-learning Structure-Preserving Dynamics [6.088897644268474]
We introduce a modulation-based meta-learning framework that conditions structure-preserving models on compact latent representations of potentially unknown system parameters. We enable scalable and generalizable learning across parametric families of dynamical systems.
arXiv Detail & Related papers (2025-08-15T04:30:27Z) - Learning and Transferring Physical Models through Derivatives [61.227256589854726]
We propose Derivative Learning (DERL), a supervised approach that models physical systems by learning their partial derivatives. We also leverage DERL to build physical models incrementally, by designing a distillation protocol that effectively transfers knowledge from a pre-trained model to a student one.
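The idea of supervising a model on partial derivatives rather than on the field itself can be sketched in a few lines. This is a hypothetical illustration, not the DERL implementation: we fit a polynomial model of u(x) = sin(x) by matching its spatial derivative cos(x). Because the derivative of the model is linear in the coefficients, the derivative-matching loss has a closed-form least-squares solution:

```python
import numpy as np

# Supervise on du/dx = cos(x) instead of u(x) = sin(x).
x = np.linspace(0.0, np.pi, 200)
du_target = np.cos(x)

deg = 9
# Column k-1 holds d/dx of x^k, i.e. k * x^(k-1), for k = 1..deg.
D = np.stack([k * x ** (k - 1) for k in range(1, deg + 1)], axis=1)
c, *_ = np.linalg.lstsq(D, du_target, rcond=None)

# Derivatives determine u only up to an additive constant; here the basis
# has no constant term and sin(0) = 0, so the two happen to agree.
u_hat = sum(c[k - 1] * x ** k for k in range(1, deg + 1))
print(np.max(np.abs(u_hat - np.sin(x))) < 1e-3)  # -> True
```

The additive-constant ambiguity visible here is intrinsic to derivative supervision; any practical scheme must fix it with at least one pointwise observation or boundary condition.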
arXiv Detail & Related papers (2025-05-02T17:02:00Z) - Physics-Informed Regularization for Domain-Agnostic Dynamical System Modeling [41.82469276824927]
We present a framework that achieves high-precision modeling for a wide range of dynamical systems.
The time-reversal symmetry (TRS) loss helps preserve energy for conservative systems while serving as a strong inductive bias for non-conservative, reversible systems.
By integrating the TRS loss within neural ordinary differential equation models, the proposed model TREAT demonstrates superior performance on diverse physical systems.
arXiv Detail & Related papers (2024-10-08T21:04:01Z) - Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects such as identifiability and the properties of statistical estimation remain obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
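The identifiability question above has a concrete computational core, sketched here under simplifying assumptions (noise-free data, a diagonalizable system; the names and estimator are illustrative, not the paper's): consecutive equally spaced samples of x' = Ax satisfy x_{k+1} = exp(A*dt) x_k, so one can fit the one-step transition matrix by least squares and invert the matrix exponential through an eigendecomposition. Taking the principal branch of the logarithm is exactly where identifiability conditions on the spacing dt enter:

```python
import numpy as np

def mat_fun(M, fun):
    """Apply a scalar function to a diagonalizable matrix via its
    eigendecomposition (keeps the real part; our matrices are real)."""
    w, V = np.linalg.eig(M)
    return (V @ np.diag(fun(w)) @ np.linalg.inv(V)).real

# Ground-truth system and its exact one-step map exp(A*dt).
A = np.array([[0.0, 1.0], [-2.0, -0.3]])
dt, n = 0.1, 40
Phi = mat_fun(A * dt, np.exp)

# Generate one noise-free, equally spaced trajectory.
X = np.empty((n, 2))
X[0] = [1.0, 0.0]
for k in range(n - 1):
    X[k + 1] = Phi @ X[k]

# Rows satisfy X[1:] = X[:-1] @ Phi.T, a linear least-squares problem;
# the principal matrix logarithm then recovers A*dt.
PhiT_hat, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
A_hat = mat_fun(PhiT_hat.T, np.log) / dt
print(np.allclose(A_hat, A, atol=1e-8))  # -> True
```

If dt is too large (eigenvalue imaginary parts of A*dt reaching magnitude pi), the principal branch of the logarithm no longer returns the true generator, which is one way non-identifiability manifests.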
arXiv Detail & Related papers (2022-10-12T06:46:38Z) - Structure-Preserving Learning Using Gaussian Processes and Variational Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
arXiv Detail & Related papers (2021-12-10T11:09:29Z) - Discovering Latent Causal Variables via Mechanism Sparsity: A New Principle for Nonlinear ICA [81.4991350761909]
Independent component analysis (ICA) refers to an ensemble of methods which formalize the goal of recovering latent variables and provide estimation procedures for practical application.
We show that the latent variables can be recovered up to a permutation if one regularizes the latent mechanisms to be sparse.
arXiv Detail & Related papers (2021-07-21T14:22:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.