GIMLET: Generalizable and Interpretable Model Learning through Embedded Thermodynamics
- URL: http://arxiv.org/abs/2512.19936v2
- Date: Tue, 30 Dec 2025 18:52:57 GMT
- Authors: Suguru Shiratori, Elham Kiyani, Khemraj Shukla, George Em Karniadakis
- Abstract summary: We develop a data-driven framework for discovering constitutive relations in models of fluid flow and scalar transport. Under the assumption that velocity and/or scalar fields are measured, our approach infers unknown closure terms in the governing equations as neural networks. The framework is demonstrated on several benchmark systems, including the viscous Burgers equation, the Kuramoto--Sivashinsky equation, and the incompressible Navier--Stokes equations for both Newtonian and non-Newtonian fluids.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We develop a data-driven framework for discovering constitutive relations in models of fluid flow and scalar transport. Under the assumption that velocity and/or scalar fields are measured, our approach infers unknown closure terms in the governing equations as neural networks. The target to be discovered is the constitutive relations only, while the temporal derivative, convective transport terms, and pressure-gradient term in the governing equations are prescribed. The formulation is rooted in a variational principle from non-equilibrium thermodynamics, where the dynamics is defined by a free-energy functional and a dissipation functional. The unknown constitutive terms arise as functional derivatives of these functionals with respect to the state variables. To enable a flexible and structured model discovery, the free-energy and dissipation functionals are parameterized using neural networks, while their functional derivatives are obtained via automatic differentiation. This construction enforces thermodynamic consistency by design, guaranteeing monotonic decay of the total free energy and non-negative entropy production. The resulting method, termed GIMLET (Generalizable and Interpretable Model Learning through Embedded Thermodynamics), avoids reliance on a predefined library of candidate functions, unlike sparse regression or symbolic identification approaches. The learned models are generalizable in that functionals identified from one dataset can be transferred to distinct datasets governed by the same underlying equations. Moreover, the inferred free-energy and dissipation functions provide direct physical interpretability of the learned dynamics. The framework is demonstrated on several benchmark systems, including the viscous Burgers equation, the Kuramoto--Sivashinsky equation, and the incompressible Navier--Stokes equations for both Newtonian and non-Newtonian fluids.
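The core construction in the abstract, evolving the state down the (automatically differentiated) functional derivative of a free-energy functional so that the energy decays monotonically by design, can be illustrated with a minimal sketch. This is not the authors' implementation: the neural functionals are replaced here by a hand-written Ginzburg-Landau-style discrete free energy with an analytic gradient, and the dynamics are a plain explicit-Euler gradient flow on a periodic grid.

```python
import numpy as np

# Minimal sketch (not the authors' code) of the gradient-flow structure
# described in the abstract: du/dt = -M * dF/du with M > 0, which gives
# dF/dt = -M * |dF/du|^2 <= 0 by construction. In GIMLET, F is a neural
# network and dF/du comes from automatic differentiation; here F is a
# hand-written energy and its gradient is derived analytically.

def free_energy(u, dx):
    """Discrete F[u] = sum_i ( 0.5*((u_{i+1}-u_i)/dx)^2 + 0.25*(u_i^2-1)^2 ) * dx."""
    grad = (np.roll(u, -1) - u) / dx
    return np.sum(0.5 * grad**2 + 0.25 * (u**2 - 1.0)**2) * dx

def energy_gradient(u, dx):
    """Exact gradient dF/du_i of the discrete sum above (periodic BCs)."""
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    return (-lap + u**3 - u) * dx

def gradient_flow(u0, dx, dt=0.01, mobility=1.0, steps=200):
    """Explicit-Euler descent on F; returns final state and energy history."""
    u, energies = u0.copy(), [free_energy(u0, dx)]
    for _ in range(steps):
        u -= dt * mobility * energy_gradient(u, dx)
        energies.append(free_energy(u, dx))
    return u, np.array(energies)

# Usage: the free energy decays monotonically along the flow.
x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
dx = x[1] - x[0]
u, energies = gradient_flow(np.cos(x), dx)
assert np.all(np.diff(energies) <= 1e-10)  # monotonic decay by construction
```

The same monotonicity check carries over when `free_energy` is a neural network and `energy_gradient` is obtained by autodiff, which is the sense in which thermodynamic consistency is enforced "by design" rather than by a penalty term.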
Related papers
- A Free Probabilistic Framework for Denoising Diffusion Models: Entropy, Transport, and Reverse Processes [22.56299060022639]
This paper builds on Voiculescu's theory of free entropy and free Fisher information. We formulate diffusion and reverse processes governed by operator-valued dynamics. The resulting dynamics admit a gradient-flow structure in the noncommutative Wasserstein space.
arXiv Detail & Related papers (2025-10-26T18:03:54Z) - A Solvable Molecular Switch Model for Stable Temporal Information Processing [0.0]
The linear-in-the-state and nonlinear-in-the-input model is exactly solvable, and it also possesses mathematical properties of convergence and fading memory. The results give theoretical support for the use of dynamic molecular switches as computational units in deep cascaded/layered feedforward and recurrent architectures. They could also inspire more general exactly solvable models that can be fitted to emulate arbitrary physical devices which mimic brain-inspired behaviour and perform stable computation on input signals.
arXiv Detail & Related papers (2025-08-21T11:13:56Z) - Eigenstate Thermalization Hypothesis correlations via non-linear Hydrodynamics [0.0]
We provide a prediction for the late-time behavior of time-ordered free cumulants in the thermodynamic limit. Good agreement is observed in both infinite- and finite-temperature regimes.
arXiv Detail & Related papers (2025-05-11T06:35:16Z) - No Equations Needed: Learning System Dynamics Without Relying on Closed-Form ODEs [56.78271181959529]
This paper proposes a conceptual shift in modeling low-dimensional dynamical systems by departing from the traditional two-step modeling process. Instead of first discovering a closed-form equation and then analyzing it, our approach, direct semantic modeling, predicts the semantic representation of the dynamical system. Our approach not only simplifies the modeling pipeline but also enhances the transparency and flexibility of the resulting models.
arXiv Detail & Related papers (2025-01-30T18:36:48Z) - Finding the Underlying Viscoelastic Constitutive Equation via Universal Differential Equations and Differentiable Physics [1.03121181235382]
This research employs Universal Differential Equations (UDEs) alongside differentiable physics to model viscoelastic fluids. The study focuses on analyzing four viscoelastic models: Upper Convected Maxwell (UCM), Johnson-Segalman, Giesekus, and Exponential Phan-Thien-Tanner (ePTT).
arXiv Detail & Related papers (2024-12-31T17:34:29Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos, and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Shape Arithmetic Expressions: Advancing Scientific Discovery Beyond Closed-Form Equations [56.78271181959529]
Generalized Additive Models (GAMs) can capture non-linear relationships between variables and targets, but they cannot capture intricate feature interactions.
We propose Shape Arithmetic Expressions (SHAREs) that fuse GAMs' flexible shape functions with the complex feature interactions found in mathematical expressions.
We also design a set of rules for constructing SHAREs that guarantee transparency of the found expressions beyond the standard constraints.
arXiv Detail & Related papers (2024-04-15T13:44:01Z) - Symmetry-regularized neural ordinary differential equations [0.0]
This paper introduces new conservation relations in Neural ODEs using Lie symmetries in both the hidden state dynamics and the back propagation dynamics.
These conservation laws are then incorporated into the loss function as additional regularization terms, potentially enhancing the physical interpretability and generalizability of the model.
New loss functions are constructed from these conservation relations, demonstrating the applicability of symmetry-regularized Neural ODEs in typical modeling tasks.
arXiv Detail & Related papers (2023-11-28T09:27:44Z) - Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - Machine learning of hidden variables in multiscale fluid simulation [77.34726150561087]
Solving fluid dynamics equations often requires the use of closure relations that account for missing microphysics.
In our study, a partial differential equation simulator that is end-to-end differentiable is used to train judiciously placed neural networks.
We show that this method enables an equation-based approach to reproduce non-linear, large Knudsen number plasma physics.
arXiv Detail & Related papers (2023-06-19T06:02:53Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z) - Automatically Polyconvex Strain Energy Functions using Neural Ordinary Differential Equations [0.0]
Deep neural networks are able to learn complex material behavior without the constraints of closed-form approximations. The N-ODE material model is able to capture synthetic data generated from closed-form material models. The framework can be used to model a large class of materials.
arXiv Detail & Related papers (2021-10-03T13:11:43Z)
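The symmetry-regularized Neural ODE entry above incorporates conservation laws into the loss as extra regularization terms. A hypothetical minimal sketch of that idea (names and the weight `lam` are illustrative, not from the paper): penalize the drift of a known conserved quantity, here the harmonic-oscillator energy, which is conserved under the system's time-translation symmetry, along a model-predicted trajectory.

```python
import numpy as np

# Hypothetical sketch of symmetry-based loss regularization (illustrative,
# not from the paper): the conserved quantity implied by a symmetry --
# here the harmonic-oscillator energy H(q, p) = (q^2 + p^2) / 2,
# conserved under time translation -- is penalized for drifting along
# a model-predicted trajectory.

def conservation_penalty(traj):
    """traj: array of shape (T, 2) with columns (q, p)."""
    H = 0.5 * (traj[:, 0]**2 + traj[:, 1]**2)
    return np.mean((H - H[0])**2)   # mean-squared drift from the initial value

def regularized_loss(pred, target, lam=0.1):
    """Data-fit MSE plus the conservation penalty on the prediction."""
    return np.mean((pred - target)**2) + lam * conservation_penalty(pred)

# Usage: an exact circular orbit conserves H (zero penalty); a spiraling,
# energy-losing trajectory is penalized.
t = np.linspace(0.0, 2.0 * np.pi, 100)
exact = np.stack([np.cos(t), -np.sin(t)], axis=1)
spiral = np.exp(-0.1 * t)[:, None] * exact
assert conservation_penalty(exact) < 1e-12
assert conservation_penalty(spiral) > 1e-3
```

In a Neural ODE training loop the penalty would be evaluated on trajectories produced by the learned dynamics, steering the model toward solutions that respect the symmetry even where data are sparse.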
This list is automatically generated from the titles and abstracts of the papers in this site.