Neural Entropy-stable conservative flux form neural networks for learning hyperbolic conservation laws
- URL: http://arxiv.org/abs/2507.01795v1
- Date: Wed, 02 Jul 2025 15:18:04 GMT
- Title: Neural Entropy-stable conservative flux form neural networks for learning hyperbolic conservation laws
- Authors: Lizuo Liu, Lu Zhang, Anne Gelb
- Abstract summary: We propose a neural entropy-stable conservative flux form neural network (NESCFN) for learning hyperbolic conservation laws. Our approach embeds entropy-stable design principles into the learning process itself, removing any dependence on prior knowledge of the governing equations or a fixed discretization.
- Score: 2.8680286413498903
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a neural entropy-stable conservative flux form neural network (NESCFN) for learning hyperbolic conservation laws and their associated entropy functions directly from solution trajectories, without requiring any predefined numerical discretization. While recent neural network architectures have successfully integrated classical numerical principles into learned models, most rely on prior knowledge of the governing equations or assume a fixed discretization. Our approach removes this dependency by embedding entropy-stable design principles into the learning process itself, enabling the discovery of physically consistent dynamics in a fully data-driven setting. By jointly learning both the numerical flux function and a corresponding entropy, the proposed method ensures conservation and entropy dissipation, both of which are critical for long-term stability and fidelity in systems of hyperbolic conservation laws. Numerical results demonstrate that the method achieves stability and conservation over extended time horizons and accurately captures shock propagation speeds, even without oracle access to future-time solution profiles in the training data.
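To make the flux-form idea concrete, here is a minimal sketch (not the authors' code; all names are hypothetical) of a conservative update with a learned two-point numerical flux. Conservation of the total state follows from the telescoping fluxes regardless of the network weights; the paper additionally learns an entropy function to enforce dissipation, which this sketch omits.

```python
# Hypothetical sketch of a conservative flux-form update with a learned
# numerical flux, in the spirit of the abstract above (not the authors' code).
import torch
import torch.nn as nn

class NeuralFlux(nn.Module):
    """Two-point numerical flux F_theta(u_L, u_R) parameterized by an MLP."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, u_left, u_right):
        return self.net(torch.stack([u_left, u_right], dim=-1)).squeeze(-1)

def conservative_step(u, flux, dt, dx):
    """u_j^{n+1} = u_j^n - (dt/dx) * (F_{j+1/2} - F_{j-1/2}), periodic BCs.
    Total mass sum_j u_j is preserved because the fluxes telescope."""
    u_right = torch.roll(u, shifts=-1, dims=-1)   # u_{j+1}
    f_half = flux(u, u_right)                     # F_{j+1/2}
    return u - (dt / dx) * (f_half - torch.roll(f_half, shifts=1, dims=-1))

flux = NeuralFlux()
u = torch.sin(torch.linspace(0, 2 * torch.pi, 64))
u_next = conservative_step(u, flux, dt=1e-3, dx=2 * torch.pi / 64)
print(torch.allclose(u.sum(), u_next.sum(), atol=1e-5))  # conservation check
```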
Related papers
- Data-Driven Adaptive Gradient Recovery for Unstructured Finite Volume Computations [0.0]
We present a novel data-driven approach for enhancing gradient reconstruction in unstructured finite volume methods for hyperbolic conservation laws. Our approach extends previous structured-grid methodologies to unstructured meshes through a modified DeepONet architecture. The proposed algorithm is faster and more accurate than the traditional second-order finite volume solver.
arXiv Detail & Related papers (2025-07-22T13:23:57Z)
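The summary above mentions a modified DeepONet. For reference, a plain (unmodified) DeepONet combines a branch net over sensor values of the input function with a trunk net over query coordinates via an inner product; a generic sketch with hypothetical sizes follows, not the paper's modified architecture.

```python
# Generic DeepONet sketch: branch net encodes the input function sampled at
# m sensors, trunk net encodes query points; outputs combine by inner product.
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    def __init__(self, m_sensors=50, p=64, hidden=64):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(m_sensors, hidden), nn.ReLU(), nn.Linear(hidden, p))
        self.trunk = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, p))

    def forward(self, u_sensors, y):
        b = self.branch(u_sensors)            # (batch, p)
        t = self.trunk(y)                     # (n_query, p)
        return b @ t.T                        # (batch, n_query)

model = DeepONet()
u = torch.randn(8, 50)                        # 8 input functions at 50 sensors
y = torch.linspace(0, 1, 100).unsqueeze(-1)   # 100 query locations
out = model(u, y)                             # predicted field, shape (8, 100)
```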
- From Initial Data to Boundary Layers: Neural Networks for Nonlinear Hyperbolic Conservation Laws [0.0]
We address the approximation of entropy solutions to initial-boundary value problems for nonlinear strictly hyperbolic conservation laws using neural networks. A general and systematic framework is introduced for the design of efficient and reliable learning algorithms, combining fast convergence during training with accurate predictions.
arXiv Detail & Related papers (2025-06-02T09:12:13Z)
- Implicit Neural Differential Model for Spatiotemporal Dynamics [5.1854032131971195]
We introduce Im-PiNDiff, a novel implicit physics-integrated neural differentiable solver for spatiotemporal dynamics. Inspired by deep equilibrium models, Im-PiNDiff advances the state using implicit fixed-point layers, enabling robust long-term simulation. Im-PiNDiff achieves superior predictive performance, enhanced numerical stability, and substantial reductions in memory and cost.
arXiv Detail & Related papers (2025-04-03T04:07:18Z)
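A deep-equilibrium-style implicit layer, as referenced above, defines the advanced state as the solution of z = f_theta(z, x). A minimal Picard-iteration sketch is shown below; it is hypothetical and assumes f_theta is a contraction, whereas a production solver would use Anderson or Newton iteration plus implicit differentiation.

```python
# Sketch of an implicit fixed-point layer in the spirit of deep equilibrium
# models: the next state z solves z = f_theta(z, x). Plain Picard iteration.
import torch
import torch.nn as nn

class ImplicitStep(nn.Module):
    def __init__(self, dim, hidden=64, tol=1e-5, max_iter=100):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        self.tol, self.max_iter = tol, max_iter

    def forward(self, x):
        z = torch.zeros_like(x)
        for _ in range(self.max_iter):        # fixed-point (Picard) iteration
            z_new = self.f(torch.cat([z, x], dim=-1))
            if (z_new - z).norm() < self.tol:
                break
            z = z_new
        return z

step = ImplicitStep(dim=16)
state = torch.randn(4, 16)
next_state = step(state)   # implicitly advanced state, shape (4, 16)
```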
- Conservation-informed Graph Learning for Spatiotemporal Dynamics Prediction [84.26340606752763]
In this paper, we introduce the conservation-informed GNN (CiGNN), an end-to-end explainable learning framework. The network is designed to conform to the general conservation law via symmetry, with conservative and non-conservative information passing over a multiscale space through a latent temporal marching strategy. Results demonstrate that CiGNN exhibits remarkable accuracy and generalizability, and is readily applicable to learning to predict various spatiotemporal dynamics.
arXiv Detail & Related papers (2024-12-30T13:55:59Z)
- Symplectic Neural Flows for Modeling and Discovery [9.786274281068815]
SympFlow is a time-dependent symplectic neural network designed using parameterized Hamiltonian flow maps. It allows for backward error analysis and ensures the preservation of the symplectic structure. We demonstrate the effectiveness of SympFlow on diverse problems, including chaotic and dissipative systems.
arXiv Detail & Related papers (2024-12-21T22:02:00Z)
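One standard way to obtain a symplectic-by-construction flow map, sketched under the assumption of a separable learned Hamiltonian H = T_theta(p) + V_theta(q) (not necessarily SympFlow's construction): symplectic Euler composes two shear maps, so each step preserves the symplectic form exactly, whatever the network weights.

```python
# Symplectic Euler step with neural potential/kinetic terms (hypothetical
# sketch, not the SympFlow architecture): kick with dV/dq, drift with dT/dp.
import torch
import torch.nn as nn

def grad(f, x):
    """Gradient of a scalar network output w.r.t. its input. Detached here
    for a pure rollout; keep the graph if training through the rollout."""
    x = x.detach().requires_grad_(True)
    return torch.autograd.grad(f(x).sum(), x)[0]

class SymplecticEulerStep(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.T = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))
        self.V = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, q, p, dt):
        p_new = p - dt * grad(self.V, q)       # kick:  p' = p - dt * dV/dq
        q_new = q + dt * grad(self.T, p_new)   # drift: q' = q + dt * dT/dp'
        return q_new, p_new

step = SymplecticEulerStep(dim=2)
q, p = torch.randn(1, 2), torch.randn(1, 2)
for _ in range(100):                           # long rollouts stay structured
    q, p = step(q, p, dt=0.01)
```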
- Entropy stable conservative flux form neural networks [3.417730578086946]
We propose an entropy-stable conservative flux form neural network (CFN) that integrates classical numerical conservation laws into a data-driven framework.
Numerical experiments demonstrate that the entropy-stable CFN achieves both stability and conservation while maintaining accuracy over extended time domains.
arXiv Detail & Related papers (2024-11-04T02:01:31Z)
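A hedged illustration of how entropy stability can enter training alongside a learned conservative flux (see the flux-form sketch near the top): penalize any increase of a convex total entropy across a learned step. Here eta is fixed to u^2/2 for simplicity; the papers above instead learn the entropy function, and this penalty is a hypothetical training detail, not their exact formulation.

```python
# Entropy-dissipation penalty sketch: an entropy-stable scheme should keep
# the total entropy sum_j eta(u_j) non-increasing from one step to the next.
import torch

def entropy_violation(u_old, u_new, eta=lambda u: 0.5 * u ** 2):
    """Positive part of the total-entropy increase over one step."""
    return torch.relu(eta(u_new).sum() - eta(u_old).sum())

# During training one could add, e.g. (lam a penalty weight):
#   loss = data_mismatch + lam * entropy_violation(u, conservative_step(u, ...))
```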
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
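A minimal sketch of one SGLD epoch with without-replacement minibatching (the data are reshuffled each epoch and every sample is visited exactly once, in contrast to the usual with-replacement sampling). Hyperparameters are arbitrary and the loss is a stand-in.

```python
# SGLD epoch: gradient step plus Gaussian noise scaled by sqrt(2 * lr * T),
# with minibatches drawn without replacement via a per-epoch permutation.
import torch

def sgld_epoch(params, loss_fn, data, batch_size=32, lr=1e-3, temperature=1e-4):
    perm = torch.randperm(len(data))              # without replacement
    for i in range(0, len(data), batch_size):
        batch = data[perm[i:i + batch_size]]
        loss = loss_fn(params, batch)
        (g,) = torch.autograd.grad(loss, params)
        noise = torch.randn_like(params)
        with torch.no_grad():
            params += -lr * g + (2 * lr * temperature) ** 0.5 * noise
    return params

params = torch.randn(5, requires_grad=True)
data = torch.randn(1000, 5)
params = sgld_epoch(params, lambda p, b: ((b @ p) ** 2).mean(), data)
```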
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which utilizes a network architecture that strictly guarantees standard constitutive priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
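In the spirit of enforcing the known governing PDE while learning only the constitutive relation, here is a toy 1D elastic-bar sketch: the momentum and compatibility updates are hard-coded physics, and only stress = sigma_theta(strain) is a network. It is hypothetical and far simpler than NCLaw's simulator.

```python
# Hybrid solver sketch: known balance laws, learned material model.
# dv/dt = (1/rho) d(sigma)/dx,  d(eps)/dt = dv/dx, periodic domain.
import torch
import torch.nn as nn

sigma_theta = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def step(v, eps, dt, dx, rho=1.0):
    sigma = sigma_theta(eps.unsqueeze(-1)).squeeze(-1)  # learned constitutive law
    dsigma_dx = (torch.roll(sigma, -1) - sigma) / dx    # forward difference
    v_new = v + dt / rho * dsigma_dx                    # momentum balance (known)
    dv_dx = (v_new - torch.roll(v_new, 1)) / dx         # backward difference
    eps_new = eps + dt * dv_dx                          # compatibility (known)
    return v_new, eps_new

n = 128
v, eps = torch.zeros(n), torch.sin(torch.linspace(0, 2 * torch.pi, n))
v, eps = step(v, eps, dt=1e-3, dx=1.0 / n)
```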
- ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
- NN-EUCLID: deep-learning hyperelasticity without stress data [0.0]
We propose a new approach for unsupervised learning of hyperelastic laws with physics-consistent deep neural networks.
In contrast to supervised learning, which assumes the availability of stress-strain pairs, the approach only uses realistically measurable full-field displacement and global force data.
arXiv Detail & Related papers (2022-05-04T13:54:54Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
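A single learned message-passing update on a periodic 1D grid graph, sketching the idea of replacing hand-designed stencil components with neural functions. This is hypothetical and much simpler than the paper's encoder-processor-decoder pipeline.

```python
# One message-passing layer: neural messages from the two grid neighbors are
# aggregated and fed to a residual node update, playing the role of a stencil.
import torch
import torch.nn as nn

class MPLayer(nn.Module):
    def __init__(self, dim=16, hidden=64):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, dim))
        self.upd = nn.Sequential(nn.Linear(2 * dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, dim))

    def forward(self, h):
        # h: (n_nodes, dim); neighbors are j-1 and j+1 on a periodic 1D grid.
        left, right = torch.roll(h, 1, dims=0), torch.roll(h, -1, dims=0)
        m = self.msg(torch.cat([h, left], -1)) + self.msg(torch.cat([h, right], -1))
        return h + self.upd(torch.cat([h, m], -1))   # residual node update

layer = MPLayer()
h = torch.randn(64, 16)           # latent node features on 64 grid cells
h = layer(h)                      # one processor step
```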
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems modulated via nonlinear interlinked gates.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
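A hedged reading of the liquid time-constant idea: each unit follows a linear first-order ODE, dx/dt = -(1/tau + f(x, I)) x + f(x, I) A, whose effective time constant is modulated by a nonlinear gate f; a semi-implicit Euler step keeps the state stable and bounded. Parameter names below are hypothetical, not the paper's implementation.

```python
# Liquid time-constant cell sketch: gated first-order dynamics per unit,
# advanced with a semi-implicit (fused) Euler step.
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    def __init__(self, in_dim, hidden):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(in_dim + hidden, hidden), nn.Sigmoid())
        self.tau = nn.Parameter(torch.ones(hidden))         # base time constants
        self.A = nn.Parameter(0.1 * torch.randn(hidden))    # per-unit bias state

    def forward(self, x, inp, dt=0.1):
        f = self.f(torch.cat([inp, x], dim=-1))             # gate in (0, 1)
        # Semi-implicit update: denominator > 1, so the state stays bounded.
        return (x + dt * f * self.A) / (1 + dt * (1 / self.tau + f))

cell = LTCCell(in_dim=8, hidden=32)
x = torch.zeros(1, 32)
for t in range(20):                     # unroll the cell over a sequence
    x = cell(x, torch.randn(1, 8))
```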