Structure-Preserving Physics-Informed Neural Network for the Korteweg--de Vries (KdV) Equation
- URL: http://arxiv.org/abs/2511.00418v1
- Date: Sat, 01 Nov 2025 06:07:24 GMT
- Title: Structure-Preserving Physics-Informed Neural Network for the Korteweg--de Vries (KdV) Equation
- Authors: Victory Obieke, Emmanuel Oguadimma
- Abstract summary: This paper introduces a structure-preserving PINN framework for the nonlinear Korteweg--de Vries (KdV) equation. The proposed method embeds the conservation of mass and Hamiltonian energy directly into the loss function. The model successfully reproduces hallmark behaviors of KdV dynamics while maintaining conserved invariants.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-Informed Neural Networks (PINNs) offer a flexible framework for solving nonlinear partial differential equations (PDEs), yet conventional implementations often fail to preserve key physical invariants during long-term integration. This paper introduces a structure-preserving PINN framework for the nonlinear Korteweg--de Vries (KdV) equation, a prototypical model for nonlinear and dispersive wave propagation. The proposed method embeds the conservation of mass and Hamiltonian energy directly into the loss function, ensuring physically consistent and energy-stable evolution throughout training and prediction. Unlike standard tanh-based PINNs [raissi2019pinn, wang2022modifiedpinn], our approach employs sinusoidal activation functions that enhance spectral expressiveness and accurately capture the oscillatory and dispersive nature of KdV solitons. Through representative case studies -- including single-soliton propagation (shape-preserving translation), two-soliton interaction (elastic collision with phase shift), and cosine-pulse initialization (nonlinear dispersive breakup) -- the model successfully reproduces hallmark behaviors of KdV dynamics while maintaining conserved invariants. Ablation studies demonstrate that combining invariant-constrained optimization with sinusoidal feature mappings accelerates convergence, improves long-term stability, and mitigates drift without multi-stage pretraining. These results highlight that computationally efficient, invariant-aware regularization coupled with sinusoidal representations yields robust, energy-consistent PINNs for Hamiltonian partial differential equations such as the KdV equation.
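As a rough illustration of how such invariant penalties might be wired into a PINN loss, here is a minimal PyTorch sketch. It assumes the common convention u_t + 6 u u_x + u_xxx = 0, with mass M = int u dx and Hamiltonian H = int (u^3 - u_x^2 / 2) dx; the sinusoidal layer, the network interface, and the weights lam_mass/lam_energy are illustrative stand-ins, not the authors' code.

```python
import torch

class SinLayer(torch.nn.Module):
    """Sinusoidal feature layer of the kind the abstract describes
    (SIREN-style; the frequency scale w0 is an arbitrary choice)."""
    def __init__(self, d_in, d_out, w0=1.0):
        super().__init__()
        self.lin = torch.nn.Linear(d_in, d_out)
        self.w0 = w0

    def forward(self, z):
        return torch.sin(self.w0 * self.lin(z))

def invariant_penalty(model, x, t0, t1, lam_mass=1.0, lam_energy=1.0):
    """Penalize drift of the KdV invariants between two time slices,
    assuming u_t + 6 u u_x + u_xxx = 0: mass M = int u dx and
    Hamiltonian H = int (u^3 - 0.5 u_x^2) dx."""
    def invariants(t):
        xr = x.clone().requires_grad_(True)                 # (N, 1) grid
        u = model(torch.cat([xr, t.expand_as(xr)], dim=1))  # (N, 1) field
        u_x = torch.autograd.grad(u.sum(), xr, create_graph=True)[0]
        mass = torch.trapezoid(u.squeeze(-1), xr.squeeze(-1))
        energy = torch.trapezoid((u**3 - 0.5 * u_x**2).squeeze(-1),
                                 xr.squeeze(-1))
        return mass, energy

    m0, h0 = invariants(t0)
    m1, h1 = invariants(t1)
    # Added on top of the usual PDE-residual and data losses during training.
    return lam_mass * (m1 - m0) ** 2 + lam_energy * (h1 - h0) ** 2
```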
Related papers
- Adaptive-Growth Randomized Neural Networks for Level-Set Computation of Multivalued Nonlinear First-Order PDEs with Hyperbolic Characteristics [38.23142730599331]
This paper proposes an Adaptive-Growth Randomized Neural Network (AG-RaNN) method for computing multivalued solutions of nonlinear first-order PDEs with hyperbolic characteristics. Such solutions arise in geometric optics, seismic waves, the semiclassical limit of quantum dynamics, and the high-frequency limit of linear waves.
arXiv Detail & Related papers (2026-03-01T13:16:25Z) - Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation [56.361076943802594]
CanonFlow achieves state-of-the-art performance on the challenging GEOM-DRUG dataset, and the advantage remains large in few-step generation.
arXiv Detail & Related papers (2026-02-16T18:58:55Z) - KoopGen: Koopman Generator Networks for Representing and Predicting Dynamical Systems with Continuous Spectra [65.11254608352982]
We introduce a generator-based neural Koopman framework that models dynamics through a structured, state-dependent representation of Koopman generators. By exploiting the intrinsic Cartesian decomposition into skew-adjoint and self-adjoint components, KoopGen separates conservative transport from irreversible dissipation.
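At the matrix level, the Cartesian decomposition mentioned above is simply the split of a generator into its self-adjoint and skew-adjoint parts; a tiny NumPy illustration with a random stand-in generator G:

```python
import numpy as np

G = np.random.default_rng(1).standard_normal((5, 5))  # stand-in generator
S = 0.5 * (G + G.T)   # self-adjoint part: irreversible dissipation
K = 0.5 * (G - G.T)   # skew-adjoint part: conservative transport
assert np.allclose(G, S + K) and np.allclose(K, -K.T)
```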
arXiv Detail & Related papers (2026-02-15T06:32:23Z) - Physics-Informed Chebyshev Polynomial Neural Operator for Parametric Partial Differential Equations [17.758049557300826]
We introduce the Physics-Informed Chebyshev Polynomial Neural Operator (CPNO). CPNO replaces unstable monomial expansions with a numerically stable Chebyshev spectral basis. Experiments on benchmark parameterized PDEs show that CPNO achieves superior accuracy, faster convergence, and enhanced robustness to hyperparameters.
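The stability contrast between monomial and Chebyshev expansions shows up directly in the conditioning of the corresponding design matrices; a quick NumPy check (the grid and degree are arbitrary choices, not from the paper):

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 200)
deg = 30
V_mono = np.vander(x, deg + 1, increasing=True)      # monomial basis x^k
V_cheb = np.polynomial.chebyshev.chebvander(x, deg)  # Chebyshev basis T_k(x)
print(np.linalg.cond(V_mono))  # astronomically ill-conditioned
print(np.linalg.cond(V_cheb))  # modest: safe for least-squares fitting
```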
arXiv Detail & Related papers (2026-02-02T07:19:56Z) - An Inverse Scattering Inspired Fourier Neural Operator for Time-Dependent PDE Learning [0.0]
We introduce an inverse scattering inspired Fourier Neural Operator (IS-FNO)<n>IS-FNO achieves lower short-term errors and substantially improved long-horizon stability in non-stiff regimes.<n>Overall, this work shows that incorporating physical structure -- particularly reversibility and spectral evolution -- into neural operator design significantly enhances robustness and long-term predictive fidelity for nonlinear PDE dynamics.
arXiv Detail & Related papers (2025-12-22T14:40:13Z) - Extended Physics Informed Neural Network for Hyperbolic Two-Phase Flow in Porous Media [0.7390960543869483]
This work employs the Extended Physics-Informed Neural Network (XPINN) framework to solve the nonlinear Buckley-Leverett equation. Coupling between subnetworks is achieved through the Rankine-Hugoniot jump condition, which enforces physically consistent flux continuity. Compared to standard PINNs, the XPINN framework achieves superior stability, faster convergence, and improved capture of nonlinear wave dynamics.
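A hedged sketch of what such an interface term could look like in code; the subnetwork interface, the fractional-flow function f_bl, and the stationary-interface reading of the jump condition are our assumptions, not the authors' implementation:

```python
import torch

def interface_loss(net_left, net_right, xt_iface, f):
    """Couple two XPINN subnetworks at a fixed subdomain boundary by
    penalizing jumps in the state u and in the flux f(u). For a
    stationary interface the Rankine-Hugoniot condition [f(u)] = s [u]
    reduces to flux continuity (shock speed s = 0)."""
    u_l = net_left(xt_iface)
    u_r = net_right(xt_iface)
    return ((u_l - u_r) ** 2).mean() + ((f(u_l) - f(u_r)) ** 2).mean()

# Example: a Buckley-Leverett fractional-flow function (illustrative):
f_bl = lambda u, M=2.0: u**2 / (u**2 + (1.0 - u)**2 / M)
```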
arXiv Detail & Related papers (2025-11-05T14:16:28Z) - Fast spectral separation method for kinetic equation with anisotropic non-stationary collision operator retaining micro-model fidelity [13.462104954140088]
We present a data-driven collisional operator for one-component plasmas, learned from molecular dynamics simulations. The proposed operator features an anisotropic, non-stationary collision kernel that accounts for particle correlations. Numerical experiments demonstrate that the proposed model accurately captures plasma dynamics in the moderately coupled regime.
arXiv Detail & Related papers (2025-10-16T19:27:03Z) - Analysis of Fourier Neural Operators via Effective Field Theory [11.824913874212802]
We present a systematic effective field theory analysis of FNOs in an infinite-dimensional function space. We show that nonlinear activations inevitably couple input frequencies to high-frequency modes that are otherwise discarded by spectral truncation. Our results quantify how nonlinearity enables neural operators to capture non-trivial features and explain why scale-invariant activations and residual connections enhance feature learning in FNOs.
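The mode-coupling claim is easy to verify numerically: pushing a single band-limited mode through a pointwise nonlinearity spreads energy into higher harmonics that spectral truncation would discard. A small self-contained check (not from the paper):

```python
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
u = np.sin(3 * x)                        # single mode at wavenumber 3
spec = np.abs(np.fft.rfft(np.tanh(u)))   # pointwise nonlinearity
print(np.sort(np.argsort(spec)[-4:]))    # odd harmonics: [ 3  9 15 21]
```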
arXiv Detail & Related papers (2025-07-29T14:10:46Z) - PhysicsCorrect: A Training-Free Approach for Stable Neural PDE Simulations [4.7903561901859355]
We present PhysicsCorrect, a training-free correction framework that enforces PDE consistency at each prediction step. Our key innovation is an efficient caching strategy that precomputes the Jacobian and its pseudoinverse during an offline warm-up phase. Across three representative PDE systems, PhysicsCorrect reduces prediction errors by up to 100x while adding negligible inference time.
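As we read the summary, the cached correction amounts to linearizing the PDE residual once offline and reusing the pseudoinverse for a cheap Gauss-Newton-style projection at every step. A minimal NumPy sketch with a hypothetical residual function (the real framework operates on neural-PDE predictions):

```python
import numpy as np

def make_corrector(residual, u_ref, eps=1e-6):
    """Offline warm-up: finite-difference Jacobian of the PDE residual at a
    reference state, with its pseudoinverse cached for reuse. Online, each
    prediction gets one cheap correction step u <- u - J^+ r(u)."""
    n = u_ref.size
    J = np.empty((residual(u_ref).size, n))
    for j in range(n):
        du = np.zeros(n)
        du[j] = eps
        J[:, j] = (residual(u_ref + du) - residual(u_ref - du)) / (2 * eps)
    J_pinv = np.linalg.pinv(J)           # computed once, then cached

    def correct(u):
        return u - J_pinv @ residual(u)  # negligible per-step cost
    return correct
```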
arXiv Detail & Related papers (2025-07-03T01:22:57Z) - KITINet: Kinetics Theory Inspired Network Architectures with PDE Simulation Approaches [43.872190335490515]
This paper introduces KITINet, a novel architecture that reinterprets feature propagation through the lens of non-equilibrium particle dynamics. At its core, we propose a residual module that models feature updates as the evolution of a particle system. This formulation mimics particle collisions and energy exchange, enabling adaptive feature refinement via physics-informed interactions.
arXiv Detail & Related papers (2025-05-23T13:58:29Z) - Generative System Dynamics in Recurrent Neural Networks [56.958984970518564]
We investigate the continuous-time dynamics of Recurrent Neural Networks (RNNs). We show that skew-symmetric weight matrices are fundamental to enable stable limit cycles in both linear and nonlinear configurations. Numerical simulations showcase how nonlinear activation functions not only maintain limit cycles, but also enhance the numerical stability of the system integration process.
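The role of skew-symmetry is easy to demonstrate: the flow of dx/dt = W x with W^T = -W is a rotation, so trajectories neither decay nor blow up. A small check using SciPy's matrix exponential (the 4-dimensional random W is illustrative):

```python
import numpy as np
from scipy.linalg import expm

A = np.random.default_rng(0).standard_normal((4, 4))
W = A - A.T                   # skew-symmetric, so expm(h * W) is orthogonal
Phi = expm(0.01 * W)          # exact one-step flow map of dx/dt = W x
x = np.ones(4)
r0 = np.linalg.norm(x)
for _ in range(100_000):
    x = Phi @ x
print(r0, np.linalg.norm(x))  # norm conserved to roundoff: closed orbits
```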
arXiv Detail & Related papers (2025-04-16T10:39:43Z) - Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z) - Long-time Integration of Nonlinear Wave Equations with Neural Operators [13.357441268268758]
We focus on the long-time integration of nonlinear wave equations via neural operators. We utilize intrinsic features of these nonlinear wave equations, such as conservation laws and well-posedness, to improve the algorithm design and reduce accumulated error. Our numerical experiments examine these improvements in the Korteweg-de Vries (KdV) equation, the sine-Gordon equation, and the Klein-Gordon wave equation on an irregular domain.
arXiv Detail & Related papers (2024-10-21T03:36:34Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos, and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
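For context on the MCMC step: short-run Langevin dynamics is the standard workhorse for sampling from an energy-based prior. A generic unadjusted Langevin sketch, not the paper's exact sampler:

```python
import numpy as np

def langevin(grad_log_p, z0, step=1e-2, n_steps=100, seed=0):
    """Unadjusted Langevin: z <- z + (step/2) grad log p(z) + sqrt(step) noise."""
    rng = np.random.default_rng(seed)
    z = np.array(z0, dtype=float)
    for _ in range(n_steps):
        z = (z + 0.5 * step * grad_log_p(z)
               + np.sqrt(step) * rng.standard_normal(z.shape))
    return z

# Example: for a standard Gaussian prior, grad log p(z) = -z.
z = langevin(lambda z: -z, np.zeros(8))
```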
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of the (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
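For flavor, the basic Hermitian version of the decimation recursion, the Sancho-Rubio scheme for the retarded surface Green's function of a semi-infinite tight-binding chain, is sketched below; the paper's driven-dissipative generalization is more involved, and eps, t, eta here are illustrative parameters:

```python
import numpy as np

def surface_green(omega, eps=0.0, t=1.0, eta=1e-8, n_iter=60):
    """Sancho-Rubio decimation: each sweep removes every other site, so the
    effective chain depth doubles per iteration and the hopping h -> 0."""
    z = omega + 1j * eta
    e_s, e, h = eps, eps, t           # surface energy, bulk energy, hopping
    for _ in range(n_iter):
        g_bulk = 1.0 / (z - e)
        e_s = e_s + h * g_bulk * h    # surface site sees one decimated bond
        e = e + 2.0 * h * g_bulk * h  # bulk sites see two
        h = h * g_bulk * h            # renormalized hopping
    return 1.0 / (z - e_s)

print(surface_green(0.5).imag)        # < 0 inside the band |omega - eps| < 2t
```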
arXiv Detail & Related papers (2022-02-15T19:00:09Z) - Convex Analysis of the Mean Field Langevin Dynamics [49.66486092259375]
A convergence rate analysis of the mean field Langevin dynamics is presented. The proximal Gibbs distribution $p_q$ associated with the dynamics allows us to develop a convergence theory parallel to classical results in convex optimization.
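As we recall from that line of work (hedged; notation may differ from the paper), for an entropy-regularized objective $\mathcal{L}(q) = F(q) + \lambda \int q \log q \, d\theta$ the proximal Gibbs distribution is defined via the first variation of $F$,

$$ p_q(\theta) \;\propto\; \exp\!\left(-\frac{1}{\lambda}\,\frac{\delta F(q)}{\delta q}(\theta)\right), $$

and convergence is controlled through the divergence between the current law $q$ and $p_q$.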
arXiv Detail & Related papers (2022-01-25T17:13:56Z) - On dissipative symplectic integration with applications to gradient-based optimization [77.34726150561087]
We propose a geometric framework in which discretizations can be realized systematically.
We show that a generalization of symplectic integrators to nonconservative and in particular dissipative Hamiltonian systems is able to preserve rates of convergence up to a controlled error.
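A generic instance of such a scheme is the conformal-symplectic Euler step for a damped Hamiltonian system dq/dt = p, dp/dt = -grad V(q) - gamma p, where the dissipation is applied exactly and the conservative part symplectically; the sketch below is illustrative, not the paper's construction:

```python
import numpy as np

def conformal_symplectic_euler(q, p, grad_V, gamma, h, n_steps):
    """Dissipative generalization of symplectic Euler: exact exponential
    damping of p, then a symplectic update of the conservative part."""
    for _ in range(n_steps):
        p = np.exp(-gamma * h) * p - h * grad_V(q)
        q = q + h * p
    return q, p

# Damped harmonic oscillator V(q) = q^2 / 2: contracts to the origin at
# the continuous-time rate, illustrating preserved convergence behavior.
q, p = conformal_symplectic_euler(np.array([1.0]), np.array([0.0]),
                                  lambda q: q, gamma=0.2, h=0.01,
                                  n_steps=10_000)
print(q, p)  # both near zero
```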
arXiv Detail & Related papers (2020-04-15T00:36:49Z)