AI paradigm for solving differential equations: first-principles data generation and scale-dilation operator AI solver
- URL: http://arxiv.org/abs/2507.23141v1
- Date: Wed, 30 Jul 2025 22:45:11 GMT
- Title: AI paradigm for solving differential equations: first-principles data generation and scale-dilation operator AI solver
- Authors: Xiangshu Gong, Zhiqiang Xie, Xiaowei Jin, Chen Wang, Yanling Qu, Wangmeng Zuo, Hui Li
- Abstract summary: We propose an AI paradigm for solving diverse differential equations (DEs). Using prior knowledge or random fields, we generate solutions and then substitute them into the DEs, producing arbitrarily large, first-principles-consistent training datasets at extremely low computational cost.
- Score: 43.3784550457024
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many problems are governed by differential equations (DEs). Artificial intelligence (AI) offers a new path for solving DEs. However, data are very scarce and existing AI solvers struggle with the approximation of high-frequency components (AHFC). We propose an AI paradigm for solving diverse DEs, comprising a DE-ruled first-principles data generation methodology and a scale-dilation operator (SDO) AI solver. Using either prior knowledge or random fields, we generate solutions and then substitute them into the DEs to derive the sources and initial/boundary conditions by balancing the DEs, thus producing arbitrarily large, first-principles-consistent training datasets at extremely low computational cost. We introduce a reversible SDO that leverages the Fourier transform of the multiscale solutions to fix AHFC, and design a spatiotemporally coupled, attention-based Transformer AI solver of DEs with SDO. An upper bound on the Hessian condition number of the loss function is proven to be proportional to the squared 2-norm of the solution gradient, revealing that SDO yields a smoother loss landscape and consequently fixes AHFC with efficient training. Extensive tests on diverse DEs demonstrate that our AI paradigm achieves consistently superior accuracy over state-of-the-art methods. This work makes AI solvers of DEs truly usable in broad natural-science and engineering fields.
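The data-generation step described in the abstract resembles a method-of-manufactured-solutions workflow: prescribe a solution, substitute it into the DE, and read off the source and initial/boundary conditions. Below is a minimal sketch of that idea, assuming a 1D heat equation u_t = nu * u_xx + f as a stand-in; the equation choice, grids, and all names are illustrative and not taken from the paper.

```python
# Sketch of first-principles data generation by balancing a DE around a prescribed
# random-field solution. Illustrative only; the paper's actual procedure may differ.
import numpy as np

rng = np.random.default_rng(0)
nu = 0.1                          # assumed diffusion coefficient
x = np.linspace(0.0, 1.0, 128)
t = np.linspace(0.0, 1.0, 64)
X, T = np.meshgrid(x, t, indexing="ij")

def random_field_solution(n_modes=8):
    """Draw a smooth random-field 'solution' u(x, t) as a random Fourier series,
    together with the analytic derivatives needed to balance the DE."""
    a = rng.normal(size=n_modes)
    k = np.arange(1, n_modes + 1)
    u = sum(a_i * np.sin(np.pi * k_i * X) * np.exp(-k_i * T)
            for a_i, k_i in zip(a, k))
    u_t = sum(-k_i * a_i * np.sin(np.pi * k_i * X) * np.exp(-k_i * T)
              for a_i, k_i in zip(a, k))
    u_xx = sum(-(np.pi * k_i) ** 2 * a_i * np.sin(np.pi * k_i * X) * np.exp(-k_i * T)
               for a_i, k_i in zip(a, k))
    return u, u_t, u_xx

# Substitute the prescribed solution into the DE to derive a consistent source term,
# and read off initial/boundary conditions; (f, ic, bc) -> u is one training pair.
u, u_t, u_xx = random_field_solution()
f = u_t - nu * u_xx               # source making u an exact solution of u_t = nu*u_xx + f
ic = u[:, 0]                      # initial condition u(x, 0)
bc = u[[0, -1], :]                # boundary values u(0, t) and u(1, t)
```

Because the source and the initial/boundary conditions are derived analytically from the prescribed solution, every generated pair is exactly consistent with the DE, and producing more pairs only costs evaluating closed-form expressions, which is what the abstract means by vast data at extremely low computational cost.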
Related papers
- Scalable, Symbiotic, AI and Non-AI Agent Based Parallel Discrete Event Simulations [0.0]
This paper introduces a novel parallel discrete event simulation (PDES) based methodology to combine multiple AI and non-AI agents. We evaluate our approach by solving four problems from four different domains and comparing the results with those from AI models alone. Results show that the overall accuracy of our approach is 68%, whereas the accuracy of the vanilla models is less than 23%.
arXiv Detail & Related papers (2025-05-28T17:50:01Z) - Enabling Automatic Differentiation with Mollified Graph Neural Operators [75.3183193262225]
We propose the mollified graph neural operator (mGNO), the first method to leverage automatic differentiation and compute exact gradients on arbitrary geometries. For a PDE example on regular grids, mGNO paired with autograd reduced the L2 relative data error by 20x compared to finite differences. It can also solve PDEs on unstructured point clouds seamlessly, using physics losses only, at resolutions vastly lower than those needed for finite differences to be accurate enough.
arXiv Detail & Related papers (2025-04-11T06:16:30Z) - Paving the way for scientific foundation models: enhancing generalization and robustness in PDEs with constraint-aware pre-training [49.8035317670223]
A scientific foundation model (SciFM) is emerging as a promising tool for learning transferable representations across diverse domains. We propose incorporating PDE residuals into pre-training, either as the sole learning signal or in combination with data loss, to compensate for limited or infeasible training data. Our results show that pre-training with PDE constraints significantly enhances generalization, outperforming models trained solely on solution data.
arXiv Detail & Related papers (2025-03-24T19:12:39Z) - Active Learning for Neural PDE Solvers [18.665448858377694]
Active learning could help surrogate models reach the same accuracy with smaller training sets. We introduce AL4PDE, a modular active learning benchmark. We show that AL reduces the average error by up to 71% compared to random sampling.
arXiv Detail & Related papers (2024-08-02T18:48:58Z) - Using AI libraries for Incompressible Computational Fluid Dynamics [0.7734726150561089]
We present a novel methodology to bring the power of both AI software and hardware into the field of numerical modelling.
We use the proposed methodology to solve the advection-diffusion equation, the non-linear Burgers equation and incompressible flow past a bluff body.
arXiv Detail & Related papers (2024-02-27T22:00:50Z) - Mixed formulation of physics-informed neural networks for thermo-mechanically coupled systems and heterogeneous domains [0.0]
Physics-informed neural networks (PINNs) are a new tool for solving boundary value problems.
Recent investigations have shown that when designing loss functions for many engineering problems, using first-order derivatives and combining equations from both strong and weak forms can lead to much better accuracy.
In this work, we propose applying the mixed formulation to solve multi-physical problems, specifically a stationary thermo-mechanically coupled system of equations.
arXiv Detail & Related papers (2023-02-09T21:56:59Z) - AttNS: Attention-Inspired Numerical Solving For Limited Data Scenarios [51.94807626839365]
We propose the attention-inspired numerical solver (AttNS) for solving differential equations with limited data. AttNS is inspired by the effectiveness of attention modules in residual neural networks (ResNets) in enhancing model generalization and robustness.
arXiv Detail & Related papers (2023-02-05T01:39:21Z) - PI-VAE: Physics-Informed Variational Auto-Encoder for stochastic differential equations [2.741266294612776]
We propose a new class of physics-informed neural networks, called the physics-informed variational autoencoder (PI-VAE).
PI-VAE consists of a variational autoencoder (VAE), which generates samples of system variables and parameters.
The satisfactory accuracy and efficiency of the proposed method are numerically demonstrated in comparison with the physics-informed Wasserstein generative adversarial network (PI-WGAN).
arXiv Detail & Related papers (2022-03-21T21:51:19Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - One-shot learning for solution operators of partial differential equations [3.559034814756831]
Learning and solving governing equations of a physical system, represented by partial differential equations (PDEs), from data is a central challenge in a variety of areas of science and engineering.
Traditional numerical methods for solving PDEs can be computationally expensive for complex systems and require the complete PDEs of the physical system.
Here, we propose the first solution operator learning method that only requires one PDE solution, i.e., one-shot learning.
arXiv Detail & Related papers (2021-04-06T17:35:10Z)