On Fast Simulation of Dynamical System with Neural Vector Enhanced
Numerical Solver
- URL: http://arxiv.org/abs/2208.03680v3
- Date: Wed, 20 Sep 2023 01:16:31 GMT
- Title: On Fast Simulation of Dynamical System with Neural Vector Enhanced
Numerical Solver
- Authors: Zhongzhan Huang, Senwei Liang, Hong Zhang, Haizhao Yang and Liang Lin
- Abstract summary: We introduce a deep learning-based corrector called Neural Vector (NeurVec).
NeurVec can compensate for integration errors and enable larger time step sizes in simulations.
Our experiments on a variety of complex dynamical system benchmarks demonstrate that NeurVec exhibits remarkable generalization capability.
- Score: 59.13397937903832
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The large-scale simulation of dynamical systems is critical in numerous
scientific and engineering disciplines. However, traditional numerical solvers
are limited by the choice of step sizes when estimating integration, resulting
in a trade-off between accuracy and computational efficiency. To address this
challenge, we introduce a deep learning-based corrector called Neural Vector
(NeurVec), which can compensate for integration errors and enable larger time
step sizes in simulations. Our extensive experiments on a variety of complex
dynamical system benchmarks demonstrate that NeurVec exhibits remarkable
generalization capability on a continuous phase space, even when trained using
limited and discrete data. NeurVec significantly accelerates traditional
solvers, achieving speeds tens to hundreds of times faster while maintaining
high levels of accuracy and stability. Moreover, NeurVec's simple-yet-effective
design, combined with its ease of implementation, has the potential to
establish a new paradigm for fast-solving differential equations based on deep
learning.
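The corrector idea in the abstract can be illustrated with a minimal sketch: a coarse solver step plus an additive learned term that absorbs the step's truncation error, allowing a much larger step size. The `corrector` below is hand-derived in closed form for a linear toy system purely to show the mechanism; NeurVec itself is a trained network whose architecture is not reproduced here.

```python
import math

def euler_step(f, y, dt):
    """One forward-Euler step of dy/dt = f(y)."""
    return y + dt * f(y)

def neurvec_step(f, y, dt, corrector):
    """Coarse solver step plus an additive correction term;
    `corrector` stands in for a trained error-compensation network."""
    return euler_step(f, y, dt) + dt * corrector(y)

# Toy linear system dy/dt = -y with exact solution y(t) = y0 * exp(-t).
f = lambda y: -y
dt = 0.5  # a step far too large for plain Euler to stay accurate

# Hypothetical "ideal" corrector: for this linear system the one-step
# Euler error is known in closed form, so we hard-code exactly what a
# trained corrector would learn to output for this problem.
def corrector(y):
    return (math.exp(-dt) - (1.0 - dt)) / dt * y

y0 = 1.0
plain = euler_step(f, y0, dt)               # 0.5
fixed = neurvec_step(f, y0, dt, corrector)  # exp(-0.5) ~ 0.6065
```

With the correction, the large-step result matches the exact flow map, which is the behavior NeurVec approximates by learning from data rather than from a closed-form error expression.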
Related papers
- Liquid Fourier Latent Dynamics Networks for fast GPU-based numerical simulations in computational cardiology [0.0]
We propose Liquid Fourier Latent Dynamics Networks (LFLDNets), an extension of Latent Dynamics Networks (LDNets), to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries.
LFLDNets employ a neurologically-inspired, sparse liquid neural network for temporal dynamics, relaxing the requirement of a numerical solver for time advancement and leading to superior performance in terms of parameters, accuracy, efficiency and learned trajectories.
arXiv Detail & Related papers (2024-08-19T09:14:25Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators present a principled AI framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - Newton-Cotes Graph Neural Networks: On the Time Evolution of Dynamic
Systems [49.50674348130157]
We propose a new approach that predicts the integral from several velocity estimates using Newton-Cotes formulas.
Experiments on several benchmarks empirically demonstrate consistent and significant improvement compared with the state-of-the-art methods.
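The Newton-Cotes idea can be made concrete with the degree-2 closed formula (Simpson's rule): given velocity estimates at three evenly spaced times, the displacement over the interval is a fixed weighted sum. This is a textbook sketch of the quadrature family the paper builds on, not the paper's learned model.

```python
import math

def simpson(v0, v1, v2, h):
    """Closed Newton-Cotes formula of degree 2 (Simpson's rule):
    integrates velocity over [t, t + 2h] from three evenly spaced samples."""
    return h / 3.0 * (v0 + 4.0 * v1 + v2)

# Velocity v(t) = cos(t); the exact displacement over [0, 1] is sin(1).
h = 0.5
est = simpson(math.cos(0.0), math.cos(0.5), math.cos(1.0), h)
# est ~ 0.84177 vs. the exact value sin(1) ~ 0.84147
```

Simpson's rule is exact whenever the velocity is a cubic polynomial on the interval, which is why only a small residual remains for the cosine example above.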
arXiv Detail & Related papers (2023-05-24T02:23:00Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
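As a rough illustration of the additive design, the sketch below bolts a softmax-attention term over past solution states onto an Euler step. The similarity scoring, the mixing rule, and the name `attentive_step` are illustrative assumptions; the paper's Attr solver uses trained components rather than this hand-written heuristic.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attentive_step(f, history, dt):
    """Euler step augmented with an additive attention term over past states.
    Attention weights come from similarity of the current state to each past
    state; only the additive structure is the point of this sketch."""
    y = history[-1]
    base = y + dt * f(y)
    scores = [-abs(y - h) for h in history]   # similarity to each past state
    weights = softmax(scores)
    context = sum(w * h for w, h in zip(weights, history))
    return base + dt * (context - y)          # additive correction term

# With a single-element history the correction vanishes and the
# step reduces to plain forward Euler.
step = attentive_step(lambda y: -y, [1.0], 0.1)
```

Mixing information across the solution history in this way is one mechanism by which attention can damp per-step noise, which is the regularizing effect the paper analyzes.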
arXiv Detail & Related papers (2023-02-05T01:39:21Z) - Physics-informed Deep Super-resolution for Spatiotemporal Data [18.688475686901082]
Deep learning can be used to augment scientific data based on coarse-grained simulations.
We propose a rich and efficient temporal super-resolution framework inspired by physics-informed learning.
Results demonstrate the superior effectiveness and efficiency of the proposed method compared with baseline algorithms.
arXiv Detail & Related papers (2022-08-02T13:57:35Z) - Predicting quantum dynamical cost landscapes with deep learning [0.0]
We introduce deep learning based modelling of the cost functional landscape.
We demonstrate order-of-magnitude increases in accuracy and speed over state-of-the-art Bayesian methods.
arXiv Detail & Related papers (2021-06-30T18:00:00Z) - Fast and differentiable simulation of driven quantum systems [58.720142291102135]
We introduce a semi-analytic method based on the Dyson expansion that allows us to time-evolve driven quantum systems much faster than standard numerical methods.
We show results of the optimization of a two-qubit gate using transmon qubits in the circuit QED architecture.
arXiv Detail & Related papers (2020-12-16T21:43:38Z) - Hierarchical Deep Learning of Multiscale Differential Equation
Time-Steppers [5.6385744392820465]
We develop a hierarchy of deep neural network time-steppers to approximate the flow map of the dynamical system over a disparate range of time-scales.
The resulting model is purely data-driven and leverages features of the multiscale dynamics.
We benchmark our algorithm against state-of-the-art methods, such as LSTM, reservoir computing, and clockwork RNN.
arXiv Detail & Related papers (2020-08-22T07:16:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.