Solving high-dimensional parabolic PDEs using the tensor train format
- URL: http://arxiv.org/abs/2102.11830v1
- Date: Tue, 23 Feb 2021 18:04:00 GMT
- Title: Solving high-dimensional parabolic PDEs using the tensor train format
- Authors: Lorenz Richter, Leon Sallandt, Nikolas Nüsken
- Abstract summary: We argue that tensor trains provide an appealing approximation framework for parabolic PDEs.
We develop novel iterative schemes, involving either explicit and fast or implicit and accurate updates.
- Score: 1.1470070927586016
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: High-dimensional partial differential equations (PDEs) are ubiquitous in
economics, science and engineering. However, their numerical treatment poses
formidable challenges since traditional grid-based methods tend to be
frustrated by the curse of dimensionality. In this paper, we argue that tensor
trains provide an appealing approximation framework for parabolic PDEs: the
combination of reformulations in terms of backward stochastic differential
equations and regression-type methods in the tensor format holds the promise of
leveraging latent low-rank structures enabling both compression and efficient
computation. Following this paradigm, we develop novel iterative schemes,
involving either explicit and fast or implicit and accurate updates. We
demonstrate in a number of examples that our methods achieve a favorable
trade-off between accuracy and computational efficiency in comparison with
state-of-the-art neural network based approaches.
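To make the tensor-train ansatz concrete, here is a minimal sketch; the concrete choices (monomial feature maps, random rank-2 cores, the helper names) are illustrative assumptions, not taken from the paper. A function V(x_1, ..., x_d) is represented as a chain contraction of low-rank cores, so that storage grows like d·m·r² rather than m^d for a full tensor-product basis.

```python
import numpy as np

# Minimal sketch of a tensor-train (TT) function ansatz. The concrete
# choices (monomial features, random rank-2 cores, the helper names) are
# illustrative assumptions, not taken from the paper.
# V(x_1, ..., x_d) is a chain contraction of cores G_k of shape
# (r_{k-1}, m, r_k), with boundary ranks r_0 = r_d = 1.

def poly_features(x, m):
    """Univariate monomial features 1, x, ..., x^(m-1)."""
    return np.array([x ** j for j in range(m)])

def tt_eval(cores, x):
    """Evaluate the TT ansatz at a point x with d = len(cores) entries."""
    v = np.ones(1)                              # left boundary, r_0 = 1
    for G, xk in zip(cores, x):                 # G: (r_prev, m, r_next)
        feat = poly_features(xk, G.shape[1])    # contract the feature index
        v = v @ np.einsum("imj,m->ij", G, feat)
    return v.item()                             # right boundary, r_d = 1

# d = 5 dimensions, m = 4 basis functions per dimension, TT rank 2:
d, m, r = 5, 4, 2
ranks = [1] + [r] * (d - 1) + [1]
cores = [np.random.randn(ranks[k], m, ranks[k + 1]) for k in range(d)]
print(tt_eval(cores, np.random.randn(d)))
```

Here the cores hold 64 parameters in total, against 4^5 = 1024 coefficients for the corresponding full tensor-product basis. In the regression-type schemes the paper describes, such cores would be fitted, typically core by core in alternating fashion, by least squares against backward targets of the form V_{n+1} + f·Δt arising from the BSDE discretization.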
Related papers
- A Deep Learning approach for parametrized and time dependent Partial Differential Equations using Dimensionality Reduction and Neural ODEs [46.685771141109306]
We propose an autoregressive, data-driven method that draws on the analogy with classical numerical solvers for time-dependent, parametric and (typically) nonlinear PDEs.
We show that by leveraging dimensionality reduction (DR) we can deliver not only more accurate predictions but also a considerably lighter and faster deep learning model.
arXiv Detail & Related papers (2025-02-12T11:16:15Z)
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs).
We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations.
We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z)
- Partial-differential-algebraic equations of nonlinear dynamics by Physics-Informed Neural-Network: (I) Operator splitting and framework assessment [51.3422222472898]
Several forms for constructing novel physics-informed neural networks (PINNs) for the solution of partial-differential-algebraic equations are proposed.
Among these novel methods are the PDE forms, which evolve from a lower-level form with fewer unknown dependent variables to a higher-level form with more dependent variables.
arXiv Detail & Related papers (2024-07-13T22:48:17Z)
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the advantages of NWoS in terms of accuracy, speed, and computational cost; a sketch of the classical walk-on-spheres recursion underlying such solvers is given below.
arXiv Detail & Related papers (2024-06-05T17:59:22Z)
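For reference, here is a hedged sketch of the classical walk-on-spheres estimator that NWoS builds on, not the paper's neural solver: for the Laplace equation on the unit disk, each step jumps to a uniform point on the largest circle around the current position that fits inside the domain, and the boundary value at the exit point is averaged over many walks. The function name, tolerance, and test function are illustrative choices.

```python
import numpy as np

# Hedged sketch of the *classical* walk-on-spheres (WoS) estimator that
# Neural Walk-on-Spheres builds on -- not the paper's neural solver.
# Setting (chosen for illustration): Laplace equation Δu = 0 on the unit
# disk with boundary data g, so that u(x) = E[g(X_exit)].

def wos_laplace_disk(x, g, eps=1e-3, n_walks=20_000, seed=0):
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_walks):
        y = np.array(x, dtype=float)
        while True:
            r = 1.0 - np.linalg.norm(y)            # distance to boundary
            if r < eps:                             # close enough: stop
                total += g(y / np.linalg.norm(y))   # project to boundary
                break
            theta = rng.uniform(0.0, 2.0 * np.pi)   # uniform point on the
            y = y + r * np.array([np.cos(theta), np.sin(theta)])  # sphere
    return total / n_walks

# Harmonic test: u(x, y) = x*y solves the Laplace equation, so the
# estimate at (0.3, 0.4) should be close to 0.12.
print(wos_laplace_disk([0.3, 0.4], g=lambda b: b[0] * b[1]))
```

Each walk terminates after O(log 1/eps) steps in expectation, which is what makes the recursion attractive in high dimensions.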
- From continuous-time formulations to discretization schemes: tensor trains and robust regression for BSDEs and parabolic PDEs [3.785123406103385]
We argue that tensor trains provide an appealing framework for parabolic PDEs.
We develop iterative schemes, which differ in terms of computational efficiency and robustness.
We demonstrate both theoretically and numerically that our methods can achieve a favorable trade-off between accuracy and computational efficiency.
arXiv Detail & Related papers (2023-07-28T11:44:06Z)
- Robust SDE-Based Variational Formulations for Solving Linear PDEs via Deep Learning [6.1678491628787455]
The combination of Monte Carlo methods and deep learning has led to efficient algorithms for solving partial differential equations (PDEs) in high dimensions.
The related learning problems are often stated as variational formulations based on associated stochastic differential equations (SDEs).
It is therefore crucial to rely on adequate gradient estimators that exhibit low variance in order to reach convergence accurately and swiftly.
arXiv Detail & Related papers (2022-06-21T17:59:39Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation (a generic sketch of this derivative estimator follows below).
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
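As referenced in the entry above, here is a hedged sketch of Stein's-identity derivative estimation for a Gaussian-smoothed function; the helper name, sample sizes, baseline trick, and quadratic test function are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hedged sketch (illustrative names, not the paper's code) of
# Stein's-identity derivative estimation for the Gaussian-smoothed
# surrogate f_sigma(x) = E[f(x + sigma*eps)], eps ~ N(0, I):
#   grad f_sigma(x) = E[(f(x + sigma*eps) - f(x)) * eps] / sigma
#   hess f_sigma(x) = E[(f(x + sigma*eps) - f(x)) * (eps eps^T - I)] / sigma^2
# Subtracting the baseline f(x) leaves the expectations unchanged
# (E[eps] = 0, E[eps eps^T - I] = 0) but reduces variance. Only forward
# evaluations of f are needed, i.e. no back-propagation.

def stein_grad_hess(f, x, sigma=0.1, n_samples=50_000, seed=0):
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    eps = rng.standard_normal((n_samples, d))
    df = np.array([f(x + sigma * e) for e in eps]) - f(x)       # (n,)
    grad = (df[:, None] * eps).mean(axis=0) / sigma              # (d,)
    outer = np.einsum("ni,nj->nij", eps, eps) - np.eye(d)        # (n, d, d)
    hess = (df[:, None, None] * outer).mean(axis=0) / sigma**2   # (d, d)
    return grad, hess

# Sanity check on f(x) = ||x||^2: expect grad ≈ 2x and hess ≈ 2I.
g, H = stein_grad_hess(lambda z: float(z @ z), np.array([1.0, -0.5]))
print(g, H, sep="\n")
```

The estimates converge to the derivatives of the smoothed surrogate, which approach those of f itself as sigma shrinks, at the price of higher Monte Carlo variance.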
- Approximate Latent Force Model Inference [1.3927943269211591]
Latent force models offer an interpretable alternative to purely data-driven tools for inference in dynamical systems.
We show that a neural operator approach can scale our model to thousands of instances, enabling fast, distributed computation.
arXiv Detail & Related papers (2021-09-24T09:55:00Z)
- Hybrid FEM-NN models: Combining artificial neural networks with the finite element method [0.0]
We present a methodology combining neural networks with physical principle constraints in the form of partial differential equations (PDEs).
The approach allows training neural networks while respecting the PDEs as a strong constraint in the optimisation, as opposed to making them part of the loss function.
We demonstrate the method on a complex cardiac cell model problem using deep neural networks.
arXiv Detail & Related papers (2021-01-04T13:36:06Z)
- Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs [71.26657499537366]
We propose a simple interpolation-based method for the efficient approximation of gradients in neural ODE models.
We compare it with the reverse dynamic method to train neural ODEs on classification, density estimation, and inference approximation tasks.
arXiv Detail & Related papers (2020-03-11T13:15:57Z)