Solving High-Dimensional PDEs with Latent Spectral Models
- URL: http://arxiv.org/abs/2301.12664v3
- Date: Mon, 29 May 2023 16:30:47 GMT
- Title: Solving High-Dimensional PDEs with Latent Spectral Models
- Authors: Haixu Wu, Tengge Hu, Huakun Luo, Jianmin Wang, Mingsheng Long
- Abstract summary: We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results, with an average relative gain of 11.5% across seven benchmarks.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Deep models have achieved impressive progress in solving partial differential
equations (PDEs). A burgeoning paradigm is learning neural operators to
approximate the input-output mappings of PDEs. While previous deep models have
explored the multiscale architectures and various operator designs, they are
limited to learning the operators as a whole in the coordinate space. In real
physical science problems, PDEs are complex coupled equations with numerical
solvers relying on discretization into high-dimensional coordinate space, which
cannot be precisely approximated by a single operator nor efficiently learned
due to the curse of dimensionality. We present Latent Spectral Models (LSM)
toward an efficient and precise solver for high-dimensional PDEs. Going beyond
the coordinate space, LSM enables an attention-based hierarchical projection
network to reduce the high-dimensional data into a compact latent space in
linear time. Inspired by classical spectral methods in numerical analysis, we
design a neural spectral block to solve PDEs in the latent space that
approximates complex input-output mappings via learning multiple basis
operators, enjoying nice theoretical guarantees for convergence and
approximation. Experimentally, LSM achieves consistent state-of-the-art results
and yields a relative gain of 11.5% averaged over seven benchmarks covering both
solid and fluid physics. Code is available at
https://github.com/thuml/Latent-Spectral-Models.
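The LSM pipeline described above (project to a compact latent space, apply a neural spectral block that combines multiple learned basis operators, project back) can be sketched numerically. This is a minimal illustration under stated assumptions, not the authors' implementation: the random linear maps stand in for LSM's attention-based hierarchical projection network, and all names and shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy field on a 64-point grid standing in for high-dimensional coordinate data.
n_grid, n_latent, n_basis = 64, 8, 4
u = np.sin(np.linspace(0, 2 * np.pi, n_grid))            # input field

# (1) Projection into a compact latent space (a random linear map stands in
#     for the attention-based hierarchical projection network).
P = rng.standard_normal((n_latent, n_grid)) / np.sqrt(n_grid)
z = P @ u                                                # latent coefficients, shape (8,)

# (2) "Neural spectral block" sketch: approximate the operator as a weighted
#     combination of several basis operators acting in the latent space.
bases = rng.standard_normal((n_basis, n_latent, n_latent)) / np.sqrt(n_latent)
weights = rng.standard_normal(n_basis)
z_out = sum(w * (B @ z) for w, B in zip(weights, bases))

# (3) Back-projection to the coordinate space.
Q = rng.standard_normal((n_grid, n_latent)) / np.sqrt(n_latent)
u_out = Q @ z_out                                        # output field, shape (64,)
```

In the actual model the projection and basis operators are learned, and the latent-space computation is what gives the claimed linear-time complexity relative to the number of coordinate points.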
Related papers
- A Deep Learning approach for parametrized and time dependent Partial Differential Equations using Dimensionality Reduction and Neural ODEs [46.685771141109306]
We propose an autoregressive, data-driven method for time-dependent, parametric and (typically) nonlinear PDEs, drawing an analogy with classical numerical solvers.
We show that by leveraging dimensionality reduction (DR) we can deliver not only more accurate predictions, but also a considerably lighter and faster deep learning model.
arXiv Detail & Related papers (2025-02-12T11:16:15Z)
- Transolver++: An Accurate Neural Solver for PDEs on Million-Scale Geometries [67.63077028746191]
Transolver++ is a highly parallel and efficient neural solver that can solve PDEs on million-scale geometries.
Transolver++ increases the single-GPU input capacity to million-scale points for the first time.
It achieves over 20% performance gain in million-scale high-fidelity industrial simulations.
arXiv Detail & Related papers (2025-02-04T15:33:50Z)
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Separable DeepONet: Breaking the Curse of Dimensionality in Physics-Informed Machine Learning [0.0]
In the absence of labeled datasets, we utilize the PDE residual loss to learn the physical system, an approach known as physics-informed DeepONet.
This method faces significant computational challenges, primarily due to the curse of dimensionality, as the computational cost increases exponentially with finer discretization.
We introduce the Separable DeepONet framework to address these challenges and improve scalability for high-dimensional PDEs.
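The PDE residual loss used by physics-informed DeepONet can be illustrated on a toy problem. This is a hedged sketch, not the paper's method: a known exact solution stands in for the network's predicted output function, and the residual is formed with simple finite differences.

```python
import numpy as np

# Grid and a candidate solution for the 1-D Poisson problem u''(x) = f(x);
# the exact solution here stands in for a DeepONet's predicted function.
x = np.linspace(0.0, 1.0, 101)
h = x[1] - x[0]
f = -np.pi**2 * np.sin(np.pi * x)        # source term
u_pred = np.sin(np.pi * x)               # "prediction" (exact solution)

# PDE residual on interior points via second-order central differences:
# r_i = (u_{i-1} - 2 u_i + u_{i+1}) / h^2 - f_i
residual = (u_pred[:-2] - 2 * u_pred[1:-1] + u_pred[2:]) / h**2 - f[1:-1]
loss = np.mean(residual**2)              # near zero for the exact solution
```

Minimizing this residual loss over network parameters replaces the need for labeled solution data, but evaluating it on finer discretizations is exactly where the exponential cost that Separable DeepONet targets arises.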
arXiv Detail & Related papers (2024-07-21T16:33:56Z)
- Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
arXiv Detail & Related papers (2024-03-19T08:56:20Z)
- Approximation of Solution Operators for High-dimensional PDEs [2.3076986663832044]
We propose a finite-dimensional control-based method to approximate solution operators for evolutional partial differential equations.
Results are presented for several high-dimensional PDEs, including real-world applications to solving Hamilton-Jacobi-Bellman equations.
arXiv Detail & Related papers (2024-01-18T21:45:09Z)
- Spectral operator learning for parametric PDEs without data reliance [6.7083321695379885]
We introduce a novel operator learning-based approach for solving parametric partial differential equations (PDEs) without relying on training data.
The proposed framework demonstrates superior performance compared to existing scientific machine learning techniques.
arXiv Detail & Related papers (2023-10-03T12:37:15Z)
- Learning in latent spaces improves the predictive accuracy of deep neural operators [0.0]
L-DeepONet is an extension of standard DeepONet, which leverages latent representations of high-dimensional PDE input and output functions identified with suitable autoencoders.
We show that L-DeepONet outperforms the standard approach in terms of both accuracy and computational efficiency across diverse time-dependent PDEs.
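The L-DeepONet idea of learning an operator between autoencoder latent spaces can be sketched with a linear stand-in. This is an illustrative assumption-laden toy, not the paper's architecture: truncated SVD plays the role of the autoencoders, and a least-squares fit plays the role of the learned operator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset: 200 input/output function pairs on a 50-point grid, standing in
# for high-dimensional PDE input and output functions.
n_samples, n_grid, n_latent = 200, 50, 5
X = rng.standard_normal((n_samples, n_grid))
Y = np.roll(X, 3, axis=1)                 # a simple shift "operator" as ground truth

def top_components(data, k):
    """Linear 'autoencoder': top-k right singular vectors as encoder rows."""
    _, _, Vt = np.linalg.svd(data, full_matrices=False)
    return Vt[:k]

Ex, Ey = top_components(X, n_latent), top_components(Y, n_latent)
Zx, Zy = X @ Ex.T, Y @ Ey.T               # latent representations, shape (200, 5)

# Fit the operator as a small map between latent spaces, then decode to the grid.
A, *_ = np.linalg.lstsq(Zx, Zy, rcond=None)
Y_hat = (Zx @ A) @ Ey                     # predictions, shape (200, 50)
```

The practical point mirrored here is that the regression happens in a 5-dimensional space rather than the 50-dimensional grid space, which is where the accuracy and efficiency gains of latent-space operator learning come from.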
arXiv Detail & Related papers (2023-04-15T17:13:09Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.