CROM: Continuous Reduced-Order Modeling of PDEs Using Implicit Neural
Representations
- URL: http://arxiv.org/abs/2206.02607v1
- Date: Mon, 6 Jun 2022 13:27:21 GMT
- Title: CROM: Continuous Reduced-Order Modeling of PDEs Using Implicit Neural
Representations
- Authors: Peter Yichen Chen, Jinxu Xiang, Dong Heon Cho, G A Pershing, Henrique
Teles Maia, Maurizio Chiaramonte, Kevin Carlberg, Eitan Grinspun
- Abstract summary: Excessive runtime of high-fidelity partial differential equation solvers makes them unsuitable for time-critical applications.
We propose to accelerate PDE solvers using reduced-order modeling (ROM).
Our approach builds a smooth, low-dimensional manifold of the continuous vector fields themselves, not their discretization.
- Score: 5.551136447769071
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The excessive runtime of high-fidelity partial differential equation (PDE)
solvers makes them unsuitable for time-critical applications. We propose to
accelerate PDE solvers using reduced-order modeling (ROM). Whereas prior ROM
approaches reduce the dimensionality of discretized vector fields, our
continuous reduced-order modeling (CROM) approach builds a smooth,
low-dimensional manifold of the continuous vector fields themselves, not their
discretization. We represent this reduced manifold using neural fields, relying
on their continuous and differentiable nature to efficiently solve the PDEs.
CROM may train on any and all available numerical solutions of the continuous
system, even when they are obtained using diverse methods or discretizations.
After the low-dimensional manifolds are built, solving PDEs requires
significantly less computational resources. Since CROM is
discretization-agnostic, CROM-based PDE solvers may optimally adapt
discretization resolution over time to economize computation. We validate our
approach on an extensive range of PDEs with training data from voxel grids,
meshes, and point clouds. Large-scale experiments demonstrate that our approach
obtains speed, memory, and accuracy advantages over prior ROM approaches while
gaining 109$\times$ wall-clock speedup over full-order models on CPUs and
89$\times$ speedup on GPUs.
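To make the abstract concrete, here is a minimal PyTorch sketch of the two ingredients it describes: a neural field g(x, z) that maps a spatial coordinate x and a low-dimensional latent code z to the field value u(x), and an online step that advances the latent code by fitting the field to a conventional time-step update evaluated at a small set of sampled spatial points. The names (`NeuralField`, `latent_step`, `time_stepper`) and the gradient-based fit (standing in for the projection used in the paper's online solver) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the CROM idea (illustrative; not the authors' code):
#   1) a neural field g(x, z): continuous PDE field value at coordinate x,
#      parameterized by a low-dimensional latent code z;
#   2) a latent time step: advance z so that the field matches a full-order
#      update evaluated only at a few sampled spatial points.
import torch

class NeuralField(torch.nn.Module):
    """Maps (spatial coordinate x, latent code z) -> field value u(x)."""
    def __init__(self, dim_x=1, dim_z=8, dim_u=1, width=64):
        super().__init__()
        # Smooth activations keep the field differentiable in x, so spatial
        # derivatives can be taken with autograd if the time stepper needs them.
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim_x + dim_z, width), torch.nn.GELU(),
            torch.nn.Linear(width, width), torch.nn.GELU(),
            torch.nn.Linear(width, dim_u),
        )

    def forward(self, x, z):
        # x: (N, dim_x) sample points; z: (dim_z,) latent state shared by all points
        return self.net(torch.cat([x, z.expand(x.shape[0], -1)], dim=-1))

def latent_step(field, z, x_samples, time_stepper, n_iters=50, lr=1e-2):
    """Advance the latent state by one PDE time step.

    `time_stepper(u, x)` is any conventional update rule (e.g. explicit Euler on
    a discretized residual) evaluated only at the sampled points `x_samples`.
    """
    with torch.no_grad():
        u_target = time_stepper(field(x_samples, z), x_samples)  # field at t + dt
    z_next = z.detach().clone().requires_grad_(True)             # warm start from z
    opt = torch.optim.Adam([z_next], lr=lr)
    for _ in range(n_iters):  # fit z_next so the neural field reproduces u_target
        opt.zero_grad()
        loss = ((field(x_samples, z_next) - u_target) ** 2).mean()
        loss.backward()
        opt.step()
    return z_next.detach()
```

Offline, the field weights (and one latent code per training snapshot) would be fit to whatever numerical solutions are available, sampled at their native grid, mesh, or point-cloud locations; because the representation itself is continuous, the sample points used online can be chosen and adapted independently of the training discretization.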
Related papers
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Elucidating the solution space of extended reverse-time SDE for diffusion models [54.23536653351234]
Diffusion models (DMs) demonstrate potent image generation capabilities in various generative modeling tasks.
Their primary limitation lies in slow sampling speed, requiring hundreds or thousands of sequential function evaluations to generate high-quality images.
We formulate the sampling process as an extended reverse-time SDE, unifying prior explorations into ODEs and SDEs.
We devise fast and training-free samplers, ER-SDE-Solvers, achieving state-of-the-art performance across all samplers.
arXiv Detail & Related papers (2023-09-12T12:27:17Z)
- GPLaSDI: Gaussian Process-based Interpretable Latent Space Dynamics Identification through Deep Autoencoder [0.0]
We introduce GPLaSDI, a novel Gaussian process-based framework that relies on latent-space ODEs.
We demonstrate the effectiveness of our approach on the Burgers equation, the Vlasov equation for plasma physics, and a rising thermal bubble problem.
Our proposed method achieves between 200 and 100,000 times speed-up, with up to 7% relative error.
arXiv Detail & Related papers (2023-08-10T23:54:12Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results, with a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolutions to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
- Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z)
- Meta-Auto-Decoder for Solving Parametric Partial Differential Equations [32.46080264991759]
Partial Differential Equations (PDEs) are ubiquitous in many disciplines of science and engineering and notoriously difficult to solve.
Our proposed approach, called Meta-Auto-Decoder (MAD), treats solving parametric PDEs as a meta-learning problem.
MAD exhibits faster convergence without losing accuracy compared with other deep learning methods.
arXiv Detail & Related papers (2021-11-15T02:51:42Z)
- Semi-Implicit Neural Solver for Time-dependent Partial Differential Equations [4.246966726709308]
We propose a neural solver to learn an optimal iterative scheme in a data-driven fashion for any class of PDEs.
We provide theoretical guarantees for the correctness and convergence of neural solvers analogous to conventional iterative solvers.
arXiv Detail & Related papers (2021-09-03T12:03:10Z)
- DiffPD: Differentiable Projective Dynamics with Contact [65.88720481593118]
We present DiffPD, an efficient differentiable soft-body simulator with implicit time integration.
We evaluate the performance of DiffPD and observe a speedup of 4-19 times compared to the standard Newton's method in various applications.
arXiv Detail & Related papers (2021-01-15T00:13:33Z)
- DiscretizationNet: A Machine-Learning based solver for Navier-Stokes Equations using Finite Volume Discretization [0.7366405857677226]
The goal of this work is to develop an ML-based PDE solver that couples important characteristics of existing PDE solvers with machine learning technologies.
Our ML-solver, DiscretizationNet, employs a generative CNN-based encoder-decoder model with PDE variables as both input and output features.
A novel iterative capability is implemented during the network training to improve the stability and convergence of the ML-solver.
arXiv Detail & Related papers (2020-05-17T19:54:19Z)
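As a generic illustration of the encoder-decoder pattern mentioned in the last entry, here is a short, hypothetical PyTorch sketch: a CNN that takes PDE variable fields as both input and output, trained by feeding its own output back in for a few iterations and penalizing a user-supplied discretized residual. It is not the DiscretizationNet architecture; the names (`EncoderDecoder`, `residual_fn`) and the loop structure are assumptions for illustration.

```python
# Hypothetical sketch of a CNN encoder-decoder operating on PDE variable fields,
# with a simple iterative refinement loop during training (not DiscretizationNet).
import torch

class EncoderDecoder(torch.nn.Module):
    def __init__(self, n_vars=4, width=32):
        super().__init__()
        self.encode = torch.nn.Sequential(
            torch.nn.Conv2d(n_vars, width, 3, stride=2, padding=1), torch.nn.ReLU(),
            torch.nn.Conv2d(width, width, 3, stride=2, padding=1), torch.nn.ReLU(),
        )
        self.decode = torch.nn.Sequential(
            torch.nn.ConvTranspose2d(width, width, 4, stride=2, padding=1), torch.nn.ReLU(),
            torch.nn.ConvTranspose2d(width, n_vars, 4, stride=2, padding=1),
        )

    def forward(self, u):  # u: (batch, n_vars, H, W) solution fields
        return self.decode(self.encode(u))

def training_step(model, u_init, residual_fn, opt, n_inner=4):
    """Feed the model's output back in `n_inner` times, then penalize the
    discretized PDE residual of the final iterate (`residual_fn` is user-supplied)."""
    u = u_init
    for _ in range(n_inner):  # iterative refinement of the solution fields
        u = model(u)
    loss = residual_fn(u).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```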