Latent Mamba Operator for Partial Differential Equations
- URL: http://arxiv.org/abs/2505.19105v2
- Date: Wed, 28 May 2025 07:11:21 GMT
- Title: Latent Mamba Operator for Partial Differential Equations
- Authors: Karn Tiwari, Niladri Dutta, N M Anoop Krishnan, Prathosh A P
- Abstract summary: We introduce the Latent Mamba Operator (LaMO), which integrates the efficiency of state-space models (SSMs) in latent space with the expressive power of kernel integral formulations in neural operators. LaMO achieves consistent state-of-the-art (SOTA) performance, with a 32.3% improvement over existing baselines in solution operator approximation.
- Score: 8.410938527671341
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural operators have emerged as powerful data-driven frameworks for solving Partial Differential Equations (PDEs), offering significant speedups over numerical methods. However, existing neural operators struggle with scalability in high-dimensional spaces, incur high computational costs, and face challenges in capturing continuous and long-range dependencies in PDE dynamics. To address these limitations, we introduce the Latent Mamba Operator (LaMO), which integrates the efficiency of state-space models (SSMs) in latent space with the expressive power of kernel integral formulations in neural operators. We also establish a theoretical connection between SSMs and the kernel integral of neural operators. In extensive experiments across diverse PDE benchmarks on regular grids, structured meshes, and point clouds, covering solid- and fluid-physics datasets, LaMO achieves consistent state-of-the-art (SOTA) performance, with a 32.3% improvement over existing baselines in solution operator approximation, highlighting its efficacy in modeling complex PDE solutions.
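As background for the SSM-kernel connection claimed above (our restatement in generic S4/Mamba-style notation, not the paper's derivation): discretizing a linear state-space model gives a recurrence whose unrolled form is a global causal convolution, i.e., a translation-invariant special case of the neural-operator kernel integral.

```latex
% Discretized linear SSM and its unrolled convolution (kernel) form:
h_k = \bar{A}\, h_{k-1} + \bar{B}\, u_k, \qquad y_k = C\, h_k
\quad\Longrightarrow\quad
y_k = \sum_{j=0}^{k} \underbrace{C\, \bar{A}^{\,j} \bar{B}}_{\text{kernel } \kappa_j}\, u_{k-j}
\;\approx\; (\mathcal{K}u)(x) = \int \kappa(x - y)\, u(y)\, \mathrm{d}y .
```

In this view, an SSM layer applied in latent space acts as a learned (causal, stationary) kernel integral operator, which is what lets the efficiency of Mamba-style SSMs be combined with the neural-operator formulation.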
Related papers
- Merging Memory and Space: A Spatiotemporal State Space Neural Operator [2.0104149319910767]
ST-SSM is a compact architecture for learning solution operators of time-dependent partial differential equations. A theoretical connection is established between SSMs and neural operators, and a unified theorem is proved for the resulting class of architectures. Our results highlight the advantages of dimensionally factorized operator learning for efficient and general PDE modeling.
arXiv Detail & Related papers (2025-07-31T11:09:15Z)
- RONOM: Reduced-Order Neural Operator Modeling [1.2016264781280588]
This work introduces the reduced-order neural operator modeling (RONOM) framework, which bridges concepts from ROM and operator learning. We establish a discretization error bound analogous to those in ROM, and gain insights into RONOM's discretization convergence and discretization robustness.
arXiv Detail & Related papers (2025-07-17T06:14:19Z)
- Enabling Local Neural Operators to perform Equation-Free System-Level Analysis [0.0]
Neural Operators (NOs) provide a powerful framework for computations involving physical laws. We propose and implement a framework that integrates (local) NOs with advanced iterative numerical methods in the Krylov subspace. We illustrate our framework via three nonlinear PDE benchmarks.
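To make "equation-free system-level analysis" concrete, here is a minimal sketch (our construction, not the paper's code) of the pattern the summary describes: a learned time-stepper is wrapped as a black-box residual and handed to a matrix-free Newton-Krylov solver to find steady states. The stand-in `phi` below is a plain explicit PDE step; in the paper's setting it would be a trained local neural operator.

```python
import numpy as np
from scipy.optimize import newton_krylov

def phi(u, dt=1e-3, nu=0.05):
    """Stand-in time-stepper u_{t+dt} = phi(u_t); in the paper's setting this
    would be a trained local neural operator, used purely as a black box."""
    lap = np.roll(u, 1) - 2.0 * u + np.roll(u, -1)   # periodic 1D Laplacian (dx = 1)
    return u + dt * (nu * lap + u - u**3)            # Allen-Cahn-type dynamics

def residual(u):
    # Equation-free fixed-point problem: steady states satisfy phi(u) - u = 0.
    return phi(u) - u

# Newton-Krylov needs only residual evaluations (Jacobian-vector products are
# estimated matrix-free), so no equations are ever assembled explicitly.
u0 = 0.1 * np.random.default_rng(0).standard_normal(128)  # rough initial guess
u_star = newton_krylov(residual, u0, method="lgmres", f_tol=1e-9)
print("max steady-state residual:", np.abs(residual(u_star)).max())
```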
arXiv Detail & Related papers (2025-05-05T01:17:18Z)
- Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems [49.819436680336786]
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
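In generic notation (ours; the paper's exact parameterization may differ), the construction the summary describes reads: a single shared GP is warped, per state dimension, by input-dependent normalizing flows, yielding a non-stationary implicit process prior over the transition function.

```latex
% Generic transformed-GP state-space model (our notation, not the paper's):
\tilde{f} \sim \mathcal{GP}\bigl(0,\; k(\cdot,\cdot)\bigr)
  \qquad \text{(single shared GP)} \\
f_d(x) = \phi_d\bigl(\tilde{f}(x);\, x\bigr), \quad d = 1, \dots, D
  \qquad \text{(input-dependent normalizing flows)} \\
x_t = f(x_{t-1}) + v_t, \qquad y_t = g(x_t) + e_t
  \qquad \text{(state-space dynamics)}
```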
arXiv Detail & Related papers (2025-03-24T03:19:45Z)
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z)
- Mamba Neural Operator: Who Wins? Transformers vs. State-Space Models for PDEs [14.14673083512826]
Partial differential equations (PDEs) are widely used to model complex physical systems. Transformers have emerged as the preferred architecture for PDEs due to their ability to capture intricate dependencies. We introduce the Mamba Neural Operator (MNO), a novel framework that enhances neural operator-based techniques for solving PDEs.
arXiv Detail & Related papers (2024-10-03T00:32:31Z)
- Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose the Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems governed by PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
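A minimal sketch of the channel-as-token idea (our illustrative construction with made-up names, not the released CoDA-NO code): each physical variable becomes one token, so attention mixes information across the codomain axis and new variables can be appended without changing the architecture.

```python
import torch
import torch.nn as nn

class CodomainAttention(nn.Module):
    """Toy codomain attention: one token per physical variable (channel)."""
    def __init__(self, grid_size, d_model=64, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(grid_size, d_model)  # shared per-channel function encoder
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(d_model, grid_size)   # decode tokens back to functions

    def forward(self, u):                           # u: (batch, channels, grid_size)
        tok = self.embed(u)                         # one token per channel
        mixed, _ = self.attn(tok, tok, tok)         # attention across the codomain axis
        return u + self.proj(mixed)                 # residual update per channel

x = torch.randn(2, 3, 128)                          # e.g., (u, v, pressure) on a 1D grid
print(CodomainAttention(128)(x).shape)              # torch.Size([2, 3, 128])
```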
arXiv Detail & Related papers (2024-03-19T08:56:20Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training neural PDE solvers in an unsupervised manner. We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles. Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
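As a concrete, textbook instance of such a probabilistic representation (our example in standard notation; the paper's construction covers more general equations): for constant-coefficient convection-diffusion, the solution is an expectation over random particle paths, so Monte Carlo samples of those paths provide training targets without a numerical solver.

```latex
% Feynman-Kac representation for  \partial_t u + v \cdot \nabla u = \nu\, \Delta u,
% with initial condition  u(\cdot, 0) = u_0:
u(x, t) = \mathbb{E}\bigl[\, u_0(X_t) \,\bigr],
\qquad X_t = x - v\, t + \sqrt{2\nu}\, W_t ,
```

where W_t is a standard Brownian motion.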
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM), an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art performance and yields a relative gain of 11.5% averaged over seven benchmarks.
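To illustrate the "solve in coefficient space" pattern behind a spectral block (our toy construction, not the paper's implementation): project latent features onto a fixed orthogonal basis, apply a learned map to the coefficients, and reconstruct on the grid.

```python
import torch
import torch.nn as nn

class NeuralSpectralBlock(nn.Module):
    """Toy latent spectral layer inspired by classical spectral methods."""
    def __init__(self, grid_size, n_modes=16):
        super().__init__()
        k = torch.arange(1, n_modes + 1).float()
        x = torch.linspace(0, 1, grid_size)
        # Fixed orthogonal sine basis on [0, 1], shape (n_modes, grid_size).
        self.register_buffer("basis", torch.sin(torch.pi * k[:, None] * x[None, :]))
        self.mix = nn.Linear(n_modes, n_modes)   # learned map on spectral coefficients

    def forward(self, u):                        # u: (batch, grid_size)
        coeff = u @ self.basis.T * (2.0 / u.shape[-1])  # project onto the basis
        return self.mix(coeff) @ self.basis             # reconstruct on the grid

u = torch.randn(4, 128)
print(NeuralSpectralBlock(128)(u).shape)         # torch.Size([4, 128])
```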
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- LordNet: An Efficient Neural Network for Learning to Solve Parametric Partial Differential Equations without Simulated Data [47.49194807524502]
We propose LordNet, a tunable and efficient neural network for modeling long-range entanglements in PDE solutions.
Experiments on solving Poisson's equation and the (2D and 3D) Navier-Stokes equations demonstrate that long-range entanglements can be well modeled by LordNet.
arXiv Detail & Related papers (2022-06-19T14:41:08Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates feature vectors from the theory of regularity structures for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.