Towards Coordinate- and Dimension-Agnostic Machine Learning for Partial Differential Equations
- URL: http://arxiv.org/abs/2505.16549v1
- Date: Thu, 22 May 2025 11:37:55 GMT
- Title: Towards Coordinate- and Dimension-Agnostic Machine Learning for Partial Differential Equations
- Authors: Trung V. Phan, George A. Kevrekidis, Soledad Villar, Yannis G. Kevrekidis, Juan M. Bello-Rivas
- Abstract summary: We employ a machine learning approach to predict the evolution of scalar field systems expressed in the formalism of exterior calculus. We show that the field dynamics learned in one space can be used to make accurate predictions in other spaces with different dimensions, coordinate systems, boundary conditions, and curvatures.
- Score: 5.371028888134542
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning methods for data-driven identification of partial differential equations (PDEs) are typically defined for a given number of spatial dimensions and a choice of coordinates in which the data have been collected. This dependence prevents the learned evolution equation from generalizing to other spaces. In this work, we reformulate the problem in terms of coordinate- and dimension-independent representations, paving the way toward what we call "spatially liberated" PDE learning. To this end, we employ a machine learning approach to predict the evolution of scalar field systems expressed in the formalism of exterior calculus, which is coordinate-free and immediately generalizes to arbitrary dimensions by construction. We demonstrate the performance of this approach on the FitzHugh-Nagumo and Barkley reaction-diffusion models, as well as on the Patlak-Keller-Segel model informed by in-situ chemotactic bacteria observations. We provide extensive numerical experiments demonstrating that our approach allows for seamless transitions across various spatial contexts. We show that the field dynamics learned in one space can be used to make accurate predictions in other spaces with different dimensions, coordinate systems, boundary conditions, and curvatures.
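The key object here is the exterior-calculus form of the dynamics. As a hedged illustration (our notation, not copied from the paper), a scalar reaction-diffusion system on a Riemannian manifold can be written using only the exterior derivative d and the codifferential δ, so the same expression holds in any dimension and coordinate system:

```latex
% Reaction-diffusion for a scalar field (0-form) u on a Riemannian manifold,
% written coordinate-free. On 0-forms the Laplace-de Rham operator reduces to
% \delta d, which equals minus the ordinary Laplacian in Euclidean space.
\[
  \partial_t u \;=\; R(u) \;-\; D\,\delta\,\mathrm{d}\,u ,
\]
% with diffusion coefficient D and a pointwise reaction term R, e.g. the
% FitzHugh-Nagumo nonlinearity R(u) = u - u^3/3 - v for a recovery field v.
% A right-hand side learned in this form carries over verbatim to other
% dimensions, coordinate systems, and curved geometries.
```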
Related papers
- Governing Equation Discovery from Data Based on Differential Invariants [52.2614860099811]
We propose a pipeline for governing equation discovery based on differential invariants. Specifically, we compute the set of differential invariants corresponding to the infinitesimal generators of the symmetry group. Taking DI-SINDy as an example, we demonstrate that its success rate and accuracy in PDE discovery surpass those of other symmetry-informed governing equation discovery methods.
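For intuition, here is a textbook example of such invariants (ours, not taken from the paper): for the rotation group SO(2) acting on the plane, the lowest-order differential invariants of a scalar field u are the field itself, the squared gradient magnitude, and the Laplacian, and a SINDy-style sparse regression over a dictionary built from them yields rotation-invariant PDEs by construction:

```latex
\[
  I_0 = u, \qquad
  I_1 = u_x^2 + u_y^2 = \lVert \nabla u \rVert^2, \qquad
  I_2 = u_{xx} + u_{yy} = \Delta u ,
  \qquad
  \partial_t u = \sum_k \xi_k \, \phi_k(I_0, I_1, I_2) .
\]
```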
arXiv Detail & Related papers (2025-05-24T17:19:02Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- TaylorPDENet: Learning PDEs from non-grid Data [2.0550543745258283]
TaylorPDENet is a machine learning method designed to learn PDE dynamics from observations that do not lie on a regular grid.
Our algorithm uses a multidimensional Taylor expansion of the dynamical system at each observation point to estimate the spatial derivatives needed to perform predictions.
We evaluate our model on a variety of advection-diffusion equations with different parameters and show that it performs similarly to equivalent approaches on grid-structured data.
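A minimal numpy sketch of this derivative-estimation step, fitting a second-order Taylor expansion to scattered neighbors by least squares (our illustration; the function name, truncation order, and sanity check are assumptions, not the paper's implementation):

```python
import numpy as np

def taylor_derivatives_2d(x0, xs, us):
    """Estimate (u, u_x, u_y, u_xx, u_xy, u_yy) at x0 by least-squares
    fitting a second-order 2D Taylor expansion to scattered neighbors.

    x0: (2,) query point; xs: (n, 2) neighbor coordinates; us: (n,) values.
    """
    dx = xs[:, 0] - x0[0]
    dy = xs[:, 1] - x0[1]
    # Taylor model: u(x0 + d) = u + u_x dx + u_y dy
    #               + u_xx dx^2/2 + u_xy dx dy + u_yy dy^2/2,
    # so with these columns the solved coefficients are the derivatives themselves.
    A = np.stack([np.ones_like(dx), dx, dy,
                  0.5 * dx**2, dx * dy, 0.5 * dy**2], axis=1)
    coef, *_ = np.linalg.lstsq(A, us, rcond=None)
    return coef  # [u, u_x, u_y, u_xx, u_xy, u_yy] at x0

# Sanity check on u(x, y) = x^2 + 3xy around the point (1, 2):
rng = np.random.default_rng(0)
xs = np.array([1.0, 2.0]) + 0.1 * rng.normal(size=(50, 2))
us = xs[:, 0]**2 + 3 * xs[:, 0] * xs[:, 1]
print(taylor_derivatives_2d(np.array([1.0, 2.0]), xs, us))
# expected approximately [7, 8, 3, 2, 3, 0]
```

Predictions then follow by plugging the estimated derivatives into the learned evolution law at each observation point.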
arXiv Detail & Related papers (2023-06-26T08:40:24Z)
- Autoencoders for discovering manifold dimension and coordinates in data from complex dynamical systems [0.0]
The autoencoder framework combines implicit regularization with internal linear layers and $L_2$ regularization (weight decay).
We show that this framework can be naturally extended for applications of state-space modeling and forecasting.
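A minimal PyTorch sketch of this recipe (ours, with illustrative layer sizes and a hypothetical thresholding heuristic for reading off the dimension):

```python
import torch
import torch.nn as nn

class AE(nn.Module):
    """Autoencoder whose encoder ends in a linear layer; trained with weight
    decay (an L2 penalty), superfluous latent directions collapse toward zero."""
    def __init__(self, n_in, n_latent=20):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_in, 128), nn.GELU(),
                                 nn.Linear(128, n_latent))  # linear output layer
        self.dec = nn.Sequential(nn.Linear(n_latent, 128), nn.GELU(),
                                 nn.Linear(128, n_in))

    def forward(self, x):
        return self.dec(self.enc(x))

def estimate_dimension(data, n_latent=20, epochs=2000, wd=1e-4):
    model = AE(data.shape[1], n_latent)
    opt = torch.optim.Adam(model.parameters(), weight_decay=wd)  # L2 / weight decay
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(data), data)
        loss.backward()
        opt.step()
    # Count latent directions whose singular values rise above a small
    # fraction of the leading one (the threshold is a heuristic of ours).
    z = model.enc(data).detach()
    s = torch.linalg.svdvals(z - z.mean(0))
    return int((s / s[0] > 1e-2).sum())
```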
arXiv Detail & Related papers (2023-05-01T21:14:47Z)
- Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces II: non-compact symmetric spaces [43.877478563933316]
Invariance to symmetries is one of the most fundamental forms of prior information one can consider.
In this work, we develop constructive and practical techniques for building stationary Gaussian processes on a very large class of non-Euclidean spaces.
arXiv Detail & Related papers (2023-01-30T17:27:12Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results, with an average relative gain of 11.5% across seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces I: the compact case [43.877478563933316]
Invariance to symmetries is one of the most fundamental forms of prior information one can consider.
In this work, we develop constructive and practical techniques for building stationary Gaussian processes on a very large class of non-Euclidean spaces.
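As a concrete classical instance of such a construction (a standard result, not a formula quoted from the paper), on the compact sphere $S^2$ the heat kernel is a stationary kernel assembled from Laplace-Beltrami eigenvalues $\lambda_n = n(n+1)$ via the addition theorem for spherical harmonics:

```latex
% Heat kernel on S^2: depends only on the geodesic angle \theta between
% x and x' (stationarity); P_n are Legendre polynomials, t > 0 a lengthscale.
\[
  k_t(x, x') \;=\; \sum_{n=0}^{\infty} e^{-n(n+1)\,t}\,\frac{2n+1}{4\pi}\,
  P_n(\cos\theta) .
\]
```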
arXiv Detail & Related papers (2022-08-31T16:40:40Z)
- Counting Phases and Faces Using Bayesian Thermodynamic Integration [77.34726150561087]
We introduce a new approach to the reconstruction of thermodynamic functions and phase boundaries in two-parameter statistical mechanics systems.
We use the proposed approach to accurately reconstruct the partition functions and phase diagrams of the Ising model and the exactly solvable non-equilibrium TASEP.
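For context, the classical identity underlying any thermodynamic-integration scheme (standard statistical mechanics; the paper's Bayesian machinery is built on top of it):

```latex
% For Z(\beta) = \sum_s e^{-\beta E_s}, the log-partition-function derivative
% is an ensemble average, so free-energy differences follow by integrating
% Monte Carlo estimates of <E> along a path in parameter space:
\[
  \frac{\partial \ln Z}{\partial \beta} = -\langle E \rangle_{\beta},
  \qquad
  \ln Z(\beta_2) - \ln Z(\beta_1)
    = -\int_{\beta_1}^{\beta_2} \langle E \rangle_{\beta}\, d\beta .
\]
```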
arXiv Detail & Related papers (2022-05-18T17:11:23Z)
- On the Kullback-Leibler divergence between pairwise isotropic Gaussian-Markov random fields [93.35534658875731]
We derive expressions for the Kullback-Leibler divergence between two pairwise isotropic Gaussian-Markov random fields.
The proposed equation allows the development of novel similarity measures in image processing and machine learning applications.
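For reference, the standard closed form for the KL divergence between two $k$-dimensional Gaussians, of which expressions for Gaussian random fields are specializations (this general formula is given here for context and is not the paper's specialized result):

```latex
\[
  D_{\mathrm{KL}}\big(\mathcal{N}(\mu_1,\Sigma_1)\,\big\|\,\mathcal{N}(\mu_2,\Sigma_2)\big)
  = \tfrac{1}{2}\left[
      \operatorname{tr}\!\big(\Sigma_2^{-1}\Sigma_1\big)
      + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
      - k + \ln\frac{\det\Sigma_2}{\det\Sigma_1}
    \right] .
\]
```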
arXiv Detail & Related papers (2022-03-24T16:37:24Z)
- Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z)
- Data-Driven Reduced-Order Modeling of Spatiotemporal Chaos with Neural Ordinary Differential Equations [0.0]
We present a data-driven reduced order modeling method that capitalizes on the chaotic dynamics of partial differential equations.
We find that dimension reduction improves performance relative to predictions in the ambient space.
With the low-dimensional model, we find excellent short- and long-time statistical recreation of the true dynamics for widely spaced data.
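A minimal sketch of this encode-evolve-decode recipe (ours; the layer sizes and the fixed-step RK4 integrator are illustrative choices, not the paper's code):

```python
import torch
import torch.nn as nn

class LatentODE(nn.Module):
    """Reduced-order model: encoder -> latent neural ODE -> decoder."""
    def __init__(self, n_full, n_latent):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_full, 64), nn.Tanh(),
                                 nn.Linear(64, n_latent))
        self.f = nn.Sequential(nn.Linear(n_latent, 64), nn.Tanh(),
                               nn.Linear(64, n_latent))  # latent vector field dz/dt
        self.dec = nn.Sequential(nn.Linear(n_latent, 64), nn.Tanh(),
                                 nn.Linear(64, n_full))

    def rk4_step(self, z, dt):
        # One classical Runge-Kutta step of the learned latent dynamics.
        k1 = self.f(z)
        k2 = self.f(z + 0.5 * dt * k1)
        k3 = self.f(z + 0.5 * dt * k2)
        k4 = self.f(z + dt * k3)
        return z + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    def forward(self, u0, dt, n_steps):
        z = self.enc(u0)
        traj = [self.dec(z)]
        for _ in range(n_steps):
            z = self.rk4_step(z, dt)
            traj.append(self.dec(z))
        return torch.stack(traj)  # predicted trajectory in the full space
```

Training fits both the autoencoder reconstruction and the latent-trajectory predictions to data; the low-dimensional latent is what makes long rollouts cheap.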
arXiv Detail & Related papers (2021-08-31T20:00:33Z)
- Feature Engineering with Regularity Structures [4.082216579462797]
We investigate the use of models from the theory of regularity structures as features in machine learning tasks.
We provide a flexible definition of a model feature vector associated to a space-time signal, along with two algorithms which illustrate ways in which these features can be combined with linear regression.
We apply these algorithms in several numerical experiments designed to learn solutions to PDEs with a given forcing and boundary data.
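A toy numpy sketch of the flavor of these features (our simplification: a periodic 1D heat semigroup plays the role of the integration map, and the tree-indexed expansion is replaced by a few iterated smoothings and pointwise products; none of the names are the paper's):

```python
import numpy as np

def heat_smooth(f, dt=0.05, dx=1.0, n_steps=40):
    """Crude heat-semigroup smoothing I[f] via explicit finite differences
    on a periodic 1D grid (stable since dt/dx^2 <= 1/2)."""
    u = f.copy()
    for _ in range(n_steps):
        u = u + dt / dx**2 * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
    return u

def model_features(xi, depth=2):
    """Features built by alternating smoothing and pointwise products,
    loosely mimicking a tree-indexed model expansion of the forcing xi."""
    feats = {"I[xi]": heat_smooth(xi)}
    for _ in range(depth - 1):
        new = {}
        for name, f in feats.items():
            new[f"({name})^2"] = f * f                   # pointwise product
            new[f"I[({name})^2]"] = heat_smooth(f * f)   # smooth the product
        feats.update(new)
    return feats

# Linear regression of an observed solution u on the feature columns:
# X = np.column_stack(list(model_features(xi).values()))
# coeffs, *_ = np.linalg.lstsq(X, u, rcond=None)
```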
arXiv Detail & Related papers (2021-08-12T17:53:47Z)
- Learning emergent PDEs in a learned emergent space [0.6157382820537719]
We learn predictive models in the form of partial differential equations (PDEs) for the collective description of a coupled-agent system.
We show that the collective dynamics on a slow manifold can be approximated through a learned model based on local "spatial" partial derivatives in the emergent coordinates.
The proposed approach thus integrates the automatic, data-driven extraction of emergent space coordinates parametrizing the agent dynamics.
arXiv Detail & Related papers (2020-12-23T15:17:21Z)