Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems
- URL: http://arxiv.org/abs/2405.07097v2
- Date: Sun, 15 Dec 2024 19:04:32 GMT
- Title: Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems
- Authors: Katsiaryna Haitsiukevich, Onur Poyraz, Pekka Marttinen, Alexander Ilin
- Abstract summary: We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
- Score: 49.2319247825857
- License:
- Abstract: This paper explores the efficacy of diffusion-based generative models as neural operators for partial differential equations (PDEs). Neural operators are neural networks that learn a mapping from the parameter space to the solution space of PDEs from data, and they can also solve the inverse problem of estimating the parameter from the solution. Diffusion models excel in many domains, but their potential as neural operators has not been thoroughly explored. In this work, we show that diffusion-based generative models exhibit many properties favourable for neural operators, and they can effectively generate the solution of a PDE conditionally on the parameter or recover the unobserved parts of the system. We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training. In our experiments with multiple realistic dynamical systems, diffusion models outperform other neural operators. Furthermore, we demonstrate how the probabilistic diffusion model can elegantly deal with systems which are only partially identifiable, by producing samples corresponding to the different possible solutions.
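The abstract's central mechanism, a single conditional diffusion model that alternates between the forward task (parameter to solution) and the inverse task (solution to parameter) during training, can be sketched as follows. This is a minimal DDPM-style illustration under assumed shapes and a toy MLP denoiser, not the authors' architecture:

```python
import torch
import torch.nn as nn

# Toy setup: a "field" is a flat vector holding PDE parameters and solution values.
N_PARAM, N_SOL = 16, 64
D = N_PARAM + N_SOL
T = 1000  # number of diffusion steps

betas = torch.linspace(1e-4, 0.02, T)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

# Illustrative denoiser: predicts the noise from the noisy field, the known
# (conditioned) values, the conditioning mask, and the timestep.
net = nn.Sequential(nn.Linear(3 * D + 1, 256), nn.SiLU(),
                    nn.Linear(256, 256), nn.SiLU(),
                    nn.Linear(256, D))
opt = torch.optim.Adam(net.parameters(), lr=1e-4)

def sample_mask(batch):
    """Alternate tasks: 1 marks observed entries the model must condition on."""
    mask = torch.zeros(batch, D)
    forward_task = torch.rand(batch, 1) < 0.5
    mask[:, :N_PARAM] = forward_task.float()     # forward problem: parameter observed
    mask[:, N_PARAM:] = (~forward_task).float()  # inverse problem: solution observed
    return mask

def training_step(x0):  # x0: (batch, D) clean fields from simulation data
    b = x0.shape[0]
    t = torch.randint(0, T, (b,))
    eps = torch.randn_like(x0)
    ab = alpha_bar[t].unsqueeze(1)
    xt = ab.sqrt() * x0 + (1 - ab).sqrt() * eps  # standard DDPM forward process
    mask = sample_mask(b)
    inp = torch.cat([xt, mask * x0, mask, t.float().unsqueeze(1) / T], dim=1)
    loss = ((net(inp) - eps) ** 2).mean()        # noise-prediction objective
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```

At sampling time the same network is run with whichever mask matches the task at hand, so one model serves both as a forward solver and as a probabilistic inverse solver.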
Related papers
- Probabilistic neural operators for functional uncertainty quantification [14.08907045605149]
We introduce the probabilistic neural operator (PNO), a framework for learning probability distributions over the output function space of neural operators.
PNO extends neural operators with generative modeling based on strictly proper scoring rules, integrating uncertainty information directly into the training process.
arXiv Detail & Related papers (2025-02-18T14:42:11Z)
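The PNO entry above trains with strictly proper scoring rules; below is a minimal sketch of one standard such rule, the sample-based energy score (the estimator PNO actually uses may differ):

```python
import torch

def energy_score(samples, y):
    """Sample-based energy score: E||X - y|| - 0.5 * E||X - X'||.
    samples: (m, d) draws from the predictive distribution; y: (d,) ground truth.
    Lower is better; the rule is strictly proper, so minimizing it in expectation
    recovers the true predictive distribution."""
    m = samples.shape[0]
    term1 = (samples - y).norm(dim=1).mean()
    pair = torch.cdist(samples, samples)   # (m, m) pairwise distances
    term2 = pair.sum() / (m * (m - 1))     # mean over off-diagonal pairs only
    return term1 - 0.5 * term2
```

A probabilistic operator can then be trained by minimizing the average energy score of a handful of predictive samples against each training target.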
- Generative Modeling of Neural Dynamics via Latent Stochastic Differential Equations [1.5467259918426441]
We propose a framework for developing computational models of biological neural systems.
We employ a system of coupled stochastic differential equations with differentiable drift and diffusion functions.
We show that these hybrid models achieve competitive performance in predicting stimulus-evoked neural and behavioral responses.
arXiv Detail & Related papers (2024-12-01T09:36:03Z)
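A minimal sketch of the mechanism named in the entry above: latent dynamics given by a stochastic differential equation with differentiable neural drift and diffusion functions, simulated with Euler-Maruyama (the networks and dimensions are illustrative assumptions, not the paper's model):

```python
import torch
import torch.nn as nn

drift = nn.Sequential(nn.Linear(8, 64), nn.Tanh(), nn.Linear(64, 8))
diff  = nn.Sequential(nn.Linear(8, 64), nn.Tanh(), nn.Linear(64, 8), nn.Softplus())

def euler_maruyama(z0, dt=0.01, steps=100):
    """Simulate dz = f(z) dt + g(z) dW with differentiable f (drift) and g (diffusion)."""
    z, path = z0, [z0]
    for _ in range(steps):
        dw = torch.randn_like(z) * dt ** 0.5
        z = z + drift(z) * dt + diff(z) * dw  # reparameterized, so gradients flow
        path.append(z)
    return torch.stack(path)

trajectory = euler_maruyama(torch.zeros(32, 8))  # batch of 32 latent sample paths
```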
- Diffeomorphic Latent Neural Operators for Data-Efficient Learning of Solutions to Partial Differential Equations [5.308435208832696]
A computed approximation of the solution operator to a system of partial differential equations (PDEs) is needed in various areas of science and engineering.
We propose that, to learn a PDE solution operator that generalizes across multiple domains without requiring large amounts of sampled data, one can instead train a latent neural operator on just a few ground-truth solution fields.
arXiv Detail & Related papers (2024-11-27T03:16:00Z)
- Latent Neural PDE Solver: a reduced-order modelling framework for partial differential equations [5.949599220326208]
We propose to learn the dynamics of the system in the latent space with much coarser discretizations.
A non-linear autoencoder is first trained to project the full-order representation of the system onto the mesh-reduced space.
We showcase that it has competitive accuracy and efficiency compared to a neural PDE solver that operates on the full-order space.
arXiv Detail & Related papers (2024-02-27T19:36:27Z)
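A minimal sketch of the reduced-order recipe described in the entry above: a non-linear autoencoder projects the full-order state onto a coarse latent space, and a small network advances the dynamics there (all module shapes and sizes are assumptions):

```python
import torch
import torch.nn as nn

FULL, LATENT = 4096, 64  # full-order grid size vs. mesh-reduced latent size

encoder = nn.Sequential(nn.Linear(FULL, 512), nn.GELU(), nn.Linear(512, LATENT))
decoder = nn.Sequential(nn.Linear(LATENT, 512), nn.GELU(), nn.Linear(512, FULL))
stepper = nn.Sequential(nn.Linear(LATENT, 128), nn.GELU(), nn.Linear(128, LATENT))

def rollout(u0, n_steps):
    """Encode once, advance in the cheap latent space, decode on demand."""
    z = encoder(u0)
    states = []
    for _ in range(n_steps):
        z = z + stepper(z)          # residual latent update per time step
        states.append(decoder(z))
    return torch.stack(states)

# Training would combine a reconstruction loss for the autoencoder with a
# prediction loss on rolled-out latent states; details vary by implementation.
```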
- Score-based Diffusion Models in Function Space [137.70916238028306]
Diffusion models have recently emerged as a powerful framework for generative modeling.
This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
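Diffusion in function space, as in the DDO entry above, requires noise that remains well defined as the discretization is refined; a common choice is a Gaussian random field. A small sketch of sampling such resolution-consistent noise (the RBF covariance is an illustrative choice, not necessarily the one DDOs use):

```python
import numpy as np

def grf_noise(x, length_scale=0.1, jitter=1e-6, n_samples=1):
    """Sample a Gaussian random field on grid points x (shape (n, 1)).
    The same field law applies at any discretization of the domain."""
    d2 = (x - x.T) ** 2
    K = np.exp(-0.5 * d2 / length_scale**2) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)
    return (L @ np.random.randn(len(x), n_samples)).T

coarse = grf_noise(np.linspace(0, 1, 64)[:, None])   # 64-point discretization
fine = grf_noise(np.linspace(0, 1, 256)[:, None])    # 256 points, same field law
```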
- Investigating Neuron Disturbing in Fusing Heterogeneous Neural Networks [6.389882065284252]
In this paper, we reveal the phenomenon of neuron disturbing, where neurons from heterogeneous local models interfere with each other mutually.
We propose AMS, an experimental method that excludes neuron disturbing and fuses neural networks by adaptively selecting a local model to execute each prediction.
arXiv Detail & Related papers (2022-10-24T06:47:48Z)
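The AMS entry above fuses heterogeneous models by adaptively selecting one local model per input instead of merging weights; the selection rule below (highest predictive confidence) is an assumed placeholder, since the summary does not specify the criterion:

```python
import torch

def ams_predict(models, x):
    """Route each input to the single local model that is most confident on it,
    avoiding cross-model neuron interference from weight-level fusion.
    Assumes all models are classifiers over a shared label space."""
    with torch.no_grad():
        probs = torch.stack([m(x).softmax(dim=-1) for m in models])  # (k, b, c)
        conf = probs.max(dim=-1).values                              # (k, b)
        choice = conf.argmax(dim=0)                                  # (b,)
        batch = torch.arange(x.shape[0])
        return probs[choice, batch]                                  # (b, c)
```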
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
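A minimal sketch of the epidemiologically-informed idea behind EINNs: a neural forecaster is regularized so that its outputs also satisfy a mechanistic model, here a simple SIR system (the network, rates, and loss weighting are illustrative assumptions, not the paper's exact formulation):

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 3))  # t -> (S, I, R)
beta, gamma = 0.3, 0.1  # assumed SIR transmission and recovery rates

def sir_residual(t):
    """Penalize deviation from dS/dt=-bSI, dI/dt=bSI-gI, dR/dt=gI via autograd."""
    t = t.requires_grad_(True)
    S, I, R = net(t).unbind(dim=-1)
    dS, dI, dR = [torch.autograd.grad(y.sum(), t, create_graph=True)[0].squeeze(-1)
                  for y in (S, I, R)]
    return ((dS + beta * S * I) ** 2 +
            (dI - beta * S * I + gamma * I) ** 2 +
            (dR - gamma * I) ** 2).mean()

# Total loss: a data-fitting term on observed cases + lambda * sir_residual(t_grid).
```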
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science, as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions, as well as state-of-the-art numerical solvers such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
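For reference, the mesh-free PINN approach examined in the entry above can be sketched as a PDE-residual loss obtained by automatic differentiation, here for the 1D heat equation (GatedPINN's learned domain decomposition is omitted, and all shapes are assumptions):

```python
import torch
import torch.nn as nn

u = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(),
                  nn.Linear(64, 1))  # (x, t) -> u(x, t)

def heat_residual(x, t, kappa=0.1):
    """PDE residual u_t - kappa * u_xx for the 1D heat equation, via autograd.
    x, t: (n, 1) collocation points sampled anywhere in the domain (mesh-free)."""
    x, t = x.requires_grad_(True), t.requires_grad_(True)
    out = u(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(out.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(out.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return ((u_t - kappa * u_xx) ** 2).mean()

# Minimized jointly with initial- and boundary-condition losses at sampled points.
```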
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
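A toy numerical analogy for the multi-level decomposition behind the multipole operator above: same-cell kernel interactions are computed exactly, while interactions between different cells are approximated through cell centers, keeping the cost roughly linear (this illustrates the decomposition only, not the paper's graph architecture):

```python
import numpy as np

def two_level_matvec(f, x, m=8):
    """Toy multipole-style evaluation of (K f)(x_i) = sum_j k(x_i, x_j) f(x_j).
    Same-cell interactions are exact; interactions between different cells are
    approximated by a monopole at each cell center. Cost is O(n*m + (n/m)^2)
    instead of O(n^2). Assumes x is sorted and len(x) is divisible by m."""
    k = lambda a, b: np.exp(-np.abs(a[:, None] - b[None, :]))
    xb, fb = x.reshape(-1, m), f.reshape(-1, m)   # group sorted points into cells
    centers, masses = xb.mean(axis=1), fb.sum(axis=1)
    near = np.concatenate([k(xb[c], xb[c]) @ fb[c] for c in range(len(xb))])
    K_far = k(centers, centers)
    np.fill_diagonal(K_far, 0.0)                  # exclude same-cell pairs
    far = np.repeat(K_far @ masses, m)            # broadcast cell value to members
    return near + far

x = np.linspace(0.0, 1.0, 128)
result = two_level_matvec(np.sin(2 * np.pi * x), x)
```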
- Stochasticity in Neural ODEs: An Empirical Study [68.8204255655161]
Regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
We show that data augmentation during training improves the performance of both deterministic and stochastic versions of the same model.
However, the improvements obtained by data augmentation completely eliminate the empirical gains from stochastic regularization, making the difference in performance between neural ODEs and neural SDEs negligible.
arXiv Detail & Related papers (2020-02-22T22:12:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.