Equivariant Neural Simulators for Stochastic Spatiotemporal Dynamics
- URL: http://arxiv.org/abs/2305.14286v2
- Date: Mon, 30 Oct 2023 09:00:55 GMT
- Title: Equivariant Neural Simulators for Stochastic Spatiotemporal Dynamics
- Authors: Koen Minartz, Yoeri Poels, Simon Koop, Vlado Menkovski
- Abstract summary: Equivariant Probabilistic Neural Simulation (EPNS) is a framework for autoregressive modeling of equivariant distributions over system evolutions.
EPNS considerably outperforms existing neural network-based methods for probabilistic simulation.
- Score: 4.271235935891555
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks are emerging as a tool for scalable data-driven simulation of
high-dimensional dynamical systems, especially in settings where numerical
methods are infeasible or computationally expensive. Notably, it has been shown
that incorporating domain symmetries in deterministic neural simulators can
substantially improve their accuracy, sample efficiency, and parameter
efficiency. However, to incorporate symmetries in probabilistic neural
simulators that can simulate stochastic phenomena, we need a model that
produces equivariant distributions over trajectories, rather than equivariant
function approximations. In this paper, we propose Equivariant Probabilistic
Neural Simulation (EPNS), a framework for autoregressive probabilistic modeling
of equivariant distributions over system evolutions. We use EPNS to design
models for a stochastic n-body system and stochastic cellular dynamics. Our
results show that EPNS considerably outperforms existing neural network-based
methods for probabilistic simulation. More specifically, we demonstrate that
incorporating equivariance in EPNS improves simulation quality, data
efficiency, rollout stability, and uncertainty quantification. We conclude that
EPNS is a promising method for efficient and effective data-driven
probabilistic simulation in a diverse range of domains.
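The core requirement stated in the abstract is that the model produce equivariant *distributions* over trajectories, i.e. p(g·x' | g·x) = p(x' | x) for every symmetry transformation g, rather than merely an equivariant point prediction. A minimal sketch of this property, using a toy rotation-equivariant Gaussian transition model (the drift function and noise scale below are illustrative assumptions, not the EPNS architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def equivariant_mean(x):
    # Toy rotation-equivariant drift: each particle drifts toward the centroid.
    # Because it is built from differences of positions, it commutes with rotations.
    return x + 0.1 * (x.mean(axis=0, keepdims=True) - x)

def log_density(x_next, x, sigma=0.5):
    # Isotropic Gaussian transition density log p(x_next | x), up to a constant.
    # Isotropic noise is rotation-invariant, so the whole transition is equivariant.
    diff = x_next - equivariant_mean(x)
    return -0.5 * np.sum(diff ** 2) / sigma ** 2

# A random 2D rotation g.
theta = 1.234
g = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = rng.normal(size=(5, 2))       # 5 particles in 2D
x_next = rng.normal(size=(5, 2))  # a candidate next state

# Distributional equivariance: p(g x' | g x) == p(x' | x).
lhs = log_density(x_next @ g.T, x @ g.T)
rhs = log_density(x_next, x)
assert np.isclose(lhs, rhs)
```

Autoregressive rollout then chains such equivariant transition distributions, so the induced distribution over whole trajectories inherits the symmetry.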
Related papers
- Efficient modeling of sub-kilometer surface wind with Gaussian processes and neural networks [0.0]
Wind represents a particularly challenging variable to model due to its high spatial and temporal variability.
This paper presents a novel approach that integrates Gaussian processes (GPs) and neural networks to model surface wind gusts.
We discuss the effect of different modeling choices, as well as different degrees of approximation, and present our results for a case study.
arXiv Detail & Related papers (2024-05-21T09:07:47Z) - Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators.
We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z) - A Multi-Grained Symmetric Differential Equation Model for Learning Protein-Ligand Binding Dynamics [74.93549765488103]
In drug discovery, molecular dynamics simulation provides a powerful tool for predicting binding affinities, estimating transport properties, and exploring pocket sites.
We propose NeuralMD, the first machine learning surrogate that can facilitate numerical MD and provide accurate simulations in protein-ligand binding.
We show the efficiency and effectiveness of NeuralMD, with a 2000$\times$ speedup over standard numerical MD simulation and outperforming all other ML approaches by up to 80% under the stability metric.
arXiv Detail & Related papers (2024-01-26T09:35:17Z) - DiffHybrid-UQ: Uncertainty Quantification for Differentiable Hybrid Neural Modeling [4.76185521514135]
We introduce a novel method, DiffHybrid-UQ, for effective and efficient uncertainty propagation and estimation in hybrid neural differentiable models.
Specifically, our approach effectively discerns and quantifies both aleatoric uncertainties, arising from data noise, and epistemic uncertainties, resulting from model-form discrepancies and data sparsity.
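The aleatoric/epistemic split described above can be illustrated with the standard ensemble-based variance decomposition: average the per-member noise variances for the aleatoric part, and take the variance of the per-member means for the epistemic part. This is a generic sketch of that decomposition, not the DiffHybrid-UQ estimator itself; all values below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble of 8 models, each predicting a mean and a noise
# variance for the same input.
means = rng.normal(loc=2.0, scale=0.3, size=8)  # per-member predictive means
noise_vars = np.full(8, 0.25)                   # per-member aleatoric variances

aleatoric = noise_vars.mean()   # average data-noise variance
epistemic = means.var()         # disagreement across members (model-form / sparsity)
total = aleatoric + epistemic   # total predictive variance

assert epistemic > 0.0
assert total > aleatoric
```

More ensemble spread (larger `epistemic`) signals regions where the model is under-determined by data, while `aleatoric` reflects irreducible noise in the observations.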
arXiv Detail & Related papers (2023-12-30T07:40:47Z) - Adaptive Conditional Quantile Neural Processes [9.066817971329899]
Conditional Quantile Neural Processes (CQNPs) are a new member of the neural processes family.
We introduce an extension of quantile regression where the model learns to focus on estimating informative quantiles.
Experiments with real and synthetic datasets demonstrate substantial improvements in predictive performance.
arXiv Detail & Related papers (2023-05-30T06:19:19Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Neural Posterior Estimation with Differentiable Simulators [58.720142291102135]
We present a new method to perform Neural Posterior Estimation (NPE) with a differentiable simulator.
We demonstrate how gradient information helps constrain the shape of the posterior and improves sample-efficiency.
arXiv Detail & Related papers (2022-07-12T16:08:04Z) - Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z) - Nonparametric likelihood-free inference with Jensen-Shannon divergence for simulator-based models with categorical output [1.4298334143083322]
Likelihood-free inference for simulator-based statistical models has attracted a surge of interest, both in the machine learning and statistics communities.
Here we derive a set of theoretical results to enable estimation, hypothesis testing and construction of confidence intervals for model parameters using computational properties of the Jensen-Shannon divergence.
Such approximation offers a rapid alternative to more-intensive approaches and can be attractive for diverse applications of simulator-based models.
arXiv Detail & Related papers (2022-05-22T18:00:13Z) - Learning Stochastic Dynamics with Statistics-Informed Neural Network [0.4297070083645049]
We introduce a machine-learning framework named statistics-informed neural network (SINN) for learning dynamics from data.
We devise mechanisms for training the neural network model to reproduce the correct statistical behavior of a target process.
We show that the obtained reduced-order model can be trained on temporally coarse-grained data and hence is well suited for rare-event simulations.
arXiv Detail & Related papers (2022-02-24T18:21:01Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.