FNOPE: Simulation-based inference on function spaces with Fourier Neural Operators
- URL: http://arxiv.org/abs/2505.22573v1
- Date: Wed, 28 May 2025 16:46:56 GMT
- Title: FNOPE: Simulation-based inference on function spaces with Fourier Neural Operators
- Authors: Guy Moss, Leah Sophie Muhle, Reinhard Drews, Jakob H. Macke, Cornelius Schröder
- Abstract summary: We introduce an efficient approach for posterior estimation using a Fourier Neural Operator (FNO) architecture with a flow matching objective. FNOPE can perform inference of function-valued parameters at a fraction of the simulation budget of state-of-the-art methods. FNOPE extends the applicability of SBI methods to new scientific domains by enabling the inference of function-valued parameters.
- Score: 5.673617376471343
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Simulation-based inference (SBI) is an established approach for performing Bayesian inference on scientific simulators. SBI so far works best on low-dimensional parametric models. However, it is difficult to infer function-valued parameters, which frequently occur in disciplines that model spatiotemporal processes such as the climate and earth sciences. Here, we introduce an approach for efficient posterior estimation, using a Fourier Neural Operator (FNO) architecture with a flow matching objective. We show that our approach, FNOPE, can perform inference of function-valued parameters at a fraction of the simulation budget of state-of-the-art methods. In addition, FNOPE supports posterior evaluation at arbitrary discretizations of the domain, as well as simultaneous estimation of vector-valued parameters. We demonstrate the effectiveness of our approach on several benchmark tasks and a challenging spatial inference task from glaciology. FNOPE extends the applicability of SBI methods to new scientific domains by enabling the inference of function-valued parameters.
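As a rough illustration of the approach described in the abstract (a spectral, FNO-style network trained with a flow matching objective to sample a function-valued parameter given an observation), the following is a minimal, self-contained PyTorch sketch. It is not the FNOPE implementation: the class names, channel counts, the single spectral block, and the straight-line (rectified-flow) interpolation are simplifying assumptions made for illustration only.

```python
# Illustrative sketch: a 1D spectral-convolution block and a conditional
# flow matching loss for posterior estimation over a discretized function.
# All names, shapes, and hyperparameters are placeholders, not the FNOPE code.
import torch
import torch.nn as nn


class SpectralConv1d(nn.Module):
    """Multiply the lowest Fourier modes of the input by learned complex weights."""

    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, grid_points)
        x_ft = torch.fft.rfft(x, dim=-1)
        out_ft = torch.zeros_like(x_ft)
        m = min(self.modes, x_ft.shape[-1])
        out_ft[..., :m] = torch.einsum(
            "bim,iom->bom", x_ft[..., :m], self.weight[..., :m]
        )
        return torch.fft.irfft(out_ft, n=x.shape[-1], dim=-1)


class VelocityNet(nn.Module):
    """Predicts the flow matching velocity field for the parameter function,
    conditioned on a co-located observation and the flow time t."""

    def __init__(self, modes: int = 16, width: int = 32):
        super().__init__()
        self.lift = nn.Conv1d(3, width, kernel_size=1)  # channels: [theta_t, x_obs, t]
        self.spectral = SpectralConv1d(width, modes)
        self.pointwise = nn.Conv1d(width, width, kernel_size=1)
        self.proj = nn.Conv1d(width, 1, kernel_size=1)

    def forward(self, theta_t, x_obs, t):
        # theta_t, x_obs: (batch, grid); t: (batch,)
        t_feat = t[:, None].expand_as(theta_t)
        h = self.lift(torch.stack([theta_t, x_obs, t_feat], dim=1))
        h = torch.relu(self.spectral(h) + self.pointwise(h))
        return self.proj(h).squeeze(1)


def flow_matching_loss(net, theta, x_obs):
    """Conditional flow matching with straight-line (rectified-flow) paths."""
    noise = torch.randn_like(theta)
    t = torch.rand(theta.shape[0], device=theta.device)
    theta_t = (1 - t[:, None]) * noise + t[:, None] * theta
    target_velocity = theta - noise
    pred_velocity = net(theta_t, x_obs, t)
    return ((pred_velocity - target_velocity) ** 2).mean()


# Training sketch: simulate (theta_i, x_i) pairs from the prior and simulator on a
# grid, then minimise flow_matching_loss. Sampling from the approximate posterior
# integrates the learned velocity field from Gaussian noise at t=0 to t=1
# (e.g. with a few Euler steps), conditioned on the observed x_obs.
```

In this simplified sketch, the discretization flexibility highlighted in the abstract would come from the spectral layer acting on Fourier modes rather than on a fixed grid; how FNOPE handles this in detail is described in the paper itself.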
Related papers
- ConDiSim: Conditional Diffusion Models for Simulation Based Inference [0.3749861135832073]
ConDiSim is a conditional diffusion model for simulation-based inference of complex systems with intractable likelihoods. It is evaluated across ten benchmark problems and two real-world test problems, where it demonstrates effective posterior approximation accuracy.
arXiv Detail & Related papers (2025-05-13T09:58:23Z)
- Mixture of neural operator experts for learning boundary conditions and model selection [0.40964539027092917]
We introduce an alternative approach to imposing boundary conditions inspired by volume penalization from numerical methods. By introducing competing experts, the approach additionally allows for model selection.
arXiv Detail & Related papers (2025-02-06T23:29:32Z)
- Efficient Learning of POMDPs with Known Observation Model in Average-Reward Setting [56.92178753201331]
We propose the Observation-Aware Spectral (OAS) estimation technique, which enables the POMDP parameters to be learned from samples collected using a belief-based policy.
We show the consistency of the OAS procedure, and we prove a regret guarantee of order $\mathcal{O}(\sqrt{T \log(T)})$ for the proposed OAS-UCRL algorithm.
arXiv Detail & Related papers (2024-10-02T08:46:34Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Simulation-based inference using surjective sequential neural likelihood estimation [50.24983453990065]
Surjective Sequential Neural Likelihood (SSNL) estimation is a novel method for simulation-based inference.
By embedding the data in a low-dimensional space, SSNL solves several issues previous likelihood-based methods had when applied to high-dimensional data sets.
arXiv Detail & Related papers (2023-08-02T10:02:38Z)
- Spherical Fourier Neural Operators: Learning Stable Dynamics on the Sphere [53.63505583883769]
We introduce Spherical FNOs (SFNOs) for learning operators on spherical geometries.
SFNOs have important implications for machine learning-based simulation of climate dynamics.
arXiv Detail & Related papers (2023-06-06T16:27:17Z)
- Neural Posterior Estimation with Differentiable Simulators [58.720142291102135]
We present a new method to perform Neural Posterior Estimation (NPE) with a differentiable simulator.
We demonstrate how gradient information helps constrain the shape of the posterior and improves sample-efficiency.
arXiv Detail & Related papers (2022-07-12T16:08:04Z)
- Arbitrary Marginal Neural Ratio Estimation for Simulation-based Inference [7.888755225607877]
We present a novel method that enables amortized inference over arbitrary subsets of the parameters, without resorting to numerical integration.
We demonstrate the applicability of the method on parameter inference of binary black hole systems from gravitational wave observations.
arXiv Detail & Related papers (2021-10-01T14:35:46Z)
- Truncated Marginal Neural Ratio Estimation [5.438798591410838]
We present a neural simulator-based inference algorithm which simultaneously offers simulation efficiency and fast empirical posterior testability.
Our approach is simulation efficient by simultaneously estimating low-dimensional marginal posteriors instead of the joint posterior.
By estimating a locally amortized posterior, our algorithm enables efficient empirical tests of the robustness of the inference results.
arXiv Detail & Related papers (2021-07-02T18:00:03Z)
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio; a minimal sketch of this ratio-estimation idea appears after this list.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Misspecification-robust likelihood-free inference in high dimensions [13.934999364767918]
We introduce an extension of the popular Bayesian optimisation based approach to approximate discrepancy functions in a probabilistic manner.
Our approach achieves computational scalability for higher dimensional parameter spaces by using separate acquisition functions and discrepancies for each parameter.
The method successfully performs computationally efficient inference in a 100-dimensional space on canonical examples and compares favourably to existing modularised ABC methods.
arXiv Detail & Related papers (2020-02-21T16:06:11Z)
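Several of the ratio-based entries above (MINIMALIST, Arbitrary Marginal Neural Ratio Estimation, Truncated Marginal Neural Ratio Estimation) build on amortized likelihood-to-evidence ratio estimation. The sketch below shows the standard classifier trick in PyTorch under simplifying assumptions; the network size, names, and training details are placeholders and do not come from any of the listed papers.

```python
# Illustrative sketch of amortized likelihood-to-evidence ratio estimation:
# a classifier is trained to distinguish jointly simulated (theta, x) pairs
# from pairs with shuffled parameters, so its logit approximates the log-ratio.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RatioClassifier(nn.Module):
    """Classifier whose logit approximates log p(x | theta) - log p(x)."""

    def __init__(self, theta_dim: int, x_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(theta_dim + x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def log_ratio(self, theta: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([theta, x], dim=-1)).squeeze(-1)


def nre_loss(model: RatioClassifier, theta: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    """Contrast jointly simulated pairs against pairs with shuffled parameters."""
    joint_logit = model.log_ratio(theta, x)            # label 1: (theta, x) ~ joint
    perm = torch.randperm(theta.shape[0])
    marginal_logit = model.log_ratio(theta[perm], x)   # label 0: theta and x independent
    return (
        F.binary_cross_entropy_with_logits(joint_logit, torch.ones_like(joint_logit))
        + F.binary_cross_entropy_with_logits(marginal_logit, torch.zeros_like(marginal_logit))
    )


# After training on simulated (theta, x) pairs, model.log_ratio(theta, x_obs) plus the
# log-prior gives an unnormalized log-posterior over theta that can be sampled, e.g. with MCMC.
```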