The Random Feature Model for Input-Output Maps between Banach Spaces
- URL: http://arxiv.org/abs/2005.10224v2
- Date: Sat, 5 Jun 2021 22:47:16 GMT
- Title: The Random Feature Model for Input-Output Maps between Banach Spaces
- Authors: Nicholas H. Nelsen and Andrew M. Stuart
- Abstract summary: The random feature model is a parametric approximation to kernel interpolation or regression methods.
We propose a methodology for use of the random feature model as a data-driven surrogate for operators that map an input Banach space to an output Banach space.
- Score: 6.282068591820945
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Well known to the machine learning community, the random feature model is a
parametric approximation to kernel interpolation or regression methods. It is
typically used to approximate functions mapping a finite-dimensional input
space to the real line. In this paper, we instead propose a methodology for use
of the random feature model as a data-driven surrogate for operators that map
an input Banach space to an output Banach space. Although the methodology is
quite general, we consider operators defined by partial differential equations
(PDEs); here, the inputs and outputs are themselves functions, with the input
parameters being functions required to specify the problem, such as initial
data or coefficients, and the outputs being solutions of the problem. Upon
discretization, the model inherits several desirable attributes from this
infinite-dimensional viewpoint, including mesh-invariant approximation error
with respect to the true PDE solution map and the capability to be trained at
one mesh resolution and then deployed at different mesh resolutions. We view
the random feature model as a non-intrusive data-driven emulator, provide a
mathematical framework for its interpretation, and demonstrate its ability to
efficiently and accurately approximate the nonlinear parameter-to-solution maps
of two prototypical PDEs arising in physical science and engineering
applications: viscous Burgers' equation and a variable coefficient elliptic
equation.
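The recipe described in the abstract can be illustrated with a minimal sketch: draw random features once, then fit their linear coefficients to input-output pairs by ridge regression. This is not the authors' implementation; the toy "operator" (a discrete antiderivative standing in for a PDE parameter-to-solution map), the cosine feature map, and all parameter values below are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "operator": map a coefficient vector a on a mesh to its cumulative
# integral (a stand-in for a PDE parameter-to-solution map; illustrative only).
n = 64  # mesh points

def true_operator(a):
    return np.cumsum(a) / n  # discrete antiderivative

# Random Fourier-type features acting on the discretized input function.
m = 256  # number of random features, drawn once and then frozen
W = rng.normal(0.0, 1.0, (m, n)) / np.sqrt(n)
b = rng.uniform(0.0, 2.0 * np.pi, m)

def features(A):                      # A: (num_samples, n)
    return np.cos(A @ W.T + b)        # -> (num_samples, m)

# Training data: random inputs and their images under the operator.
N = 200
A_train = rng.normal(0.0, 1.0, (N, n))
U_train = np.stack([true_operator(a) for a in A_train])

# Only the linear coefficients are trained: ridge regression in feature space.
Phi = features(A_train)                                              # (N, m)
lam = 1e-8
C = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ U_train)  # (m, n)

# Evaluate the surrogate on a fresh input function.
a_test = rng.normal(0.0, 1.0, n)
u_pred = features(a_test[None, :]) @ C                               # (1, n)
err = np.linalg.norm(u_pred[0] - true_operator(a_test)) \
      / np.linalg.norm(true_operator(a_test))
print(f"relative error: {err:.3f}")
```

Because the features act on the discretized function rather than on mesh-specific coordinates, the same trained coefficients can in principle be paired with features evaluated on a different mesh, which is the mesh-invariance property the paper emphasizes.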
Related papers
- Shape-informed surrogate models based on signed distance function domain encoding [8.052704959617207]
We propose a non-intrusive method to build surrogate models that approximate the solution of parameterized partial differential equations (PDEs).
Our approach is based on the combination of two neural networks (NNs).
arXiv Detail & Related papers (2024-09-19T01:47:04Z) - Physics-Informed Quantum Machine Learning: Solving nonlinear differential equations in latent spaces without costly grid evaluations [21.24186888129542]
We propose a physics-informed quantum algorithm to solve nonlinear and multidimensional differential equations.
By measuring the overlaps between states which are representations of DE terms, we construct a loss that does not require independent sequential function evaluations on grid points.
When the loss is trained variationally, our approach can be related to the differentiable quantum circuit protocol.
arXiv Detail & Related papers (2023-08-03T15:38:31Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z) - Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - Feature Engineering with Regularity Structures [4.082216579462797]
We investigate the use of models from the theory of regularity structures as features in machine learning tasks.
We provide a flexible definition of a model feature vector associated to a space-time signal, along with two algorithms which illustrate ways in which these features can be combined with linear regression.
We apply these algorithms in several numerical experiments designed to learn solutions to PDEs with a given forcing and boundary data.
arXiv Detail & Related papers (2021-08-12T17:53:47Z) - Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
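The projected fixed-point setting described above can be sketched concretely for policy evaluation with linear function approximation, where the approximate value function is sought in the span of a few basis vectors and the projected equation is solved directly (the LSTD closed form). The Markov chain, basis, and all numbers below are a hypothetical toy setup, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Markov chain for policy evaluation (hypothetical setup).
S, d = 20, 4                  # number of states, subspace dimension
P = rng.random((S, S))
P /= P.sum(axis=1, keepdims=True)     # row-stochastic transition matrix
r = rng.random(S)                     # expected rewards
gamma = 0.9
Phi = rng.normal(0.0, 1.0, (S, d))    # basis of the low-dimensional subspace

# Stationary distribution via power iteration; it weights the projection.
mu = np.full(S, 1.0 / S)
for _ in range(500):
    mu = mu @ P
D = np.diag(mu)

# Projected fixed-point equation: Phi theta = Pi_D (r + gamma P Phi theta).
# Multiplying through by Phi^T D gives the linear system A theta = b (LSTD).
A = Phi.T @ D @ (Phi - gamma * P @ Phi)
b = Phi.T @ D @ r
theta = np.linalg.solve(A, b)

# Verify: the D-weighted projection residual is numerically zero.
Pi = Phi @ np.linalg.solve(Phi.T @ D @ Phi, Phi.T @ D)
v = Phi @ theta
residual = np.linalg.norm(v - Pi @ (r + gamma * P @ v))
print(f"projected residual: {residual:.2e}")
```

The oracle inequalities in the paper bound how far this subspace solution can be from the true value function; the sketch only shows that the projected equation itself is solved exactly within the subspace.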
arXiv Detail & Related papers (2020-12-09T20:19:32Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Identification of Probability weighted ARX models with arbitrary domains [75.91002178647165]
PieceWise Affine models guarantee universal approximation, local linearity, and equivalence to other classes of hybrid systems.
In this work, we focus on the identification of PieceWise Auto Regressive with eXogenous input models with arbitrary regions (NPWARX).
The architecture is conceived following the Mixture of Expert concept, developed within the machine learning field.
arXiv Detail & Related papers (2020-09-29T12:50:33Z) - Model Reduction and Neural Networks for Parametric PDEs [9.405458160620533]
We develop a framework for data-driven approximation of input-output maps between infinite-dimensional spaces.
The proposed approach is motivated by the recent successes of neural networks and deep learning.
For a class of input-output maps, and suitably chosen probability measures on the inputs, we prove convergence of the proposed approximation methodology.
arXiv Detail & Related papers (2020-05-07T00:09:27Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.