Wind Field Reconstruction with Adaptive Random Fourier Features
- URL: http://arxiv.org/abs/2102.02365v1
- Date: Thu, 4 Feb 2021 01:42:08 GMT
- Title: Wind Field Reconstruction with Adaptive Random Fourier Features
- Authors: Jonas Kiessling, Emanuel Ström and Raúl Tempone
- Abstract summary: We investigate the use of spatial methods for reconstructing the horizontal near-surface wind field given a sparse set of measurements.
We include a physically motivated divergence penalty term $|\nabla \cdot \beta(\pmb x)|^2$, as well as a penalty on the Sobolev norm.
We devise an adaptive Metropolis-Hastings algorithm for sampling the frequencies of the optimal distribution.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate the use of spatial interpolation methods for reconstructing
the horizontal near-surface wind field given a sparse set of measurements. In
particular, random Fourier features is compared to a set of benchmark methods
including Kriging and Inverse distance weighting. Random Fourier features is a
linear model $\beta(\pmb x) = \sum_{k=1}^K \beta_k e^{i\omega_k \pmb x}$
approximating the velocity field, with frequencies $\omega_k$ randomly sampled
and amplitudes $\beta_k$ trained to minimize a loss function. We include a
physically motivated divergence penalty term $|\nabla \cdot \beta(\pmb x)|^2$,
as well as a penalty on the Sobolev norm. We derive a bound on the
generalization error and derive a sampling density that minimizes the bound.
Following (arXiv:2007.10683 [math.NA]), we devise an adaptive
Metropolis-Hastings algorithm for sampling the frequencies of the optimal
distribution. In our experiments, our random Fourier features model outperforms
the benchmark models.
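As a rough illustration of the model described above, the sketch below fits a random-Fourier-features field $\beta(\pmb x) = \sum_{k=1}^K \beta_k e^{i\omega_k \pmb x}$ to scattered 2-D samples by ridge-regularized least squares. This is not the authors' code: the synthetic target, the fixed Gaussian frequency distribution, and the simple $|\omega_k|^2$-weighted ridge term (a crude stand-in for the Sobolev penalty) are all assumptions; the paper additionally adapts the frequency distribution with Metropolis-Hastings and adds a divergence penalty, both omitted here.

```python
# Minimal sketch: random Fourier features regression on sparse 2-D samples.
# Frequencies are drawn from a fixed Gaussian; amplitudes are trained by
# regularized least squares. Illustrative only, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measurements": sparse locations with a smooth stand-in field
# (a hypothetical wind component, not real data).
n, K = 200, 64
X = rng.uniform(-1.0, 1.0, size=(n, 2))               # measurement locations
y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])  # target field values

omega = rng.normal(0.0, 3.0, size=(K, 2))             # sampled frequencies w_k
Phi = np.exp(1j * X @ omega.T)                        # n x K feature matrix

# Ridge term weighted by 1 + |w_k|^2: a crude Sobolev-type penalty that
# damps high-frequency amplitudes.
lam = 1e-3
penalty = lam * np.diag(1.0 + np.sum(omega**2, axis=1))
A = Phi.conj().T @ Phi + penalty
beta = np.linalg.solve(A, Phi.conj().T @ y)           # trained amplitudes b_k

def predict(x_new):
    """Evaluate the real part of the fitted field at new locations."""
    return np.real(np.exp(1j * np.asarray(x_new) @ omega.T) @ beta)

rmse = np.sqrt(np.mean((predict(X) - y) ** 2))        # training-set fit error
```

In the paper, the key extra step is resampling the $\omega_k$ so that frequencies carrying large amplitude are kept, which is what the adaptive Metropolis-Hastings scheme provides; with fixed Gaussian frequencies as above, the quality of the fit depends heavily on the chosen frequency scale.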
Related papers
- Outsourced diffusion sampling: Efficient posterior inference in latent spaces of generative models [65.71506381302815]
We propose to amortize the cost of sampling from a posterior distribution of the form $p(\mathbf{x}\mid\mathbf{y}) \propto p_\theta(\mathbf{x})$.
For many models and constraints of interest, the posterior in the noise space is smoother than the posterior in the data space, making it more amenable to such amortized inference.
arXiv Detail & Related papers (2025-02-10T19:49:54Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Model-adapted Fourier sampling for generative compressed sensing [7.130302992490975]
We study generative compressed sensing when the measurement matrix is randomly subsampled from a unitary matrix.
We construct a model-adapted sampling strategy with an improved sample complexity of $\textit{O}(kd\|\boldsymbol{\alpha}\|_2^2)$ measurements.
arXiv Detail & Related papers (2023-10-08T03:13:16Z) - Stochastic optimal transport in Banach Spaces for regularized estimation
of multivariate quantiles [0.0]
We introduce a new algorithm for solving entropic optimal transport (EOT) between two absolutely continuous probability measures $\mu$ and $\nu$.
We study the almost sure convergence of our algorithm that takes its values in an infinite-dimensional Banach space.
arXiv Detail & Related papers (2023-02-02T10:02:01Z) - Approximate Function Evaluation via Multi-Armed Bandits [51.146684847667125]
We study the problem of estimating the value of a known smooth function $f$ at an unknown point $\boldsymbol{\mu} \in \mathbb{R}^n$, where each component $\mu_i$ can be sampled via a noisy oracle.
We design an instance-adaptive algorithm that learns to sample according to the importance of each coordinate, and with probability at least $1-\delta$ returns an $\epsilon$-accurate estimate of $f(\boldsymbol{\mu})$.
arXiv Detail & Related papers (2022-03-18T18:50:52Z) - Random quantum circuits transform local noise into global white noise [118.18170052022323]
We study the distribution over measurement outcomes of noisy random quantum circuits in the low-fidelity regime.
For local noise that is sufficiently weak and unital, correlations (measured by the linear cross-entropy benchmark) between the output distribution $p_{\text{noisy}}$ of a generic noisy circuit instance shrink exponentially.
If the noise is incoherent, the output distribution approaches the uniform distribution $p_{\text{unif}}$ at precisely the same rate.
arXiv Detail & Related papers (2021-11-29T19:26:28Z) - Sharp Analysis of Random Fourier Features in Classification [9.383533125404755]
We show for the first time that random Fourier features classification can achieve $O(1/\sqrt{n})$ learning rate with only $\Omega(\sqrt{n} \log n)$ features.
arXiv Detail & Related papers (2021-09-22T09:49:27Z) - Optimal Robust Linear Regression in Nearly Linear Time [97.11565882347772]
We study the problem of high-dimensional robust linear regression where a learner is given access to $n$ samples from the generative model $Y = \langle X, w^* \rangle + \epsilon$.
We propose estimators for this problem under two settings: (i) $X$ is $L_4$-$L_2$ hypercontractive, $\mathbb{E}[XX^\top]$ has bounded condition number and $\epsilon$ has bounded variance and (ii) $X$ is sub-Gaussian with identity second moment and $\epsilon$ is
arXiv Detail & Related papers (2020-07-16T06:44:44Z) - Tight Nonparametric Convergence Rates for Stochastic Gradient Descent
under the Noiseless Linear Model [0.0]
We analyze the convergence of single-pass, fixed step-size gradient descent on the least-square risk under this model.
As a special case, we analyze an online algorithm for estimating a real function on the unit interval from the noiseless observation of its value at randomly sampled points.
arXiv Detail & Related papers (2020-06-15T08:25:50Z) - Non-Adaptive Adaptive Sampling on Turnstile Streams [57.619901304728366]
We give the first relative-error algorithms for column subset selection, subspace approximation, projective clustering, and volume on turnstile streams that use space sublinear in $n$.
Our adaptive sampling procedure has a number of applications to various data summarization problems that either improve state-of-the-art or have only been previously studied in the more relaxed row-arrival model.
arXiv Detail & Related papers (2020-04-23T05:00:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.