Large-scale magnetic field maps using structured kernel interpolation
for Gaussian process regression
- URL: http://arxiv.org/abs/2310.16574v1
- Date: Wed, 25 Oct 2023 11:58:18 GMT
- Title: Large-scale magnetic field maps using structured kernel interpolation
for Gaussian process regression
- Authors: Clara Menzen and Marnix Fetter and Manon Kok
- Abstract summary: We present a mapping algorithm to compute large-scale magnetic field maps in indoor environments.
In our simulations, we show that our method achieves better accuracy than current state-of-the-art methods on magnetic field maps with a growing mapping area.
- Score: 0.9208007322096532
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a mapping algorithm to compute large-scale magnetic field maps in
indoor environments with approximate Gaussian process (GP) regression. Mapping
the spatial variations in the ambient magnetic field can be used for
localization algorithms in indoor areas. To compute such a map, GP regression
is a suitable tool because it provides predictions of the magnetic field at new
locations along with uncertainty quantification. Because full GP regression has
a complexity that grows cubically with the number of data points,
approximations for GPs have been extensively studied. In this paper, we build
on the structured kernel interpolation (SKI) framework, speeding up inference
by exploiting efficient Krylov subspace methods. More specifically, we
incorporate SKI with derivatives (D-SKI) into the scalar potential model for
magnetic field modeling and compute both predictive mean and covariance with a
complexity that is linear in the data points. In our simulations, we show that
our method achieves better accuracy than current state-of-the-art methods on
magnetic field maps with a growing mapping area. In our large-scale
experiments, we construct magnetic field maps from up to 40000
three-dimensional magnetic field measurements in less than two minutes on a
standard laptop.
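As a rough illustration of the SKI building block, the sketch below fits a plain SKI (KISS-GP) model in GPyTorch to one scalar field component at 3-D positions. It is not the paper's D-SKI scalar-potential model, which places the GP on a latent potential and observes its derivatives; the grid size, kernel choice, and synthetic data here are all assumptions.

```python
# Minimal SKI (KISS-GP) sketch in GPyTorch. Plain SKI on a scalar target,
# NOT the authors' D-SKI scalar-potential model.
import torch
import gpytorch

class SKIMagneticFieldGP(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood, grid_size=30):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        # SKI: interpolate an RBF kernel onto a regular 3-D grid so that the
        # matrix-vector products used by GPyTorch's Krylov (CG/Lanczos)
        # solvers cost O(n + #grid points) instead of O(n^2).
        self.covar_module = gpytorch.kernels.GridInterpolationKernel(
            gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel(ard_num_dims=3)),
            grid_size=grid_size, num_dims=3,
        )

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

# Toy data: positions in a room and one field component (hypothetical).
x = torch.rand(5000, 3)
y = torch.sin(4 * x[:, 0]) * torch.cos(3 * x[:, 1]) + 0.05 * torch.randn(5000)

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = SKIMagneticFieldGP(x, y, likelihood)
model.train(); likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for _ in range(30):
    optimizer.zero_grad()
    loss = -mll(model(x), y)
    loss.backward()
    optimizer.step()

model.eval(); likelihood.eval()
with torch.no_grad(), gpytorch.settings.fast_pred_var():
    pred = likelihood(model(torch.rand(10, 3)))  # predictive mean and variance
```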
Related papers
- High-Dimensional Gaussian Process Regression with Soft Kernel Interpolation [0.8057006406834466]
We introduce Soft Kernel Interpolation (SoftKI), designed for scalable Gaussian process (GP) regression on high-dimensional datasets.
Inspired by Structured Kernel Interpolation (SKI), which approximates a GP kernel via interpolation on a structured lattice, SoftKI approximates a kernel via softmax interpolation from a smaller number of learned interpolation points.
By abandoning the lattice structure used in SKI-based methods, SoftKI separates the cost of forming an approximate GP kernel from the dimensionality of the data, making it well-suited for high-dimensional datasets.
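A minimal NumPy sketch of the softmax-interpolation idea: dense softmax weights over a small set of learned points replace SKI's sparse lattice weights. The exact SoftKI parameterization (logits, temperature, how the points are learned) is an assumption here.

```python
# Hedged sketch of softmax kernel interpolation; assumes squared-distance logits.
import numpy as np

def rbf(a, b, lengthscale=0.5):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def softmax_weights(x, z, temperature=0.1):
    # W[i, j] = softmax_j( -||x_i - z_j||^2 / temperature )
    logits = -((x[:, None, :] - z[None, :, :]) ** 2).sum(-1) / temperature
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.random((2000, 10))   # high-dimensional inputs: no lattice required
z = rng.random((64, 10))     # m = 64 learned interpolation points (assumed fixed here)
W = softmax_weights(x, z)    # (n, m): cost scales with m, not with a grid
# In practice one never forms the n x n matrix; only matvecs with W K_zz W^T.
K_approx = W @ rbf(z, z) @ W.T
```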
arXiv Detail & Related papers (2024-10-28T18:13:56Z) - Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood
Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
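The inversion-free core can be illustrated with a generic Hutchinson-style estimator: Monte Carlo probe vectors plus a conjugate-gradient solver replace any explicit matrix inverse. This is a hedged sketch of the general trick, not the authors' unrolled estimator.

```python
# Estimate tr(Sigma^{-1} dSigma), i.e. the log-determinant gradient, without
# inverting Sigma: probe vectors + iterative CG solves.
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def logdet_grad_trace(sigma_mv, d_sigma_mv, n, num_probes=16, rng=None):
    """sigma_mv(v) -> Sigma @ v and d_sigma_mv(v) -> dSigma @ v are the only
    access to the matrices (matrix-vector products, never explicit inverses)."""
    rng = rng or np.random.default_rng()
    op = LinearOperator((n, n), matvec=sigma_mv)
    total = 0.0
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        u, info = cg(op, v)                   # u = Sigma^{-1} v, solved iteratively
        assert info == 0, "CG did not converge"
        total += u @ d_sigma_mv(v)            # v^T Sigma^{-1} dSigma v
    return total / num_probes

# Toy check on a small dense SPD matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
Sigma = A @ A.T + 200 * np.eye(200)
est = logdet_grad_trace(lambda v: Sigma @ v, lambda v: v, 200, rng=rng)
exact = np.trace(np.linalg.inv(Sigma))        # dSigma = I for this check
```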
arXiv Detail & Related papers (2023-06-05T21:08:34Z) - Gaussian process regression and conditional Karhunen-Loève models
for data assimilation in inverse problems [68.8204255655161]
We present a model inversion algorithm, CKLEMAP, for data assimilation and parameter estimation in partial differential equation models.
The CKLEMAP method provides better scalability compared to the standard MAP method.
arXiv Detail & Related papers (2023-01-26T18:14:12Z) - Spatially scalable recursive estimation of Gaussian process terrain maps
using local basis functions [0.9208007322096532]
Online mapping of nonlinear terrains can be used to improve position estimates when an agent returns to a previously mapped area.
GP mapping algorithms have increasing computational demands as the mapped area expands.
We show experimentally that our algorithm is faster than existing methods when the mapped area is large.
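One way to get constant per-measurement cost is sketched below: represent the map as a weighted sum of basis functions and update the weight posterior recursively with a standard Kalman/RLS measurement update. The squared-exponential basis functions are an assumption; the paper's specific local basis construction is not reproduced.

```python
# Hedged sketch of recursive (online) GP mapping with basis functions:
# f(x) ~ phi(x) @ w, with the weight posterior updated one point at a time.
import numpy as np

class RecursiveBasisGP:
    def __init__(self, centers, lengthscale=0.5, prior_var=1.0, noise_var=0.01):
        self.centers = centers
        self.ls = lengthscale
        self.noise = noise_var
        m = len(centers)
        self.w = np.zeros(m)             # posterior mean of weights
        self.P = prior_var * np.eye(m)   # posterior covariance of weights

    def _phi(self, x):
        d2 = ((x - self.centers) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / self.ls**2)

    def update(self, x, y):
        # Kalman/RLS measurement update: O(m^2) per point, independent of history.
        phi = self._phi(x)
        s = phi @ self.P @ phi + self.noise
        k = self.P @ phi / s
        self.w += k * (y - phi @ self.w)
        self.P -= np.outer(k, phi @ self.P)

    def predict(self, x):
        phi = self._phi(x)
        return phi @ self.w, phi @ self.P @ phi + self.noise

rng = np.random.default_rng(0)
gp = RecursiveBasisGP(centers=rng.random((100, 2)))
for _ in range(500):
    xi = rng.random(2)
    yi = np.sin(4 * xi[0]) + 0.1 * rng.standard_normal()
    gp.update(xi, yi)
mean, var = gp.predict(np.array([0.5, 0.5]))
```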
arXiv Detail & Related papers (2022-10-17T15:13:41Z) - Information Entropy Initialized Concrete Autoencoder for Optimal Sensor
Placement and Reconstruction of Geophysical Fields [58.720142291102135]
We propose a new approach to the optimal placement of sensors for reconstructing geophysical fields from sparse measurements.
We demonstrate our method on two examples: (a) temperature and (b) salinity fields around the Barents Sea and the Svalbard group of islands.
We find that the obtained optimal sensor locations have clear physical interpretation and correspond to the boundaries between sea currents.
arXiv Detail & Related papers (2022-06-28T12:43:38Z) - Magnetic Field Prediction Using Generative Adversarial Networks [0.0]
We predict magnetic field values at a random point in space by using a generative adversarial network (GAN) structure.
The deep learning (DL) architecture consists of two neural networks: a generator, which predicts missing field values of a given magnetic field, and a critic, which is trained to calculate the statistical distance between real and generated magnetic field distributions.
Our trained generator has learned to predict the missing field values with a median reconstruction test error of 5.14%, when a single coherent region of field points is missing, and 5.86%, when only a few point measurements in space are available.
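A bare-bones PyTorch sketch of the described two-network setup; the layer sizes, the inpainting input format (masked field plus mask), and the WGAN-style critic objective are all assumptions, not the paper's architecture.

```python
# Hypothetical generator/critic pair for field inpainting.
import torch
import torch.nn as nn

FIELD_DIM = 32 * 32             # assumed flattened field patch size

generator = nn.Sequential(      # input: field with missing entries zeroed + mask
    nn.Linear(2 * FIELD_DIM, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, FIELD_DIM),
)
critic = nn.Sequential(         # scalar score approximating a statistical distance
    nn.Linear(FIELD_DIM, 512), nn.ReLU(),
    nn.Linear(512, 1),
)

def critic_loss(real, fake):
    # Wasserstein-style critic objective (assumption).
    return critic(fake).mean() - critic(real).mean()

field = torch.randn(8, FIELD_DIM)
mask = (torch.rand(8, FIELD_DIM) > 0.3).float()   # 1 = observed point
filled = generator(torch.cat([field * mask, mask], dim=1))
loss_c = critic_loss(field, filled.detach())
```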
arXiv Detail & Related papers (2022-03-14T12:31:54Z) - Physics-Informed Machine Learning Method for Large-Scale Data
Assimilation Problems [48.7576911714538]
We extend the physics-informed conditional Karhunen-Loève expansion (PICKLE) method for modeling subsurface flow with unknown flux (Neumann) and varying head (Dirichlet) boundary conditions.
We demonstrate that the PICKLE method is comparable in accuracy with the standard maximum a posteriori (MAP) method, but is significantly faster than MAP for large-scale problems.
arXiv Detail & Related papers (2021-07-30T18:43:14Z) - Gaussian Process Subspace Regression for Model Reduction [7.41244589428771]
Subspace-valued functions arise in a wide range of problems, including parametric reduced order modeling (PROM).
In PROM, each parameter point can be associated with a subspace, which is used for Petrov-Galerkin projections of large system matrices.
We propose a novel Bayesian nonparametric model for subspace prediction: the Gaussian Process Subspace regression (GPS) model.
arXiv Detail & Related papers (2021-07-09T20:41:23Z) - Scalable Variational Gaussian Processes via Harmonic Kernel
Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for
Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
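A hedged sketch of the feature-map idea: approximate the RBF kernel with Fourier features and differentiate the feature map to handle derivative observations. For brevity the frequencies below are random samples; SLEIPNIR instead chooses them by deterministic quadrature, which is what yields the deterministic error bounds.

```python
# Fourier features for a GP with derivative observations (random frequencies
# here; quadrature frequencies in SLEIPNIR).
import numpy as np

def fourier_features(x, omega):
    # phi(x) = [cos(x @ Omega); sin(x @ Omega)] / sqrt(m), so k ~ phi phi^T
    proj = x @ omega
    m = omega.shape[1]
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=1) / np.sqrt(m)

def fourier_feature_grad(x, omega, dim):
    # d phi / d x_dim: derivative observations share the same weight vector.
    proj = x @ omega
    m = omega.shape[1]
    return np.concatenate(
        [-np.sin(proj) * omega[dim], np.cos(proj) * omega[dim]], axis=1
    ) / np.sqrt(m)

rng = np.random.default_rng(0)
d, m = 3, 256
omega = rng.standard_normal((d, m))        # RBF spectral samples, lengthscale 1
x = rng.random((1000, d))
Phi = fourier_features(x, omega)           # (n, 2m) feature matrix
dPhi0 = fourier_feature_grad(x, omega, 0)  # features for df/dx_0 observations
# GP regression then reduces to Bayesian linear regression on the stacked
# features [Phi; dPhi0], with cost linear in the number of observations n.
```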
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.