Learning Structures in Earth Observation Data with Gaussian Processes
- URL: http://arxiv.org/abs/2012.11922v1
- Date: Tue, 22 Dec 2020 10:46:37 GMT
- Title: Learning Structures in Earth Observation Data with Gaussian Processes
- Authors: Fernando Mateo, Jordi Munoz-Mari, Valero Laparra, Jochem Verrelst,
Gustau Camps-Valls
- Abstract summary: This paper reviews the main theoretical GP developments in the field.
New algorithms are discussed that respect the signal and noise characteristics, provide feature rankings automatically, and supply associated uncertainty intervals.
- Score: 67.27044745471207
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian Processes (GPs) have experienced tremendous success in geoscience in
general, and in bio-geophysical parameter retrieval in particular, in recent years. GPs
constitute a solid Bayesian framework in which many function approximation
problems can be formulated consistently. This paper reviews the main theoretical GP
developments in the field. We review new algorithms that respect the signal and noise
characteristics, that provide feature rankings automatically, and that supply the
associated uncertainty intervals needed to transport GP models in
space and time. All these developments are illustrated in the field of
geoscience and remote sensing, at local and global scales, through a set of
illustrative examples.
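Two of the reviewed properties can be sketched concretely: an anisotropic (ARD) kernel whose learned per-feature length-scales provide an automatic feature ranking, and a predictive variance that provides uncertainty intervals. The sketch below uses scikit-learn's GP regressor on invented toy data; the feature count, kernel choice, and data are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy retrieval problem: two informative input features, one pure-noise feature.
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(80, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.randn(80)

# Anisotropic RBF (one length-scale per feature, i.e. ARD) plus a noise term
# that lets the model respect the signal/noise characteristics of the data.
kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predictive mean and standard deviation give pointwise uncertainty intervals.
mean, std = gp.predict(X[:5], return_std=True)

# Learned length-scales: large values flag inputs the model ignores, so their
# inverses act as an automatic feature-relevance ranking.
length_scales = np.atleast_1d(gp.kernel_.k1.length_scale)
relevance = 1.0 / length_scales
```

The interval mean ± 1.96·std is the usual 95% pointwise credible interval that makes GP predictions transportable with a quantified confidence.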
Related papers
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z) - Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging a random feature-based approximation to perform scalable online prediction and model updates, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
arXiv Detail & Related papers (2021-10-13T15:11:25Z) - Surface Warping Incorporating Machine Learning Assisted Domain
Likelihood Estimation: A New Paradigm in Mine Geology Modelling and
Automation [68.8204255655161]
A Bayesian warping technique has been proposed to reshape modeled surfaces based on geochemical and spatial constraints imposed by newly acquired blasthole data.
This paper focuses on incorporating machine learning in this warping framework to make the likelihood generalizable.
Its foundation is laid by a Bayesian computation in which the geological domain likelihood given the chemistry, p(g|c), plays a similar role to p(y(c)|g).
arXiv Detail & Related papers (2021-02-15T10:37:52Z) - Deep Gaussian Processes for geophysical parameter retrieval [15.400481898772158]
This paper introduces deep Gaussian processes (DGPs) for geophysical parameter retrieval.
Unlike the standard full GP model, the DGP accounts for complicated (modular, hierarchical) processes, and improves prediction accuracy over standard full and sparse GP models.
arXiv Detail & Related papers (2020-12-07T14:44:04Z) - Inter-domain Deep Gaussian Processes [45.28237107466283]
We propose an extension of inter-domain shallow GPs that combines the advantages of inter-domain and deep Gaussian processes (DGPs)
We demonstrate how to leverage existing approximate inference methods to perform simple and scalable approximate inference using inter-domain features in DGPs.
arXiv Detail & Related papers (2020-11-01T04:03:35Z) - Sparse Gaussian Process Variational Autoencoders [24.86751422740643]
Existing approaches for performing inference in GP-DGMs do not support sparse GP approximations based on inducing points.
We develop the sparse Gaussian process variational autoencoder (GP-VAE), characterised by the use of partial inference networks for parameterising sparse GP approximations.
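Sparse GP approximations of the kind this entry relies on summarize the full training set with a small set of inducing points. A minimal NumPy sketch of one classical variant, the subset-of-regressors predictive mean, is given below; it is an illustrative assumption, not the paper's actual model or inference scheme.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    """Squared-exponential kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

rng = np.random.RandomState(1)
X = rng.uniform(-3, 3, size=(200, 1))           # 200 noisy training points
y = np.sin(X[:, 0]) + 0.1 * rng.randn(200)

Z = np.linspace(-3, 3, 15)[:, None]             # 15 inducing points stand in for all 200
noise = 0.1 ** 2                                # observation noise variance

Kzz = rbf(Z, Z)
Kzx = rbf(Z, X)

# Subset-of-regressors predictive mean:
#   mu(x*) = K_{*Z} (K_{ZX} K_{XZ} + sigma^2 K_{ZZ})^{-1} K_{ZX} y
A = Kzx @ Kzx.T + noise * Kzz + 1e-8 * np.eye(len(Z))  # jitter for stability
w = np.linalg.solve(A, Kzx @ y)

X_test = np.linspace(-3, 3, 50)[:, None]
mu = rbf(X_test, Z) @ w                          # predictions from only 15 basis functions
```

Cost drops from O(n^3) in the number of data points to O(m^2 n) with m inducing points, which is what makes sparse GP layers tractable inside larger models such as VAEs.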
arXiv Detail & Related papers (2020-10-20T10:19:56Z) - A Perspective on Gaussian Processes for Earth Observation [23.66931064985429]
Earth observation by airborne and satellite remote sensing and in-situ observations play a fundamental role in monitoring our planet.
Machine learning, and Gaussian processes (GPs) in particular, have attained outstanding results in the estimation of bio-geo-physical variables.
Despite great advances in forward and inverse modelling, GP models still have to face important challenges.
arXiv Detail & Related papers (2020-07-02T16:44:11Z) - Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning
via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z) - Ensemble of Sparse Gaussian Process Experts for Implicit Surface Mapping
with Streaming Data [13.56926815833324]
We learn a compact and continuous implicit surface map of an environment from a stream of range data with known poses.
Instead of inserting all arriving data into the GP models, we greedily trade-off between model complexity and prediction error.
The results show that we can learn compact and accurate implicit surface models under different conditions.
arXiv Detail & Related papers (2020-02-12T11:06:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.