Deep Gaussian Processes for geophysical parameter retrieval
- URL: http://arxiv.org/abs/2012.12099v1
- Date: Mon, 7 Dec 2020 14:44:04 GMT
- Title: Deep Gaussian Processes for geophysical parameter retrieval
- Authors: Daniel Heestermans Svendsen, Pablo Morales-Álvarez, Rafael Molina,
Gustau Camps-Valls
- Abstract summary: This paper introduces deep Gaussian processes (DGPs) for geophysical parameter retrieval.
Unlike the standard full GP model, the DGP accounts for complicated (modular, hierarchical) processes and improves prediction accuracy over standard full and sparse GP models.
- Score: 15.400481898772158
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper introduces deep Gaussian processes (DGPs) for geophysical
parameter retrieval. Unlike the standard full GP model, the DGP accounts for
complicated (modular, hierarchical) processes, provides an efficient solution
that scales well to large datasets, and improves prediction accuracy over
standard full and sparse GP models. We give empirical evidence of performance
for estimation of surface dew point temperature from infrared sounding data.
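A deep GP composes GP-distributed functions hierarchically, so the output of one layer becomes the input of the next. A minimal NumPy sketch of sampling from a two-layer DGP prior (the RBF kernel, lengthscales, and jitter below are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp_layer(x, lengthscale, rng):
    """Draw one function sample from a zero-mean GP prior at inputs x."""
    K = rbf_kernel(x, x, lengthscale) + 1e-8 * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)

# Layer 1: a GP sample warps the inputs.
h = sample_gp_layer(x, lengthscale=1.0, rng=rng)
# Layer 2: a second GP is evaluated at the warped inputs h(x),
# giving a non-stationary, hierarchically composed function.
f = sample_gp_layer(h, lengthscale=1.0, rng=rng)

print(f.shape)  # (50,)
```

Inference in a real DGP requires approximate methods (e.g. variational inference over inducing points); this sketch only illustrates the hierarchical prior.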
Related papers
- Model-Based Reparameterization Policy Gradient Methods: Theory and
Practical Algorithms [88.74308282658133]
Reparameterization (RP) Policy Gradient Methods (PGMs) have been widely adopted for continuous control tasks in robotics and computer graphics.
Recent studies have revealed that, when applied to long-term reinforcement learning problems, model-based RP PGMs may experience chaotic and non-smooth optimization landscapes.
We propose a spectral normalization method to mitigate the exploding variance issue caused by long model unrolls.
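Spectral normalization rescales a weight matrix by its largest singular value so that the mapping is 1-Lipschitz, which bounds how much variance can grow across a long unroll. A minimal sketch using power iteration (a generic illustration, not the paper's training procedure):

```python
import numpy as np

def spectral_normalize(W, n_iters=500):
    """Divide W by its largest singular value, estimated by power
    iteration, so the normalized matrix has spectral norm ~1."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(W.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    sigma = u @ (W @ v)  # estimate of the dominant singular value
    return W / sigma

W = np.random.default_rng(1).standard_normal((8, 8)) * 3.0
W_sn = spectral_normalize(W)
sigma_max = np.linalg.svd(W_sn, compute_uv=False)[0]
print(sigma_max)  # very close to 1.0
```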
arXiv Detail & Related papers (2023-10-30T18:43:21Z)
- Beyond Intuition, a Framework for Applying GPs to Real-World Data [21.504659500727985]
We propose a framework to identify the suitability of GPs to a given problem and how to set up a robust and well-specified GP model.
We apply the framework to a case study of glacier elevation change yielding more accurate results at test time.
arXiv Detail & Related papers (2023-07-06T16:08:47Z)
- Interactive Segmentation as Gaussian Process Classification [58.44673380545409]
Click-based interactive segmentation (IS) aims to extract the target objects under user interaction.
Most of the current deep learning (DL)-based methods mainly follow the general pipelines of semantic segmentation.
We propose to formulate the IS task as a Gaussian process (GP)-based pixel-wise binary classification model on each image.
arXiv Detail & Related papers (2023-02-28T14:01:01Z) - Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z) - Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
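The random feature-based approximation mentioned above replaces the kernel with an explicit finite-dimensional feature map whose inner product approximates it, which is what makes online updates scalable. A minimal sketch of random Fourier features for an RBF kernel (Rahimi & Recht; the dimensions and lengthscale here are illustrative assumptions):

```python
import numpy as np

def random_fourier_features(X, n_features, lengthscale, rng):
    """Feature map z(x) such that z(x) @ z(x') approximates an RBF
    kernel with the given lengthscale (random Fourier features)."""
    d = X.shape[1]
    W = rng.standard_normal((d, n_features)) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))
Z = random_fourier_features(X, n_features=5000, lengthscale=1.0, rng=rng)

# Compare the feature-map approximation against the exact RBF kernel.
K_approx = Z @ Z.T
sqdist = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sqdist)
print(np.abs(K_approx - K_exact).max() < 0.1)  # True: small approximation error
```

With the kernel reduced to a linear model in `Z`, prediction and updates cost O(n_features^2) per step rather than growing with the number of observations.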
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Conditional Deep Gaussian Processes: empirical Bayes hyperdata learning [6.599344783327054]
We propose a conditional Deep Gaussian Process (DGP) in which the intermediate GPs in hierarchical composition are supported by the hyperdata.
We show the equivalence with the deep kernel learning in the limit of dense hyperdata in latent space.
Preliminary extrapolation results demonstrate expressive power of the proposed model compared with GP kernel composition, DGP variational inference, and deep kernel learning.
arXiv Detail & Related papers (2021-10-01T17:50:48Z)
- Deep Gaussian Processes for Biogeophysical Parameter Retrieval and Model Inversion [14.097477944789484]
This paper introduces the use of deep Gaussian Processes (DGPs) for bio-geo-physical model inversion.
Unlike shallow GP models, DGPs account for complicated (modular, hierarchical) processes and provide an efficient solution that scales well to big datasets.
arXiv Detail & Related papers (2021-04-16T10:42:01Z)
- Learning Structures in Earth Observation Data with Gaussian Processes [67.27044745471207]
This paper reviews the main theoretical GP developments in the field.
New algorithms that respect the signal and noise characteristics, that provide feature rankings automatically, and that allow applicability of associated uncertainty intervals are discussed.
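The uncertainty intervals mentioned above fall out of exact GP regression, where the posterior supplies a predictive variance alongside the mean. A minimal NumPy sketch of the standard equations (RBF kernel; the lengthscale, noise level, and toy data are illustrative assumptions):

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, lengthscale=1.0, noise=0.1):
    """Exact GP regression: posterior mean and predictive variance
    with an RBF kernel, via the usual Cholesky-based equations."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / lengthscale) ** 2)
    K = k(X_train, X_train) + noise**2 * np.eye(len(X_train))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    Ks = k(X_train, X_test)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    # Predictive variance of a noisy observation at each test point.
    var = 1.0 - (v * v).sum(axis=0) + noise**2
    return mean, var

X = np.linspace(0, 5, 20)
y = np.sin(X)
mu, var = gp_predict(X, y, np.array([2.5]))
print(abs(mu[0] - np.sin(2.5)) < 0.1)  # True: mean tracks the function
print(var[0] > 0)                      # True: positive uncertainty
```

An approximate 95% interval at each test point is `mu ± 1.96 * sqrt(var)`, which is the kind of associated uncertainty interval the reviewed methods propagate.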
arXiv Detail & Related papers (2020-12-22T10:46:37Z)
- Warped Gaussian Processes in Remote Sensing Parameter Estimation and Causal Inference [7.811118301686077]
We show the good performance of the proposed model for the estimation of oceanic chlorophyll content from multispectral data.
The model consistently performs better than the standard GP and the more advanced heteroscedastic GP model.
arXiv Detail & Related papers (2020-12-09T09:02:59Z)
- On Signal-to-Noise Ratio Issues in Variational Inference for Deep Gaussian Processes [55.62520135103578]
We show that the gradient estimates used in training Deep Gaussian Processes (DGPs) with importance-weighted variational inference are susceptible to signal-to-noise ratio (SNR) issues.
We show that our fix can lead to consistent improvements in the predictive performance of DGP models.
arXiv Detail & Related papers (2020-11-01T14:38:02Z)
- Ensemble of Sparse Gaussian Process Experts for Implicit Surface Mapping with Streaming Data [13.56926815833324]
We learn a compact and continuous implicit surface map of an environment from a stream of range data with known poses.
Instead of inserting all arriving data into the GP models, we greedily trade-off between model complexity and prediction error.
The results show that we can learn compact and accurate implicit surface models under different conditions.
arXiv Detail & Related papers (2020-02-12T11:06:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.