Theoretical Analysis of Heteroscedastic Gaussian Processes with Posterior Distributions
- URL: http://arxiv.org/abs/2409.12622v1
- Date: Thu, 19 Sep 2024 09:51:46 GMT
- Title: Theoretical Analysis of Heteroscedastic Gaussian Processes with Posterior Distributions
- Authors: Yuji Ito
- Abstract summary: This study introduces a novel theoretical framework for analyzing heteroscedastic Gaussian processes (HGPs).
It derives the exact means, variances, and cumulative distributions of the posterior distributions.
The derived theoretical findings are applied to a chance-constrained tracking controller.
- Score: 0.4895118383237099
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study introduces a novel theoretical framework for analyzing heteroscedastic Gaussian processes (HGPs) that identify unknown systems in a data-driven manner. Although HGPs effectively address the heteroscedasticity of noise in complex training datasets, calculating the exact posterior distributions of the HGPs is challenging, as these distributions are no longer multivariate normal. This study derives the exact means, variances, and cumulative distributions of the posterior distributions. Furthermore, the derived theoretical findings are applied to a chance-constrained tracking controller. After an HGP identifies an unknown disturbance in a plant system, the controller can handle chance constraints regarding the system despite the presence of the disturbance.
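The abstract does not reproduce the paper's exact (non-Gaussian) posterior derivation. As a minimal sketch of the setting only, the block below implements standard GP regression with a known, input-dependent noise variance, plus a Gaussian-quantile tightening of a chance constraint. The kernel choice, hyperparameters, and the names `hgp_posterior` and `satisfies_chance_constraint` are illustrative assumptions, not the paper's method; in a full HGP the noise variance is itself latent, which is precisely what makes the exact posterior non-Gaussian.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, signal_var=1.0):
    """Squared-exponential kernel between 1-D input arrays A and B."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def hgp_posterior(X, y, noise_var, X_star, length_scale=1.0, signal_var=1.0):
    """GP posterior mean/variance with per-point (heteroscedastic) noise.

    noise_var[i] is a known noise variance at training input X[i]; it
    replaces the constant sigma^2 * I of homoscedastic GP regression.
    """
    K = rbf_kernel(X, X, length_scale, signal_var) + np.diag(noise_var)
    K_s = rbf_kernel(X_star, X, length_scale, signal_var)
    K_ss = rbf_kernel(X_star, X_star, length_scale, signal_var)
    L = np.linalg.cholesky(K)  # stable solve via Cholesky factorization
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)  # predictive latent variance
    return mean, var

def satisfies_chance_constraint(mean, var, limit, z=1.6449):
    """Check P(f(x) <= limit) >= 0.95 under a Gaussian approximation.

    z = 1.6449 is the standard normal 95% quantile; the Gaussian
    reformulation is a standard approximation, assumed here for brevity.
    """
    return mean + z * np.sqrt(np.maximum(var, 0.0)) <= limit
```

With the noise variance fixed, the posterior stays multivariate normal and the computation is pure linear algebra; the paper's contribution lies in handling the case where it is not.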
Related papers
- Scalable Variational Causal Discovery Unconstrained by Acyclicity [6.954510776782872]
We propose a scalable Bayesian approach to learn the posterior distribution over causal graphs given observational data.
We introduce a novel differentiable DAG sampling method that can generate a valid acyclic causal graph.
We are able to model the posterior distribution over causal graphs using a simple variational distribution over a continuous domain.
arXiv Detail & Related papers (2024-07-06T07:56:23Z)
- Bayesian Causal Inference with Gaussian Process Networks [1.7188280334580197]
We consider the problem of the Bayesian estimation of the effects of hypothetical interventions in the Gaussian Process Network model.
We detail how to perform causal inference on GPNs by simulating the effect of an intervention across the whole network and propagating the effect of the intervention on downstream variables.
We extend both frameworks beyond the case of a known causal graph, incorporating uncertainty about the causal structure via Markov chain Monte Carlo methods.
arXiv Detail & Related papers (2024-02-01T14:39:59Z)
- Causality-Based Multivariate Time Series Anomaly Detection [63.799474860969156]
We formulate the anomaly detection problem from a causal perspective and view anomalies as instances that do not follow the regular causal mechanism to generate the multivariate data.
We then propose a causality-based anomaly detection approach, which first learns the causal structure from data and then infers whether an instance is an anomaly relative to the local causal mechanism.
We evaluate our approach with both simulated and public datasets as well as a case study on real-world AIOps applications.
arXiv Detail & Related papers (2022-06-30T06:00:13Z)
- A Robust and Flexible EM Algorithm for Mixtures of Elliptical Distributions with Missing Data [71.9573352891936]
This paper tackles the problem of missing data imputation for noisy and non-Gaussian data.
A new EM algorithm is investigated for mixtures of elliptical distributions with the property of handling potential missing data.
Experimental results on synthetic data demonstrate that the proposed algorithm is robust to outliers and can be used with non-Gaussian data.
arXiv Detail & Related papers (2022-01-28T10:01:37Z)
- Investigating Shifts in GAN Output-Distributions [5.076419064097734]
We introduce a loop-training scheme for the systematic investigation of observable shifts between the distributions of real training data and GAN generated data.
Overall, the combination of these methods allows an explorative investigation of innate limitations of current GAN algorithms.
arXiv Detail & Related papers (2021-12-28T09:16:55Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Wasserstein-Splitting Gaussian Process Regression for Heterogeneous Online Bayesian Inference [9.7471390457395]
We employ variational free energy approximations of GPs operating in tandem with online expectation propagation steps.
We introduce a local splitting step which instantiates a new GP whenever the posterior distribution changes significantly.
Over time, this yields an ensemble of sparse GPs which may be updated incrementally.
arXiv Detail & Related papers (2021-07-26T17:52:46Z)
- Decentralized Local Stochastic Extra-Gradient for Variational Inequalities [125.62877849447729]
We consider distributed variational inequalities (VIs) on domains with the problem data that is heterogeneous (non-IID) and distributed across many devices.
We make a very general assumption on the computational network that covers the settings of fully decentralized calculations.
We theoretically analyze its convergence rate in the strongly-monotone, monotone, and non-monotone settings.
arXiv Detail & Related papers (2021-06-15T17:45:51Z)
- Bayesian Uncertainty Estimation of Learned Variational MRI Reconstruction [63.202627467245584]
We introduce a Bayesian variational framework to quantify the model-immanent (epistemic) uncertainty.
We demonstrate that our approach yields competitive results for undersampled MRI reconstruction.
arXiv Detail & Related papers (2021-02-12T18:08:14Z)
- The Hidden Uncertainty in a Neural Network's Activations [105.4223982696279]
The distribution of a neural network's latent representations has been successfully used to detect out-of-distribution (OOD) data.
This work investigates whether this distribution correlates with a model's epistemic uncertainty, thus indicating its ability to generalise to novel inputs.
arXiv Detail & Related papers (2020-12-05T17:30:35Z)
- Modulating Scalable Gaussian Processes for Expressive Statistical Learning [25.356503463916816]
Gaussian process (GP) regression aims to learn the statistical relationship between inputs and outputs, offering not only the prediction mean but also the associated variability.
This article studies new scalable GP paradigms, including the non-stationary heteroscedastic GP, the mixture of GPs, and the latent GP, which introduce additional latent variables to modulate the outputs or inputs in order to learn richer, non-Gaussian statistical representations.
arXiv Detail & Related papers (2020-08-29T06:41:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.