Intrinsic Gaussian Process Regression Modeling for Manifold-valued Response Variable
- URL: http://arxiv.org/abs/2411.18989v2
- Date: Sun, 09 Feb 2025 13:15:02 GMT
- Title: Intrinsic Gaussian Process Regression Modeling for Manifold-valued Response Variable
- Authors: Zhanfeng Wang, Xinyu Li, Hao Ding, Jian Qing Shi
- Abstract summary: We propose a novel intrinsic Gaussian process regression model for manifold-valued data. We establish the properties of the proposed models, including information consistency and posterior consistency. Numerical studies, including simulation and real examples, indicate that the proposed methods work well.
- Score: 6.137918306133745
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Extrinsic Gaussian process regression methods, such as wrapped Gaussian process, have been developed to analyze manifold data. However, there is a lack of intrinsic Gaussian process methods for studying complex data with manifold-valued response variables. In this paper, we first apply the parallel transport operator on Riemannian manifolds to propose an intrinsic covariance structure that addresses a critical aspect of constructing a well-defined Gaussian process regression model. We then propose a novel intrinsic Gaussian process regression model for manifold-valued data, which can be applied to data situated not only on Euclidean submanifolds but also on manifolds without a natural ambient space. We establish the asymptotic properties of the proposed models, including information consistency and posterior consistency, and we also show that the posterior distribution of the regression function is invariant to the choice of orthonormal frames for the coordinate representations of the covariance function. Numerical studies, including simulation and real examples, indicate that the proposed methods work well.
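The paper's key ingredient is a parallel-transport-based intrinsic covariance, which is not reproduced here. As a rough, hedged illustration of the regression setting only, the sketch below handles sphere-valued responses with a simple tangent-space shortcut: map the responses to the tangent space at their Fréchet mean with the logarithm map, fit an ordinary Gaussian process per tangent coordinate, and map predictions back with the exponential map. The helpers `exp_map`, `log_map`, and `frechet_mean` are illustrative assumptions, and the shortcut ignores the parallel-transport construction that is the paper's actual contribution.

```python
# Hedged sketch: GP regression with responses on the unit sphere S^2.
# This is NOT the paper's intrinsic model; it is a simple tangent-space
# approximation (log-map -> Euclidean GP -> exp-map) for illustration only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def exp_map(p, v):
    """Exponential map on S^2: move from base point p along tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * v / nv

def log_map(p, q):
    """Logarithm map on S^2: tangent vector at p pointing toward q."""
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros(3)
    u = q - c * p
    return theta * u / np.linalg.norm(u)

def frechet_mean(points, n_iter=50):
    """Intrinsic (Karcher) mean on S^2 by iterated tangent averaging."""
    m = points[0]
    for _ in range(n_iter):
        v = np.mean([log_map(m, q) for q in points], axis=0)
        m = exp_map(m, v)
    return m

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)[:, None]                  # Euclidean covariate
angles = 0.8 * np.sin(2 * np.pi * x[:, 0]) + 0.05 * rng.normal(size=40)
Y = np.stack([np.cos(angles), np.sin(angles), np.zeros(40)], axis=1)
Y = Y / np.linalg.norm(Y, axis=1, keepdims=True)    # sphere-valued responses

mu = frechet_mean(Y)
V = np.array([log_map(mu, y) for y in Y])           # tangent coordinates at mu

# One Euclidean GP per tangent coordinate (the paper instead builds a single
# intrinsic covariance via parallel transport).
gps = [GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-3).fit(x, V[:, j])
       for j in range(3)]

x_new = np.array([[0.37]])
v_pred = np.array([gp.predict(x_new)[0] for gp in gps])
v_pred = v_pred - np.dot(v_pred, mu) * mu           # project onto tangent space at mu
y_pred = exp_map(mu, v_pred)                        # map back to the sphere
print("predicted response on S^2:", y_pred, "norm:", np.linalg.norm(y_pred))
```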
Related papers
- Amortized In-Context Bayesian Posterior Estimation [15.714462115687096]
Amortization, through conditional estimation, is a viable strategy to alleviate the cost of repeated per-dataset posterior computation.
We conduct a thorough comparative analysis of amortized in-context Bayesian posterior estimation methods.
We highlight the superiority of the reverse KL estimator for predictive problems, especially when combined with the transformer architecture and normalizing flows.
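The paper's reverse-KL estimators, transformer backbones, and normalizing flows are well beyond a short snippet. Purely as a hedged illustration of the amortization idea (conditional estimation over simulated data), the toy sketch below fits a linear amortized posterior-mean estimator for a conjugate Gaussian model and checks it against the analytic posterior; the model, summary statistic, and estimator form are all assumptions made for this sketch.

```python
# Hedged sketch of amortized posterior estimation on a toy conjugate model:
# theta ~ N(0, 1),  x_i | theta ~ N(theta, 1)  for i = 1..n.
# We "amortize" by regressing theta on a data summary over simulated pairs,
# which approximates the posterior mean as a function of the dataset.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_sims = 10, 5000

theta = rng.normal(size=n_sims)                         # draws from the prior
x = theta[:, None] + rng.normal(size=(n_sims, n_obs))   # simulated datasets
xbar = x.mean(axis=1)                                   # sufficient summary

# Linear amortized estimator: posterior-mean(x) ~= a * xbar + b,
# fitted by least squares over the simulated (theta, x) pairs.
A = np.column_stack([xbar, np.ones(n_sims)])
a, b = np.linalg.lstsq(A, theta, rcond=None)[0]

# Check against the exact conjugate posterior mean n * xbar / (n + 1).
x_test = rng.normal(size=n_obs) + 0.7
print("amortized estimate :", a * x_test.mean() + b)
print("exact posterior mean:", n_obs * x_test.mean() / (n_obs + 1))
```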
arXiv Detail & Related papers (2025-02-10T16:00:48Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
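The quasi-process prior and its Stratonovich-like augmentation are not reproduced here. As a hedged, minimal illustration of circle-valued regression with a von Mises likelihood, the sketch below fits a linear mean direction by maximum likelihood; the link function and simulated data are assumptions of this sketch, not the paper's model.

```python
# Hedged sketch: circular regression with a von Mises likelihood, fitted by
# maximum likelihood. The paper's von Mises quasi-process and its MCMC
# augmentation are not reproduced; this only illustrates the data type.
import numpy as np
from scipy.optimize import minimize
from scipy.special import i0          # modified Bessel function I_0

rng = np.random.default_rng(2)
x = rng.uniform(-2, 2, size=200)
mu_true = 1.5 * x                     # mean direction (stays inside (-pi, pi] here)
y = rng.vonmises(mu_true, kappa=4.0)  # angle-valued responses

def neg_log_lik(params):
    w, b, log_kappa = params
    mu = w * x + b                    # linear predictor used as an angle
    kappa = np.exp(log_kappa)
    return -np.sum(kappa * np.cos(y - mu) - np.log(2 * np.pi * i0(kappa)))

fit = minimize(neg_log_lik, x0=np.zeros(3))
print("estimated slope, intercept, kappa:",
      fit.x[0], fit.x[1], np.exp(fit.x[2]))
```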
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data because it does not exploit the implicit low-dimensional manifold on which the data actually lie.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
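The manifold-inference machinery of the paper is not shown here. For context only, the hedged sketch below is the standard Euclidean GP regression baseline with predictive uncertainty, i.e. the starting point the paper improves upon; the data and kernel choices are illustrative assumptions.

```python
# Hedged sketch: standard Euclidean GP regression with predictive uncertainty.
# The paper's implicit-manifold learning from labeled and unlabeled data is
# not reproduced; this is only the baseline it builds on.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)

gpr = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.01))
gpr.fit(X, y)

X_test = np.linspace(-4, 4, 9)[:, None]
mean, std = gpr.predict(X_test, return_std=True)
for xt, m, s in zip(X_test[:, 0], mean, std):
    print(f"x={xt:+.1f}  mean={m:+.2f}  95% band=+/-{1.96 * s:.2f}")
```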
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios when the response variable, denoted by $Y$, resides on a manifold, and the covariable, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
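The general Riemannian construction and its convergence theory are not reproduced here. As a hedged toy analogue of conformal prediction sets for a manifold-valued response, the sketch below builds split-conformal geodesic balls on the circle, using the angular distance as the nonconformity score; the helpers `ang_dist` and `knn_circular_predict` and the simulated data are assumptions of this sketch.

```python
# Hedged sketch: split-conformal prediction balls for a circle-valued response,
# using the geodesic (angular) distance as the nonconformity score.
import numpy as np

def ang_dist(a, b):
    """Geodesic distance on the circle between angles a and b."""
    d = np.abs(a - b) % (2 * np.pi)
    return np.minimum(d, 2 * np.pi - d)

def knn_circular_predict(xq, x_fit, y_fit, k=15):
    """k-NN point predictor: extrinsic circular mean of the nearest neighbors."""
    idx = np.argsort(np.abs(x_fit - xq))[:k]
    return np.arctan2(np.sin(y_fit[idx]).mean(), np.cos(y_fit[idx]).mean())

rng = np.random.default_rng(4)
n = 400
x = rng.uniform(0, 1, n)
y = (4.0 * x + 0.2 * rng.normal(size=n)) % (2 * np.pi)     # angle-valued response

fit_idx, cal_idx = np.arange(0, n, 2), np.arange(1, n, 2)  # data splitting
preds = np.array([knn_circular_predict(xc, x[fit_idx], y[fit_idx])
                  for xc in x[cal_idx]])
scores = ang_dist(y[cal_idx], preds)                       # calibration scores

alpha = 0.1
k_rank = int(np.ceil((len(cal_idx) + 1) * (1 - alpha)))
radius = np.sort(scores)[k_rank - 1]                       # conformal quantile

# The ~90% prediction set at a new covariate value is the geodesic ball of
# this radius around the fitted point prediction.
x_new = 0.3
print("center angle:", knn_circular_predict(x_new, x[fit_idx], y[fit_idx]))
print("ball radius :", radius)
```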
arXiv Detail & Related papers (2023-10-12T10:56:25Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Multi-Response Heteroscedastic Gaussian Process Models and Their Inference [1.52292571922932]
We propose a novel framework for the modeling of heteroscedastic covariance functions.
We employ variational inference to approximate the posterior and facilitate posterior predictive modeling.
We show that our proposed framework offers a robust and versatile tool for a wide array of applications.
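The paper's multi-response variational framework is not reproduced here. As a hedged, much simpler stand-in for heteroscedastic GP modeling, the sketch below uses a two-stage empirical approach: fit a mean GP, smooth the log squared residuals with a second GP, and refit with the resulting per-point noise variances; the data, kernels, and two-stage recipe are assumptions of this sketch, not the paper's method.

```python
# Hedged sketch: a simple two-stage heteroscedastic GP. A first GP gives the
# mean, a second GP smooths the log squared residuals, and the inferred
# per-point noise variances are fed back via `alpha`.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
X = np.sort(rng.uniform(0, 10, size=(120, 1)), axis=0)
noise_sd = 0.05 + 0.3 * (X[:, 0] / 10)             # input-dependent noise
y = np.sin(X[:, 0]) + noise_sd * rng.normal(size=120)

# Stage 1: homoscedastic fit to get residuals.
gp_mean = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.05)).fit(X, y)
resid2 = (y - gp_mean.predict(X)) ** 2

# Stage 2: smooth the log squared residuals to estimate the noise variance.
gp_var = GaussianProcessRegressor(kernel=RBF(2.0) + WhiteKernel(0.5))
gp_var.fit(X, np.log(resid2 + 1e-8))
noise_var = np.exp(gp_var.predict(X))

# Refit the mean GP with the estimated per-point noise variances.
gp_het = GaussianProcessRegressor(kernel=RBF(1.0), alpha=noise_var).fit(X, y)
mean, std = gp_het.predict(np.array([[1.0], [9.0]]), return_std=True)
print("latent-function sd near x=1 vs x=9:", std)  # wider where the data are noisier
```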
arXiv Detail & Related papers (2023-08-29T15:06:47Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A MOGP prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- Monte Carlo inference for semiparametric Bayesian regression [5.488491124945426]
This paper introduces a simple, general, and efficient strategy for joint posterior inference of an unknown transformation and all regression model parameters.
It delivers (1) joint posterior consistency under general conditions, including multiple model misspecifications, and (2) efficient Monte Carlo (not Markov chain Monte Carlo) inference for the transformation and all parameters for important special cases.
arXiv Detail & Related papers (2023-06-08T18:42:42Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
- Random Forest Weighted Local Fréchet Regression with Random Objects [18.128663071848923]
We propose a novel random forest weighted local Fréchet regression paradigm.
Our first method uses these weights as the local average to solve the conditional Fréchet mean.
The second method performs local linear Fréchet regression; both significantly improve existing Fréchet regression methods.
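The core computational step of such local Fréchet regression is a weighted Fréchet mean of manifold-valued responses. The hedged sketch below computes a weighted Fréchet (Karcher) mean on the sphere; the random-forest weight construction of the paper is not reproduced, and the Gaussian kernel weights, the helpers `exp_map`/`log_map`, and the simulated data are illustrative assumptions.

```python
# Hedged sketch: a weighted Fréchet (Karcher) mean on the unit sphere S^2,
# the building block of local Fréchet regression. Illustrative kernel weights
# stand in for the paper's random-forest weights.
import numpy as np

def exp_map(p, v):
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * v / nv

def log_map(p, q):
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(p)
    u = q - c * p
    return theta * u / np.linalg.norm(u)

def weighted_frechet_mean(points, w, n_iter=100):
    """Minimize sum_i w_i d(m, y_i)^2 on S^2 by iterated weighted log/exp steps."""
    w = w / w.sum()
    m = points[np.argmax(w)]
    for _ in range(n_iter):
        step = sum(wi * log_map(m, y) for wi, y in zip(w, points))
        m = exp_map(m, step)
    return m

rng = np.random.default_rng(6)
x = rng.uniform(0, 1, 50)                           # Euclidean covariate
angles = np.pi * x
Y = np.stack([np.sin(angles), np.zeros(50), np.cos(angles)], axis=1)  # on S^2

x0 = 0.4                                            # query point
w = np.exp(-((x - x0) ** 2) / (2 * 0.05 ** 2))      # illustrative local weights
print("local Fréchet regression estimate at x0:", weighted_frechet_mean(Y, w))
```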
arXiv Detail & Related papers (2022-02-10T09:10:59Z)
- Inferring Manifolds From Noisy Data Using Gaussian Processes [17.166283428199634]
Most existing manifold learning algorithms replace the original data with lower dimensional coordinates.
This article proposes a new methodology for addressing these problems, allowing interpolation of the estimated manifold between fitted data points.
arXiv Detail & Related papers (2021-10-14T15:50:38Z)
- Recyclable Gaussian Processes [0.0]
We present a new framework for recycling independent variational approximations to Gaussian processes.
The main contribution is the construction of variational ensembles given a dictionary of fitted Gaussian processes.
Our framework allows for regression, classification and heterogeneous tasks.
arXiv Detail & Related papers (2020-10-06T09:01:55Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
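SLEIPNIR's handling of derivative observations and its error bounds are not reproduced here. As a hedged illustration of the underlying idea, the sketch below builds deterministic quadrature Fourier features for a one-dimensional RBF kernel from Gauss-Hermite nodes and checks them against the exact kernel; the length-scale, node count, and one-dimensional setting are assumptions of this sketch.

```python
# Hedged sketch: deterministic quadrature Fourier features for a 1-D RBF
# kernel using Gauss-Hermite nodes, checked against the exact kernel.
import numpy as np

ell = 0.7                                   # RBF length-scale
m = 32                                      # number of quadrature nodes

t, a = np.polynomial.hermite.hermgauss(m)   # Gauss-Hermite nodes and weights
freqs = np.sqrt(2.0) * t / ell              # spectral frequencies for N(0, 1/ell^2)
coef = a / np.sqrt(np.pi)                   # quadrature weights after the change of variables

def features(x):
    """phi(x) such that phi(x) @ phi(x') ~= exp(-(x - x')^2 / (2 ell^2))."""
    z = np.outer(x, freqs)
    return np.hstack([np.sqrt(coef) * np.cos(z), np.sqrt(coef) * np.sin(z)])

x = np.linspace(-2, 2, 5)
K_exact = np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2)
K_approx = features(x) @ features(x).T
print("max abs error:", np.abs(K_exact - K_approx).max())
```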
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
- Generalized Gumbel-Softmax Gradient Estimator for Various Discrete Random Variables [16.643346012854156]
Estimating the gradients of nodes is one of the crucial research questions in the deep generative modeling community.
This paper proposes a general version of the Gumbel-Softmax estimator with continuous relaxation.
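The paper's generalization to a broad class of discrete random variables is not reproduced here. As a hedged reminder of the baseline it extends, the sketch below implements the standard Gumbel-Softmax relaxation of a categorical variable; the logits and temperatures are illustrative values.

```python
# Hedged sketch: the standard Gumbel-Softmax relaxation of a categorical
# variable, the baseline trick the paper generalizes.
import numpy as np

def gumbel_softmax(logits, tau, rng):
    """Draw a relaxed one-hot sample; tau -> 0 approaches hard argmax sampling."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))   # Gumbel(0, 1) noise
    z = (logits + g) / tau
    e = np.exp(z - z.max())                                 # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(7)
logits = np.log(np.array([0.1, 0.3, 0.6]))
print("tau=1.0 :", gumbel_softmax(logits, 1.0, rng))
print("tau=0.1 :", gumbel_softmax(logits, 0.1, rng))        # nearly one-hot
```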
arXiv Detail & Related papers (2020-03-04T01:13:15Z)
- Transport Gaussian Processes for Regression [0.22843885788439797]
We propose a methodology to construct processes, which include GPs, warped GPs, Student-t processes and several others.
Our approach is inspired by layer-based models, where each proposed layer changes a specific property of the generated process.
We validate the proposed model through experiments with real-world data.
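The general transport construction (covering Student-t processes and others) is not reproduced here. As a hedged illustration of the layer idea, the sketch below draws a sample path from a base GP and pushes it through a monotone marginal transformation, yielding a warped (heavier-tailed) process; the RBF covariance and the `sinh` warping are illustrative choices, not the paper's layers.

```python
# Hedged sketch of the layer idea behind transport/warped GPs: sample a base
# GP path and push it through a monotone marginal transformation.
import numpy as np

rng = np.random.default_rng(8)
x = np.linspace(0, 1, 200)
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.1) ** 2)   # RBF covariance
f = rng.multivariate_normal(np.zeros_like(x), K + 1e-8 * np.eye(len(x)))

warped = np.sinh(f)          # marginal "transport" layer: heavier-tailed process
print("base path range:  ", f.min(), f.max())
print("warped path range:", warped.min(), warped.max())
```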
arXiv Detail & Related papers (2020-01-30T17:44:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.