The Elliptical Processes: a Family of Fat-tailed Stochastic Processes
- URL: http://arxiv.org/abs/2003.07201v2
- Date: Wed, 2 Dec 2020 07:27:47 GMT
- Title: The Elliptical Processes: a Family of Fat-tailed Stochastic Processes
- Authors: Maria Bånkestad, Jens Sjölund, Jalil Taghia, Thomas Schön
- Abstract summary: We present the elliptical processes -- a family of non-parametric probabilistic models that subsumes the Gaussian process and the Student-t process.
This generalization includes a range of new fat-tailed behaviors yet retains computational tractability.
- Score: 1.2043574473965317
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present the elliptical processes -- a family of non-parametric
probabilistic models that subsumes the Gaussian process and the Student-t
process. This generalization includes a range of new fat-tailed behaviors yet
retains computational tractability. We base the elliptical processes on a
representation of elliptical distributions as a continuous mixture of Gaussian
distributions and derive closed-form expressions for the marginal and
conditional distributions. We perform numerical experiments on robust
regression using an elliptical process defined by a piecewise constant mixing
distribution, and show advantages compared with a Gaussian process. The
elliptical processes may become a replacement for Gaussian processes in several
settings, including when the likelihood is not Gaussian or when accurate tail
modeling is critical.
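To make the mixture construction concrete, here is a minimal prior-sampling sketch, assuming a standard RBF kernel and an illustrative piecewise-constant mixing density (the kernel choice, the bin grid, and the `piecewise_constant_mixing` helper are assumptions for illustration, not code from the paper): conditioned on a scale xi drawn from the mixing distribution, the process values are Gaussian with covariance xi * Sigma.

```python
# Sketch: elliptical process prior sampling via a Gaussian scale mixture.
# y = sqrt(xi) * L z with z ~ N(0, I), Sigma = L L^T, xi ~ mixing distribution.
import numpy as np

def rbf_kernel(x, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel matrix for 1-D inputs (illustrative choice)."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_elliptical_process(x, mixing_sampler, n_samples=5, jitter=1e-8, rng=None):
    """Draw sample paths: scale a Gaussian process draw by sqrt(xi) per path."""
    rng = np.random.default_rng() if rng is None else rng
    K = rbf_kernel(x) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)
    xi = mixing_sampler(n_samples, rng)                # one mixing scale per path
    z = rng.standard_normal((n_samples, len(x)))
    return np.sqrt(xi)[:, None] * (z @ L.T)

def piecewise_constant_mixing(bins, weights):
    """Sampler for a piecewise-constant mixing density on the given bin edges."""
    probs = np.asarray(weights, dtype=float)
    probs /= probs.sum()
    def sampler(n, rng):
        idx = rng.choice(len(probs), size=n, p=probs)  # pick a bin
        return rng.uniform(bins[idx], bins[idx + 1])   # uniform within the bin
    return sampler

# xi fixed at 1 recovers a Gaussian process; an inverse-gamma(nu/2, nu/2)
# mixing distribution recovers a Student-t process with nu degrees of freedom.
x = np.linspace(0.0, 1.0, 50)
mix = piecewise_constant_mixing(bins=np.array([0.1, 0.5, 1.0, 2.0, 5.0]),
                                weights=[0.4, 0.3, 0.2, 0.1])
paths = sample_elliptical_process(x, mix, n_samples=3)  # shape (3, 50)
```

Only the prior-sampling view is sketched here; the closed-form marginal and conditional expressions are derived in the paper.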
Related papers
- Variational Elliptical Processes [1.5703073293718952]
We present elliptical processes, a family of non-parametric probabilistic models that subsume Gaussian processes and Student's t processes.
We parameterize this mixture distribution as a spline normalizing flow, which we train using variational inference.
The proposed form of the variational posterior enables a sparse variational elliptical process applicable to large-scale problems.
arXiv Detail & Related papers (2023-11-21T12:26:14Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- A Heavy-Tailed Algebra for Probabilistic Programming [53.32246823168763]
We propose a systematic approach for analyzing the tails of random variables.
We show how this approach can be used during the static analysis (before drawing samples) pass of a probabilistic programming language compiler.
Our empirical results confirm that inference algorithms that leverage our heavy-tailed algebra attain superior performance across a number of density modeling and variational inference tasks.
arXiv Detail & Related papers (2023-06-15T16:37:36Z)
- Mixtures of Gaussian Process Experts with SMC² [0.4588028371034407]
Mixtures of Gaussian process experts have been considered, where data points are assigned to independent experts.
We construct a novel inference approach based on nested sequential Monte Carlo samplers to infer both the gating network and Gaussian process expert parameters.
arXiv Detail & Related papers (2022-08-26T18:20:14Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Gaussian Process for Trajectories [17.458493494904992]
We discuss the elements that need to be considered when applying Gaussian processes to timestamps, common choices for those elements, and a concrete example of implementing a Gaussian process.
arXiv Detail & Related papers (2021-10-07T18:02:19Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors (a minimal sketch of this update appears after this list).
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings (see the kernel sketch after this list).
arXiv Detail & Related papers (2020-10-29T13:08:07Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
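To complement the "Pathwise Conditioning of Gaussian Processes" and "Efficiently Sampling Functions from Gaussian Process Posteriors" entries above, here is a minimal sketch of a pathwise (Matheron's rule) posterior update: a joint prior draw at training and test inputs is corrected by applying K_*X (K_XX + noise² I)^{-1} to the residual. The RBF kernel, noise level, and toy data are illustrative assumptions, not the papers' reference implementation.

```python
# Sketch: pathwise conditioning of a Gaussian process via Matheron's rule.
import numpy as np

def rbf(a, b, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def pathwise_posterior_samples(x_train, y_train, x_test, noise=0.05,
                               n_samples=3, jitter=1e-8, rng=None):
    """Draw joint prior samples at train+test inputs, then apply
    f*|y = f*_prior + K_*X (K_XX + noise^2 I)^{-1} (y - f_prior(X) - eps)."""
    rng = np.random.default_rng() if rng is None else rng
    x_all = np.concatenate([x_train, x_test])
    K_all = rbf(x_all, x_all) + jitter * np.eye(len(x_all))
    L = np.linalg.cholesky(K_all)
    f_prior = rng.standard_normal((n_samples, len(x_all))) @ L.T
    f_train, f_test = f_prior[:, :len(x_train)], f_prior[:, len(x_train):]

    K_xx = rbf(x_train, x_train) + noise**2 * np.eye(len(x_train))
    K_sx = rbf(x_test, x_train)
    eps = noise * rng.standard_normal((n_samples, len(x_train)))
    residual = y_train[None, :] - f_train - eps           # (n_samples, n_train)
    update = np.linalg.solve(K_xx, residual.T).T @ K_sx.T  # Matheron correction
    return f_test + update                                 # (n_samples, n_test)

x_train = np.array([0.1, 0.4, 0.7])
y_train = np.sin(2 * np.pi * x_train)
x_test = np.linspace(0.0, 1.0, 100)
samples = pathwise_posterior_samples(x_train, y_train, x_test)
```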
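For the "Matérn Gaussian Processes on Graphs" entry, the construction can be sketched spectrally: with the graph Laplacian eigendecomposition Δ = U Λ Uᵀ, a Matérn-type covariance takes the form U (2ν/κ² + Λ)^{-ν} Uᵀ up to a variance normalization. The normalization convention and the toy path graph below are assumptions for illustration, not the paper's reference code.

```python
# Sketch: Matérn-type kernel on a graph from its Laplacian spectrum.
import numpy as np

def graph_matern_kernel(L, nu=1.5, kappa=1.0, sigma2=1.0):
    """K ∝ U diag((2*nu/kappa**2 + lambda)**(-nu)) U^T,
    rescaled so the average marginal variance equals sigma2."""
    lam, U = np.linalg.eigh(L)                      # Laplacian eigendecomposition
    spectrum = (2.0 * nu / kappa**2 + lam) ** (-nu)
    K = (U * spectrum) @ U.T                        # U diag(spectrum) U^T
    K *= sigma2 * len(L) / np.trace(K)              # variance normalization (assumed convention)
    return K

# Toy example: path graph on 5 nodes.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)  # adjacency matrix
L = np.diag(A.sum(axis=1)) - A                         # combinatorial Laplacian
K = graph_matern_kernel(L)
```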
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.