Uncovering Regions of Maximum Dissimilarity on Random Process Data
- URL: http://arxiv.org/abs/2209.05569v1
- Date: Mon, 12 Sep 2022 19:44:49 GMT
- Title: Uncovering Regions of Maximum Dissimilarity on Random Process Data
- Authors: Miguel de Carvalho and Gabriel Martos Venturini
- Abstract summary: This paper proposes a method that learns about regions with a certain volume, where the marginal attributes of two processes are less similar.
The proposed methods are devised in full generality for the setting where the data of interest are themselves processes.
We showcase their application with case studies on criminology, finance, and medicine.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The comparison of local characteristics of two random processes can shed
light on periods of time or space at which the processes differ the most. This
paper proposes a method that learns about regions with a certain volume, where
the marginal attributes of two processes are less similar. The proposed methods
are devised in full generality for the setting where the data of interest are
themselves stochastic processes, and thus the proposed method can be used for
pointing out the regions of maximum dissimilarity with a certain volume, in the
contexts of functional data, time series, and point processes. The parameter
functions underlying both stochastic processes of interest are modeled via a
basis representation, and Bayesian inference is conducted via an integrated
nested Laplace approximation. The numerical studies validate the proposed
methods, and we showcase their application with case studies on criminology,
finance, and medicine.
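The abstract describes finding a region of fixed volume where the marginal attributes of two processes differ most. As a loose toy illustration of that idea only (not the authors' basis-representation/INLA method), the sketch below scans for the fixed-width window where the marginal means of two simulated functional datasets differ most; all names and parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)

# Replicated realisations of two toy processes; they differ only near t = 0.5.
bump = np.exp(-((t - 0.5) / 0.05) ** 2)
X = rng.normal(0.0, 0.1, size=(50, t.size))          # process 1
Y = bump + rng.normal(0.0, 0.1, size=(50, t.size))   # process 2

# Pointwise dissimilarity between marginal attributes (here: the means).
d = np.abs(X.mean(axis=0) - Y.mean(axis=0))

# Region of maximum dissimilarity with a fixed "volume" (window width).
width = 40                                            # ~0.2 on the t-axis
scores = np.convolve(d, np.ones(width), mode="valid")
start = int(np.argmax(scores))
region = (float(t[start]), float(t[start + width - 1]))
print(f"most dissimilar region: [{region[0]:.2f}, {region[1]:.2f}]")
```

Here the recovered window straddles the injected bump at t = 0.5; the paper's method instead models the parameter functions of both processes and conducts Bayesian inference over candidate regions.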
Related papers
- Markov Decision Processes with Noisy State Observation [0.0]
This paper addresses the challenge of a particular class of noisy state observations in Markov Decision Processes (MDPs).
We focus on modeling this uncertainty through a confusion matrix that captures the probabilities of misidentifying the true state.
We propose two novel algorithmic approaches to estimate the inherent measurement noise.
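The confusion-matrix observation model described above can be sketched as follows. For illustration only, this toy assumes access to (true state, observation) pairs when estimating the matrix, unlike the paper's setting, where the measurement noise must be estimated without ground truth:

```python
import numpy as np

rng = np.random.default_rng(1)
n_states = 3

# Row-stochastic confusion matrix: C[s, o] = P(observe o | true state s).
C = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

def observe(true_state: int) -> int:
    """Draw a noisy observation of the true state via the confusion matrix."""
    return int(rng.choice(n_states, p=C[true_state]))

# Empirically recover the confusion matrix from labelled pairs.
counts = np.zeros((n_states, n_states))
for _ in range(20000):
    s = int(rng.integers(n_states))
    counts[s, observe(s)] += 1
C_hat = counts / counts.sum(axis=1, keepdims=True)
print(np.round(C_hat, 2))
```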
arXiv Detail & Related papers (2023-12-13T21:50:38Z) - Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z) - Sequential Estimation of Gaussian Process-based Deep State-Space Models [1.760402297380953]
We consider the problem of sequential estimation of the unknowns of state-space and deep state-space models.
We present a method based on particle filtering where the parameters of the random feature-based Gaussian processes are integrated out.
We show that the method can track the latent processes up to a scale and rotation.
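As a rough illustration of particle filtering for state-space models, the sketch below runs a generic bootstrap filter on a toy linear-Gaussian model; it does not implement the paper's random-feature Gaussian process construction or its marginalisation of parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear-Gaussian state-space model: x_t = 0.9 x_{t-1} + w,  y_t = x_t + v.
T, N = 100, 500
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal(0.0, 0.3)
    y[t] = x[t] + rng.normal(0.0, 0.5)

# Bootstrap particle filter: propagate, weight by the likelihood, resample.
particles = rng.normal(0.0, 1.0, N)
means = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + rng.normal(0.0, 0.3, N)
    logw = -0.5 * ((y[t] - particles) / 0.5) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    means[t] = w @ particles
    particles = particles[rng.choice(N, N, p=w)]   # multinomial resampling

rmse = np.sqrt(np.mean((means - x) ** 2))
print(f"filter RMSE: {rmse:.3f}")
```

With 500 particles the filtered means track the latent state well below the raw observation noise level.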
arXiv Detail & Related papers (2023-01-29T20:01:09Z) - Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z) - On Contrastive Representations of Stochastic Processes [53.21653429290478]
Learning representations of processes is an emerging problem in machine learning.
We show that our methods are effective for learning representations of periodic functions, 3D objects and dynamical processes.
arXiv Detail & Related papers (2021-06-18T11:00:24Z) - Sparse Algorithms for Markovian Gaussian Processes [18.999495374836584]
Sparse Markovian Gaussian processes combine the use of inducing variables with efficient Kalman filter-like recursions.
We derive a general site-based approach to approximate the non-Gaussian likelihood with local Gaussian terms, called sites.
Our approach results in a suite of novel sparse extensions to algorithms from both the machine learning and signal processing literatures, including variational inference, expectation propagation, and the classical nonlinear Kalman smoothers.
The derived methods are suited to spatio-temporal data, where the model has separate inducing points in both time and space.
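The Kalman-filter-style recursion behind Markovian Gaussian processes can be sketched on the simplest case: a Matérn-1/2 (Ornstein-Uhlenbeck) prior with Gaussian observations, which admits exact O(n) filtering. This is a minimal filter only, without the paper's inducing variables or non-Gaussian sites, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# A Matérn-1/2 (OU) GP has the 1-D state-space form:
#   x_{k+1} = A x_k + q_k,  A = exp(-dt/ell),  q_k ~ N(0, var * (1 - A^2)).
ell, var, noise = 0.5, 1.0, 0.1
t = np.linspace(0.0, 5.0, 100)
y = np.sin(t) + rng.normal(0.0, np.sqrt(noise), t.size)

m, P = 0.0, var                     # prior state mean and variance
means = np.zeros(t.size)
for k in range(t.size):
    if k > 0:                       # predict step
        A = np.exp(-(t[k] - t[k - 1]) / ell)
        m, P = A * m, A * A * P + var * (1.0 - A * A)
    K = P / (P + noise)             # update step (Kalman gain)
    m, P = m + K * (y[k] - m), (1.0 - K) * P
    means[k] = m

rmse = np.sqrt(np.mean((means - np.sin(t)) ** 2))
print(f"filter RMSE vs sin(t): {rmse:.3f}")
```

Each observation is absorbed in constant time, which is what makes the state-space view of Gaussian processes attractive for long time series.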
arXiv Detail & Related papers (2021-03-19T09:50:53Z) - Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how a pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
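Pathwise conditioning rests on Matheron's update: a sample from the prior is corrected into a sample from the posterior, rather than drawing from the finite-dimensional marginals directly. The noiseless sketch below shows only the update rule; it still draws the prior jointly (which is itself cubic), whereas the efficiency gains in practice come from pairing the update with cheap prior approximations such as random features:

```python
import numpy as np

rng = np.random.default_rng(4)

def k(a, b, ls=0.3):
    """Squared-exponential kernel matrix."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

X = np.array([-1.0, 0.0, 1.0])       # conditioning inputs
y = np.array([0.5, -0.2, 0.8])       # observed (noiseless) values
Xs = np.linspace(-2.0, 2.0, 50)      # test inputs

# Draw one joint prior sample over train and test inputs.
Z = np.concatenate([X, Xs])
L = np.linalg.cholesky(k(Z, Z) + 1e-6 * np.eye(Z.size))
f = L @ rng.normal(size=Z.size)
fX, fXs = f[:X.size], f[X.size:]

# Matheron's update: pathwise conditioning of the prior sample on the data.
alpha = np.linalg.solve(k(X, X) + 1e-6 * np.eye(X.size), y - fX)
f_post = fXs + k(Xs, X) @ alpha

# The conditioned path interpolates the observations (up to jitter).
at_train = fX + k(X, X) @ alpha
print(np.round(at_train, 3))
```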
arXiv Detail & Related papers (2020-11-08T17:09:37Z) - Spatio-temporal Sequence Prediction with Point Processes and Self-organizing Decision Trees [0.0]
We study the spatio-temporal prediction problem and introduce a point-process-based prediction algorithm.
Our algorithm can jointly learn the spatial event distribution and the interactions between these regions through a gradient-based optimization procedure.
We compare our approach with state-of-the-art deep learning-based approaches, where we achieve significant performance improvements.
arXiv Detail & Related papers (2020-06-25T14:04:55Z) - Mat\'ern Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z) - Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z) - Doubly Sparse Variational Gaussian Processes [14.209730729425502]
We show that the inducing point framework is still valid for state space models and that it can bring further computational and memory savings.
This work makes it possible to use the state-space formulation inside deep Gaussian process models as illustrated in one of the experiments.
arXiv Detail & Related papers (2020-01-15T15:07:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.