Higher Order Kernel Mean Embeddings to Capture Filtrations of Stochastic
Processes
- URL: http://arxiv.org/abs/2109.03582v1
- Date: Wed, 8 Sep 2021 12:27:25 GMT
- Title: Higher Order Kernel Mean Embeddings to Capture Filtrations of Stochastic
Processes
- Authors: Cristopher Salvi, Maud Lemercier, Chong Liu, Blanka Horvath, Theodoros
Damoulas, Terry Lyons
- Abstract summary: We introduce a family of higher order kernel mean embeddings that generalizes the notion of KME.
We derive empirical estimators for the associated higher order maximum mean discrepancies (MMDs) and prove consistency.
We construct a family of universal kernels on processes that allows us to solve real-world calibration and optimal stopping problems.
- Score: 11.277354787690646
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stochastic processes are random variables with values in some space of paths.
However, reducing a stochastic process to a path-valued random variable ignores
its filtration, i.e. the flow of information carried by the process through
time. By conditioning the process on its filtration, we introduce a family of
higher order kernel mean embeddings (KMEs) that generalizes the notion of KME
and captures additional information related to the filtration. We derive
empirical estimators for the associated higher order maximum mean discrepancies
(MMDs) and prove consistency. We then construct a filtration-sensitive kernel
two-sample test able to pick up information that gets missed by the standard
MMD test. In addition, leveraging our higher order MMDs we construct a family
of universal kernels on stochastic processes that allows us to solve real-world
calibration and optimal stopping problems in quantitative finance (such as the
pricing of American options) via classical kernel-based regression methods.
Finally, adapting existing tests for conditional independence to the case of
stochastic processes, we design a causal-discovery algorithm to recover the
causal graph of structural dependencies among interacting bodies solely from
observations of their multidimensional trajectories.
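For reference, the first order objects that the higher order construction generalizes are the kernel mean embedding $\Phi(\mu) = \mathbb{E}_{X \sim \mu}[k(X, \cdot)]$ and the maximum mean discrepancy $\mathrm{MMD}(\mu, \nu) = \|\Phi(\mu) - \Phi(\nu)\|_{\mathcal{H}}$, whose standard unbiased estimator is
$\widehat{\mathrm{MMD}}^2 = \frac{1}{n(n-1)} \sum_{i \neq j} k(x_i, x_j) + \frac{1}{m(m-1)} \sum_{i \neq j} k(y_i, y_j) - \frac{2}{nm} \sum_{i,j} k(x_i, y_j)$.
Below is a minimal sketch of this order-one estimator on sampled paths; the RBF kernel on flattened paths and all function names are illustrative stand-ins, not the paper's filtration-aware, path-space kernels.

    import numpy as np

    def rbf_gram(X, Y, bandwidth=1.0):
        # Gram matrix of the Gaussian (RBF) kernel between rows of X and Y.
        sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
        return np.exp(-sq / (2 * bandwidth**2))

    def mmd2_unbiased(X, Y, bandwidth=1.0):
        # Unbiased estimate of squared MMD between samples X (n x d) and Y (m x d).
        n, m = len(X), len(Y)
        Kxx, Kyy, Kxy = (rbf_gram(A, B, bandwidth) for A, B in [(X, X), (Y, Y), (X, Y)])
        np.fill_diagonal(Kxx, 0.0)  # drop diagonal terms for unbiasedness
        np.fill_diagonal(Kyy, 0.0)
        return Kxx.sum() / (n * (n - 1)) + Kyy.sum() / (m * (m - 1)) - 2 * Kxy.mean()

    # Two ensembles of random-walk paths, time steps flattened into coordinates:
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20)).cumsum(axis=1)
    Y = (1.5 * rng.standard_normal((100, 20))).cumsum(axis=1)
    print(mmd2_unbiased(X, Y))  # clearly positive values suggest the two laws differ

Because such a statistic only sees the law of the path-valued random variable, two processes with the same path law but different filtrations are indistinguishable to it; this is precisely the gap the higher order MMDs are designed to close.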
Related papers
- Inferring biological processes with intrinsic noise from cross-sectional data [0.8192907805418583]
Inferring dynamical models from data continues to be a significant challenge in computational biology.
We show that probability flow inference (PFI) disentangles force from intrinsic stochasticity while retaining the algorithmic ease of ODE inference.
In practical applications, we show that PFI enables accurate parameter and force estimation in high-dimensional reaction networks, and that it allows inference of cell differentiation dynamics with molecular noise.
arXiv Detail & Related papers (2024-10-10T00:33:25Z)
- Rethinking the Diffusion Models for Numerical Tabular Data Imputation from the Perspective of Wasserstein Gradient Flow [13.109101873881063]
We introduce a principled approach termed Kernelized Negative Entropy-regularized Wasserstein gradient flow Imputation (KnewImp)
Our proposed KnewImp approach significantly outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2024-06-22T06:59:32Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice than their extrinsic, embedding-based counterparts.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A multi-output Gaussian process (MOGP) prior over the parameters of the dedicated likelihoods for classification, regression and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z)
- Quantile Filtered Imitation Learning [49.11859771578969]
Quantile filtered imitation learning (QFIL) is a policy improvement operator designed for offline reinforcement learning.
We prove that QFIL gives us a safe policy improvement step with function approximation.
We see that QFIL performs well on the D4RL benchmark.
arXiv Detail & Related papers (2021-12-02T03:08:23Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how a pathwise interpretation of conditioning, which updates prior samples directly rather than re-sampling from conditioned marginals, gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors (see the sketch after this list).
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- An adaptive Hessian approximated stochastic gradient MCMC method [12.93317525451798]
We present an adaptive Hessian-approximated stochastic gradient MCMC method to incorporate local geometric information while sampling from the posterior.
We adopt a magnitude-based weight pruning method to enforce the sparsity of the network.
arXiv Detail & Related papers (2020-10-03T16:22:15Z)
- Kernel Autocovariance Operators of Stationary Processes: Estimation and Convergence [0.5505634045241288]
We consider autocovariance operators of a stationary process on a Polish space, embedded into a reproducing kernel Hilbert space.
We investigate how empirical estimates of these operators converge along realizations of the process under various conditions.
We provide applications of our theory in terms of consistency results for kernel PCA with dependent data and the conditional mean embedding of transition probabilities.
arXiv Detail & Related papers (2020-04-02T09:17:32Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows interleaved with stochastic sampling steps (stochastic normalizing flows, SNFs) can be used to learn the transformation of a simple prior distribution into a complex target distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks including applications to molecular sampling systems in equilibrium.
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
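As referenced in the Pathwise Conditioning entry above, here is a minimal sketch of pathwise Gaussian process conditioning via Matheron's rule: draw one joint prior sample at the training and test inputs, then shift it by the residual against the noisy observations. The function names and the RBF kernel choice are illustrative assumptions, not the papers' APIs.

    import numpy as np

    def rbf_gram(A, B, lengthscale=0.2):
        # Gram matrix of the Gaussian (RBF) kernel between rows of A and B.
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-0.5 * sq / lengthscale**2)

    def matheron_sample(x, y, xs, sigma=0.1, rng=None):
        # One posterior sample at test inputs xs given noisy observations (x, y):
        #   f* | y = f* + K(xs, x) (K(x, x) + sigma^2 I)^{-1} (y - f(x) - eps),
        # where (f(x), f*) is a joint prior draw and eps is simulated noise.
        rng = rng or np.random.default_rng()
        Z = np.vstack([x, xs])
        K = rbf_gram(Z, Z) + 1e-9 * np.eye(len(Z))               # jitter for Cholesky
        f = np.linalg.cholesky(K) @ rng.standard_normal(len(Z))  # joint prior draw
        fx, fs = f[:len(x)], f[len(x):]
        eps = sigma * rng.standard_normal(len(x))                # simulated obs noise
        A = rbf_gram(x, x) + sigma**2 * np.eye(len(x))
        return fs + rbf_gram(xs, x) @ np.linalg.solve(A, y - fx - eps)  # Matheron update

    x = np.linspace(0, 1, 10)[:, None]
    y = np.sin(6 * x[:, 0]) + 0.1 * np.random.default_rng(1).standard_normal(10)
    xs = np.linspace(0, 1, 200)[:, None]
    sample = matheron_sample(x, y, xs, rng=np.random.default_rng(2))

Note that the joint prior draw in this naive sketch is still cubic in the total number of points; the decoupled-sampling paper above avoids this by pairing a cheap random-feature approximation of the prior with this same exact update over the training points, which is the source of its speedup.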
This list is automatically generated from the titles and abstracts of the papers on this site.