Additive Poisson Process: Learning Intensity of Higher-Order Interaction in Stochastic Processes
- URL: http://arxiv.org/abs/2006.08982v1
- Date: Tue, 16 Jun 2020 08:25:36 GMT
- Title: Additive Poisson Process: Learning Intensity of Higher-Order Interaction in Stochastic Processes
- Authors: Simon Luo, Feng Zhou, Lamiae Azizi and Mahito Sugiyama
- Abstract summary: We present the Additive Poisson Process (APP), a novel framework that can model the higher-order interaction effects of the intensity functions in stochastic processes using lower-dimensional projections.
Our model combines techniques from information geometry, which model higher-order interactions on a statistical manifold, with generalized additive models, which use lower-dimensional projections to overcome the curse of dimensionality.
- Score: 10.439638982101181
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present the Additive Poisson Process (APP), a novel framework that can
model the higher-order interaction effects of the intensity functions in
stochastic processes using lower-dimensional projections. Our model combines techniques from information geometry, which model higher-order interactions on a statistical manifold, with generalized additive models, which use lower-dimensional projections to overcome the curse of dimensionality. Our approach solves a convex optimization problem by minimizing the KL divergence from a sample distribution in lower-dimensional projections to the distribution modeled by an intensity function in the stochastic process. Our empirical results show that our model can use samples observed in the lower-dimensional space to estimate the higher-order intensity function, even when observations are extremely sparse.
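As a rough illustration (not the authors' implementation), the sketch below fits a joint distribution on a discretized two-dimensional event space so that its one-dimensional projections match empirical marginals; with only first-order terms in a log-linear model, this KL-minimization reduces to classic iterative proportional fitting and the problem is convex. The grid, marginals, and function names are assumptions made for the example.

```python
import numpy as np

def fit_from_projections(marginal_x, marginal_y, n_iter=100, tol=1e-12):
    """Return a 2-D distribution whose axis sums match the given 1-D marginals."""
    p = np.full((len(marginal_x), len(marginal_y)), 1.0)
    p /= p.sum()                                    # start from the uniform model
    for _ in range(n_iter):
        p *= (marginal_x / p.sum(axis=1))[:, None]  # match the projection onto x
        p *= (marginal_y / p.sum(axis=0))[None, :]  # match the projection onto y
        if np.allclose(p.sum(axis=1), marginal_x, atol=tol):
            break
    return p

# Hypothetical sparse event counts binned on a 4 x 3 grid.
counts_x = np.array([5.0, 1.0, 2.0, 4.0])
counts_y = np.array([6.0, 3.0, 3.0])
joint = fit_from_projections(counts_x / counts_x.sum(), counts_y / counts_y.sum())
print(joint.sum(axis=1), joint.sum(axis=0))         # recovers both marginals
```

Rescaling the fitted distribution by the total event count and bin volume gives back an intensity estimate; the paper's contribution lies in handling higher-order projections on the statistical manifold rather than this two-dimensional toy case.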
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Probabilistic Reduced-Dimensional Vector Autoregressive Modeling with Oblique Projections [0.7614628596146602]
We propose a reduced-dimensional vector autoregressive model to extract low-dimensional dynamics from noisy data.
An optimal oblique decomposition is derived for the best predictability in terms of the prediction error covariance.
The superior performance and efficiency of the proposed approach are demonstrated using data sets from a synthesized Lorenz system and an industrial process from Eastman Chemical.
arXiv Detail & Related papers (2024-01-14T05:38:10Z)
- Subsurface Characterization using Ensemble-based Approaches with Deep Generative Models [2.184775414778289]
Inverse modeling is limited for ill-posed, high-dimensional applications due to computational costs and poor prediction accuracy with sparse datasets.
We combine the Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) and the Ensemble Smoother with Multiple Data Assimilation (ES-MDA).
WGAN-GP is trained to generate high-dimensional hydraulic conductivity (K) fields from a low-dimensional latent space, and ES-MDA updates the latent variables by assimilating available measurements.
arXiv Detail & Related papers (2023-10-02T01:27:10Z)
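The assimilation step described in the entry above can be sketched roughly as follows; this is not the authors' code, and `forward` stands in for a trained generator (such as a WGAN-GP decoder) composed with the measurement operator, both hypothetical placeholders here.

```python
import numpy as np
rng = np.random.default_rng(0)

def es_mda_update(Z, d_obs, forward, alpha, obs_std):
    """One ES-MDA iteration: update the latent ensemble Z (n_ens x n_latent)."""
    D = np.array([forward(z) for z in Z])           # predicted data, n_ens x n_obs
    Zc = Z - Z.mean(axis=0)
    Dc = D - D.mean(axis=0)
    C_zd = Zc.T @ Dc / (len(Z) - 1)                 # latent-data cross-covariance
    C_dd = Dc.T @ Dc / (len(Z) - 1)                 # data covariance
    R = (obs_std ** 2) * np.eye(len(d_obs))
    K = C_zd @ np.linalg.inv(C_dd + alpha * R)      # Kalman-like gain
    noise = np.sqrt(alpha) * obs_std * rng.standard_normal(D.shape)
    return Z + (d_obs + noise - D) @ K.T            # assimilate perturbed observations

# Toy run with a linear stand-in forward model so the behaviour is easy to check.
forward = lambda z: 3.0 * z[:2]
Z = rng.standard_normal((50, 4))
d_obs = np.array([1.5, -0.9])
for alpha in [4.0, 4.0, 4.0, 4.0]:                  # inflation factors, sum of 1/alpha = 1
    Z = es_mda_update(Z, d_obs, forward, alpha, obs_std=0.1)
print(3.0 * Z[:, :2].mean(axis=0))                  # predicted data close to d_obs
```

In the setting summarized above, the latent vectors would be decoded by the trained generator into high-dimensional fields after each update.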
- Fast Diffusion EM: a diffusion model for blind inverse problems with application to deconvolution [0.0]
Current methods assume the degradation to be known and provide impressive results in terms of restoration and diversity.
In this work, we leverage the efficiency of those models to jointly estimate the restored image and unknown parameters of the kernel model.
Our method alternates between approximating the expected log-likelihood of the problem, using samples drawn from a diffusion model, and estimating the unknown model parameters.
arXiv Detail & Related papers (2023-09-01T06:47:13Z)
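A schematic Monte Carlo EM loop in the spirit of the alternation described above is sketched below; it is not the paper's method. The diffusion-based posterior sampler is replaced by a hypothetical `sample_posterior` callable, and the unknown degradation parameter is taken to be a scalar noise level rather than a blur kernel for simplicity.

```python
import numpy as np

def mc_em_noise_level(y, sample_posterior, n_em_iters=10, n_samples=8, sigma0=1.0):
    """Estimate a noise level by alternating posterior sampling and an M-step update."""
    sigma = sigma0
    for _ in range(n_em_iters):
        # E-step: draw samples x ~ p(x | y, sigma) with the current parameter.
        xs = [sample_posterior(y, sigma) for _ in range(n_samples)]
        # M-step: maximize the expected complete-data log-likelihood over sigma;
        # for Gaussian noise this has the closed form below.
        sigma = np.sqrt(np.mean([np.mean((y - x) ** 2) for x in xs]))
    return sigma

# Toy usage with a trivial stand-in sampler (a diffusion-based sampler would go here).
rng = np.random.default_rng(0)
y = 0.5 * rng.standard_normal(1000)                 # observation of a zero signal
print(mc_em_noise_level(y, lambda y, s: np.zeros_like(y)))   # roughly 0.5
```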
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- A Model for Multi-View Residual Covariances based on Perspective Deformation [88.21738020902411]
We derive a model for the covariance of the visual residuals in multi-view SfM, odometry and SLAM setups.
We validate our model with synthetic and real data and integrate it into photometric and feature-based Bundle Adjustment.
arXiv Detail & Related papers (2022-02-01T21:21:56Z)
- Estimating Divergences in High Dimensions [6.172809837529207]
We propose the use of decomposable models for estimating divergences in high dimensional data.
These allow us to factorize the estimated density of the high-dimensional distribution into a product of lower dimensional functions.
We show empirically that estimating the Kullback-Leibler divergence with decomposable models fitted by maximum likelihood outperforms existing methods for divergence estimation.
arXiv Detail & Related papers (2021-12-08T20:37:28Z)
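To make the factorization above concrete, the sketch below uses a chain-structured decomposable model over discrete variables: when both distributions factor over the same chain, the high-dimensional KL divergence equals a sum of terms involving only single-variable and pairwise marginals. The construction is an illustrative assumption, not the paper's code.

```python
import numpy as np
rng = np.random.default_rng(0)

def random_chain(n_states, length):
    """Discrete Markov chain: initial distribution, transition tables, and full joint."""
    p1 = rng.dirichlet(np.ones(n_states))
    trans = [rng.dirichlet(np.ones(n_states), size=n_states) for _ in range(length - 1)]
    joint = p1.copy()
    for T in trans:
        joint = joint[..., None] * T                # joint(x1..xi) * T[x_i, x_{i+1}]
    return p1, trans, joint

def kl_full(p, q):
    return float(np.sum(p * np.log(p / q)))

def kl_decomposed(p1, trans_p, joint_p, q1, trans_q):
    """One low-dimensional term for the first variable, then one per chain edge."""
    kl = float(np.sum(p1 * np.log(p1 / q1)))
    for i, (Tp, Tq) in enumerate(zip(trans_p, trans_q)):
        other_axes = tuple(k for k in range(joint_p.ndim) if k not in (i, i + 1))
        pair = joint_p.sum(axis=other_axes)         # pairwise marginal p(x_i, x_{i+1})
        kl += float(np.sum(pair * np.log(Tp / Tq)))
    return kl

p1, trans_p, joint_p = random_chain(2, 4)
q1, trans_q, joint_q = random_chain(2, 4)
print(kl_full(joint_p, joint_q))                    # direct computation on the full joint
print(kl_decomposed(p1, trans_p, joint_p, q1, trans_q))  # same value from low-dim terms
```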
- Gaussian Function On Response Surface Estimation [12.35564140065216]
We propose a new framework for interpreting black-box machine learning models (both features and samples) via a metamodeling technique.
The metamodel is estimated from data generated by running computer experiments on the trained complex model over samples in the region of interest.
arXiv Detail & Related papers (2021-01-04T04:47:00Z)
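The general metamodeling recipe described above can be sketched as follows under simple assumptions: a hypothetical trained model is queried on samples in a region of interest and a Gaussian radial-basis-function response surface is fitted to its outputs. The basis choice and names are illustrative, not the paper's method.

```python
import numpy as np
rng = np.random.default_rng(0)

# Stand-in for a trained complex model to be interpreted (hypothetical).
black_box = lambda X: np.sin(3 * X[:, 0]) + X[:, 1] ** 2

def fit_gaussian_rbf(X, y, width=0.5, ridge=1e-8):
    """Fit y(x) ~ sum_j w_j * exp(-||x - X_j||^2 / (2 * width^2)) via a linear solve."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq_dists / (2 * width ** 2))
    w = np.linalg.solve(K + ridge * np.eye(len(X)), y)
    def predict(X_new):
        cross = ((X_new[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-cross / (2 * width ** 2)) @ w
    return predict

# "Computer experiment": query the trained model on samples in the region of interest.
X_train = rng.uniform(-1, 1, size=(200, 2))
surrogate = fit_gaussian_rbf(X_train, black_box(X_train))

X_test = rng.uniform(-1, 1, size=(5, 2))
print(np.c_[surrogate(X_test), black_box(X_test)])  # metamodel vs. original model
```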
- Fast approximations in the homogeneous Ising model for use in scene analysis [61.0951285821105]
We provide accurate approximations that make it possible to numerically calculate quantities needed in inference.
We show that our approximation formulae are scalable and unfazed by the size of the Markov Random Field.
The practical import of our approximation formulae is illustrated in performing Bayesian inference in a functional Magnetic Resonance Imaging activation detection experiment, and also in likelihood ratio testing for anisotropy in the spatial patterns of yearly increases in pistachio tree yields.
arXiv Detail & Related papers (2017-12-06T14:24:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.