Particle Filter Bridge Interpolation
- URL: http://arxiv.org/abs/2103.14963v1
- Date: Sat, 27 Mar 2021 18:33:00 GMT
- Title: Particle Filter Bridge Interpolation
- Authors: Adam Lindhe, Carl Ringqvist and Henrik Hult
- Abstract summary: We build on a previously introduced method for generating canonical, dimension independent, stochastic interpolations.
We introduce a discriminator network that accurately identifies areas of high latent representation density.
The resulting sampling procedure allows for greater variability in paths and stronger drift towards areas of high data density.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Autoencoding models have been extensively studied in recent years. They
provide an efficient framework for sample generation, as well as for analysing
feature learning. Furthermore, they are efficient in performing interpolations
between data-points in semantically meaningful ways. In this paper, we build
further on a previously introduced method for generating canonical, dimension
independent, stochastic interpolations. Here, the distribution of interpolation
paths is represented as the distribution of a bridge process constructed from
an artificial random data generating process in the latent space, having the
prior distribution as its invariant distribution. As a result, the stochastic
interpolation paths tend to reside in regions of the latent space where the
prior has high mass. This is a desirable feature since, generally, such areas
produce semantically meaningful samples. In this paper, we extend the bridge
process method by introducing a discriminator network that accurately
identifies areas of high latent representation density. The discriminator
network is incorporated as a change of measure of the underlying bridge process
and sampling of interpolation paths is implemented using sequential Monte
Carlo. The resulting sampling procedure allows for greater variability in
interpolation paths and stronger drift towards areas of high data density.
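The abstract describes the sampling procedure at a high level: a latent bridge process whose invariant distribution is the prior, a discriminator acting as a change of measure, and sequential Monte Carlo to sample interpolation paths. The snippet below is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes a standard Gaussian prior, an Ornstein-Uhlenbeck-style latent process with a simplified bridge discretization, and a pre-trained `discriminator` callable returning positive weights; all names and parameters are hypothetical.

```python
# Hypothetical sketch of discriminator-weighted bridge interpolation via
# sequential Monte Carlo. Assumptions (not taken from the paper's code):
# a standard Gaussian prior, an Ornstein-Uhlenbeck latent process, and a
# discriminator that scores how well a latent point matches the data's
# latent representation density.
import numpy as np

def smc_bridge_interpolation(z0, z1, discriminator, n_particles=256,
                             n_steps=50, rng=None):
    """Sample one interpolation path between latent points z0 and z1.

    discriminator: callable mapping an (N, d) array of latent points to
    positive weights (high where the latent representation density is high).
    """
    rng = np.random.default_rng(rng)
    d = z0.shape[0]
    dt = 1.0 / n_steps
    particles = np.tile(z0, (n_particles, 1))  # all particles start at z0
    path = [z0.copy()]
    for k in range(1, n_steps + 1):
        remaining = 1.0 - (k - 1) * dt  # time left until the bridge is pinned
        # Bridge-style drift: an OU pull toward the prior mode (the origin)
        # plus a pull toward the endpoint z1 so the path lands at z1 at t = 1.
        drift = -particles * dt + (z1 - particles) * (dt / remaining)
        noise_std = np.sqrt(2.0 * dt * (remaining - dt) / remaining)
        particles = particles + drift + noise_std * rng.standard_normal((n_particles, d))
        # Change of measure: reweight particles by the discriminator and
        # resample, concentrating paths in regions of high latent data density.
        w = np.asarray(discriminator(particles), dtype=float) + 1e-12
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        path.append(particles.mean(axis=0))
    path[-1] = z1.copy()  # the bridge is pinned at z1
    return np.stack(path)
```

Given an autoencoder, one would call `smc_bridge_interpolation(encoder(x0), encoder(x1), discriminator)` and decode each point of the returned path; tracing the ancestry of a single particle, rather than averaging the particle cloud at each step, would yield one genuinely sampled path.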
Related papers
- A Bayesian Approach Toward Robust Multidimensional Ellipsoid-Specific Fitting [0.0]
This work presents a novel and effective method for fitting multidimensional ellipsoids to scattered data contaminated by noise and outliers.
We incorporate a uniform prior distribution to constrain the search for primitive parameters within an ellipsoidal domain.
We apply it to a wide range of practical applications such as microscopy cell counting, 3D reconstruction, geometric shape approximation, and magnetometer calibration tasks.
arXiv Detail & Related papers (2024-07-27T14:31:51Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - An attempt to generate new bridge types from latent space of generative
flow [2.05750372679553]
The basic principle of normalizing flow is introduced in a simple and concise manner.
Treating the dataset as a sample drawn from the population, obtaining a normalizing flow essentially amounts to a sampling survey.
Using a symmetric structured image dataset of three-span beam bridges, arch bridges, cable-stayed bridges and suspension bridges, a reversible normalizing flow is constructed based on the Glow API in the library.
arXiv Detail & Related papers (2024-01-18T06:26:44Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Efficient Large-scale Nonstationary Spatial Covariance Function
Estimation Using Convolutional Neural Networks [3.5455896230714194]
We use ConvNets to derive subregions from the nonstationary data.
We employ a selection mechanism to identify subregions that exhibit similar behavior to stationary fields.
We assess the performance of the proposed method with synthetic and real datasets at a large scale.
arXiv Detail & Related papers (2023-06-20T12:17:46Z) - Normalizing flow sampling with Langevin dynamics in the latent space [12.91637880428221]
Normalizing flows (NF) use a continuous generator to map a simple latent (e.g. Gaussian) distribution towards an empirical target distribution associated with a training data set.
Since standard NFs implement differentiable maps, they may suffer from pathological behaviors when targeting complex distributions.
This paper proposes a new Markov chain Monte Carlo algorithm to sample from the target distribution in the latent domain before transporting it back to the target domain.
arXiv Detail & Related papers (2023-05-20T09:31:35Z) - Conditioning Normalizing Flows for Rare Event Sampling [61.005334495264194]
We propose a transition path sampling scheme based on neural-network generated configurations.
We show that this approach enables the resolution of both the thermodynamics and kinetics of the transition region.
arXiv Detail & Related papers (2022-07-29T07:56:10Z) - Density Ratio Estimation via Infinitesimal Classification [85.08255198145304]
We propose DRE-infty, a divide-and-conquer approach to reduce Density ratio estimation (DRE) to a series of easier subproblems.
Inspired by Monte Carlo methods, we smoothly interpolate between the two distributions via an infinite continuum of intermediate bridge distributions.
We show that our approach performs well on downstream tasks such as mutual information estimation and energy-based modeling on complex, high-dimensional datasets.
arXiv Detail & Related papers (2021-11-22T06:26:29Z) - Deep Shells: Unsupervised Shape Correspondence with Optimal Transport [52.646396621449]
We propose a novel unsupervised learning approach to 3D shape correspondence.
We show that the proposed method significantly improves over the state-of-the-art on multiple datasets.
arXiv Detail & Related papers (2020-10-28T22:24:07Z) - Spatially Adaptive Inference with Stochastic Feature Sampling and
Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z) - Diversity sampling is an implicit regularization for kernel methods [13.136143245702915]
We show that Nyström kernel regression with diverse landmarks increases the accuracy of the regression in sparser regions of the dataset.
A greedy algorithm is also proposed to select diverse samples of significant size within large datasets when exact DPP sampling is not practically feasible.
arXiv Detail & Related papers (2020-02-20T08:24:42Z)