Differentiable Particle Filtering without Modifying the Forward Pass
- URL: http://arxiv.org/abs/2106.10314v1
- Date: Fri, 18 Jun 2021 18:58:52 GMT
- Title: Differentiable Particle Filtering without Modifying the Forward Pass
- Authors: Adam Ścibior, Vaden Masrani, Frank Wood
- Abstract summary: We show how to obtain unbiased estimators of the gradient of the marginal likelihood by only modifying messages used in backpropagation.
We call it stop-gradient resampling, since it can easily be implemented with automatic differentiation libraries.
- Score: 21.430102374292666
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In recent years particle filters have been used as components in systems
optimized end-to-end with gradient descent. However, the resampling step in a
particle filter is not differentiable, which biases gradients and interferes
with optimization. To remedy this problem, several differentiable variants of
resampling have been proposed, all of which modify the behavior of the particle
filter in significant and potentially undesirable ways. In this paper, we show
how to obtain unbiased estimators of the gradient of the marginal likelihood by
only modifying messages used in backpropagation, leaving the standard forward
pass of a particle filter unchanged. Our method is simple to implement, has a
low computational overhead, does not introduce additional hyperparameters, and
extends to derivatives of higher orders. We call it stop-gradient resampling,
since it can easily be implemented with automatic differentiation libraries
using the stop-gradient operator instead of explicitly modifying the backward
messages.
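The abstract only names the stop-gradient operator, so the following is a minimal sketch of how a resampling step in this spirit might look in JAX; the function name, the choice of multinomial resampling, and the exact placement of the w / stop_gradient(w) correction are illustrative assumptions rather than the authors' reference implementation.

```python
import jax
import jax.numpy as jnp
from jax.lax import stop_gradient

def stop_gradient_resample(key, particles, log_weights):
    """Illustrative stop-gradient resampling step (sketch, not the paper's code)."""
    # Normalized importance weights; the sampling itself is kept out of autodiff.
    w = jax.nn.softmax(log_weights)
    n = log_weights.shape[0]
    ancestors = jax.random.choice(key, n, shape=(n,), p=stop_gradient(w))
    new_particles = particles[ancestors]

    # Correction term log(w / stop_gradient(w)) for each chosen ancestor.
    # Its forward value is exactly zero, so the filter's forward pass is
    # unchanged, but it injects d log w into the backward messages.
    correction = jnp.log(w[ancestors]) - jnp.log(stop_gradient(w[ancestors]))

    # Weights are reset to uniform, as in ordinary resampling.
    new_log_weights = jnp.full(n, -jnp.log(n)) + correction
    return new_particles, new_log_weights
```

In a full filter, the per-step logsumexp of the pre-resampling log weights would be accumulated into a log marginal likelihood estimate and differentiated with jax.grad; because the correction is identically zero in the forward pass, that estimate is still the ordinary particle filter estimate.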
Related papers
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z) - Implicit Maximum a Posteriori Filtering via Adaptive Optimization [4.767884267554628]
We frame the standard Bayesian filtering problem as optimization over a time-varying objective.
We show that our framework results in filters that are effective, robust, and scalable to high-dimensional systems.
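As a rough illustration of the filtering-as-optimization framing (and not the paper's actual algorithm or model), a single MAP-style update for an assumed Gaussian transition and linear-Gaussian observation model can be written as gradient descent on a per-time-step objective; the model structure, step size, and plain gradient descent below stand in for the adaptive optimizers discussed in the paper.

```python
import numpy as np

def map_filter_step(y_t, x_prev, f, H, Q_inv, R_inv, steps=200, lr=0.05):
    """One MAP filtering step posed as optimization (illustrative sketch).

    Minimizes over x_t the negative log of p(y_t | x_t) * p(x_t | x_prev)
    for Gaussian transition noise (precision Q_inv) around the transition
    mean f(x_prev), and a linear-Gaussian observation y_t = H x_t + noise
    (precision R_inv).
    """
    x = f(x_prev).copy()  # initialize at the predicted mean
    for _ in range(steps):
        grad = Q_inv @ (x - f(x_prev)) - H.T @ (R_inv @ (y_t - H @ x))
        x = x - lr * grad
    return x
```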
arXiv Detail & Related papers (2023-11-17T15:30:44Z) - Nonlinear Filtering with Brenier Optimal Transport Maps [4.745059103971596]
This paper is concerned with the problem of nonlinear filtering, i.e., computing the conditional distribution of the state of a dynamical system.
Conventional sequential importance resampling (SIR) particle filters suffer from fundamental limitations in scenarios involving degenerate likelihoods or high-dimensional states.
In this paper, we explore an alternative method, which is based on estimating the Brenier optimal transport (OT) map from the current prior distribution of the state to the posterior distribution at the next time step.
arXiv Detail & Related papers (2023-10-21T01:34:30Z) - Computational Doob's h-transforms for Online Filtering of Discretely
Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z) - Reverse image filtering using total derivative approximation and
accelerated gradient descent [82.93345261434943]
We address a new problem of reversing the effect of an image filter, which can be linear or nonlinear.
The assumption is that the algorithm of the filter is unknown and the filter is available as a black box.
We formulate this inverse problem as minimizing a local patch-based cost function and use total derivative to approximate the gradient which is used in gradient descent to solve the problem.
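Read literally, that recipe is gradient descent on a reconstruction cost in which the filter's total derivative is approximated rather than computed; the sketch below shows the simplest version of the idea, with the Jacobian replaced by the identity, a whole-image rather than patch-based cost, and plain rather than accelerated descent, so it illustrates the approach rather than reproducing the paper's algorithm.

```python
import numpy as np

def reverse_filter(y, filt, steps=100, step_size=1.0):
    """Undo the effect of a black-box image filter `filt` (sketch).

    Gradient descent on 0.5 * ||filt(x) - y||^2 where the filter's total
    derivative is approximated by the identity, so each update reduces to
    x <- x - step_size * (filt(x) - y).
    """
    y = np.asarray(y, dtype=np.float64)
    x = y.copy()
    for _ in range(steps):
        x = x - step_size * (filt(x) - y)
    return x
```

A mild smoothing filter, such as a moderate Gaussian blur wrapped as `filt`, is the typical test case for schemes of this kind.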
arXiv Detail & Related papers (2021-12-08T05:16:11Z) - Variational Marginal Particle Filters [38.94802937100392]
Variational inference for state space models (SSMs) is known to be hard in general.
Recent works focus on deriving variational objectives for SSMs from unbiased sequential Monte Carlo estimators.
We propose the variational marginal particle filter (VMPF).
arXiv Detail & Related papers (2021-09-30T13:55:16Z) - Fourier Series Expansion Based Filter Parametrization for Equivariant Convolutions [73.33133942934018]
The 2D filter parametrization technique plays an important role in designing equivariant convolutions.
A new equivariant convolution method, named F-Conv, is built on the proposed filter parametrization.
F-Conv clearly outperforms previous filter-parametrization-based methods on the image super-resolution task.
arXiv Detail & Related papers (2021-07-30T10:01:52Z) - Unsharp Mask Guided Filtering [53.14430987860308]
The goal of this paper is guided image filtering, which emphasizes the importance of structure transfer during filtering.
We propose a new and simplified formulation of the guided filter inspired by unsharp masking.
Our formulation enjoys a filtering prior from a low-pass filter and enables explicit structure transfer by estimating a single coefficient.
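For reference, classic unsharp masking, the operation this formulation is inspired by, simply re-injects a scaled high-frequency residual on top of a low-pass filtered image; the snippet below shows that baseline only (with SciPy's Gaussian blur as the low-pass filter), not the guided-filter formulation proposed in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=1.0):
    """Classic unsharp masking: add back a scaled high-frequency residual."""
    image = np.asarray(image, dtype=np.float64)
    low_pass = gaussian_filter(image, sigma=sigma)   # the low-pass prior
    # A single coefficient scales the structure that is re-injected.
    return image + amount * (image - low_pass)
```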
arXiv Detail & Related papers (2021-06-02T19:15:34Z) - Variational Transport: A Convergent Particle-BasedAlgorithm for Distributional Optimization [106.70006655990176]
Distributional optimization problems arise widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed as variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective function satisfies a functional version of the Polyak-Łojasiewicz (PL) condition (Polyak, 1963) and smoothness conditions, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z)