Distributed Poisson multi-Bernoulli filtering via generalised covariance intersection
- URL: http://arxiv.org/abs/2506.18397v1
- Date: Mon, 23 Jun 2025 08:32:16 GMT
- Title: Distributed Poisson multi-Bernoulli filtering via generalised covariance intersection
- Authors: Ángel F. García-Fernández, Giorgio Battistelli
- Abstract summary: This paper presents the distributed Poisson multi-Bernoulli (PMB) filter for distributed multi-object filtering. We approximate the power of a PMB density as an unnormalised PMB density, which corresponds to an upper bound of the PMB density. We show that the result is a Poisson multi-Bernoulli mixture (PMBM), which can be expressed in closed form.
- Score: 4.416774178832304
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents the distributed Poisson multi-Bernoulli (PMB) filter based on the generalised covariance intersection (GCI) fusion rule for distributed multi-object filtering. Since the exact GCI fusion of two PMB densities is intractable, we derive a principled approximation. Specifically, we approximate the power of a PMB density as an unnormalised PMB density, which corresponds to an upper bound of the PMB density. Then, the GCI fusion rule corresponds to the normalised product of two unnormalised PMB densities. We show that the result is a Poisson multi-Bernoulli mixture (PMBM), which can be expressed in closed form. Future prediction and update steps in each filter preserve the PMBM form, which can be projected back to a PMB density before the next fusion step. Experimental results show the benefits of this approach compared to other distributed multi-object filters.
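For context, the GCI fusion rule referenced in the abstract is the standard exponential-mixture (weighted geometric average) rule from the distributed fusion literature, written here in its general form rather than reproduced from the paper's PMB-specific derivation:

$$
f_{\mathrm{GCI}}(X) \;=\; \frac{f_1(X)^{\omega_1}\, f_2(X)^{\omega_2}}{\int f_1(X')^{\omega_1}\, f_2(X')^{\omega_2}\, \delta X'},
\qquad \omega_1 + \omega_2 = 1,\ \ \omega_1, \omega_2 > 0,
$$

where the denominator is a set integral over multi-object states. In the single-object linear-Gaussian special case, GCI reduces to classical covariance intersection; the sketch below illustrates only that special case, with a hypothetical function name and interface, and is not the paper's PMB algorithm:

```python
import numpy as np

def gci_fuse_gaussian(m1, P1, m2, P2, w1=0.5):
    # Illustrative single-object Gaussian special case of GCI fusion
    # (classical covariance intersection); NOT the paper's PMB derivation.
    w2 = 1.0 - w1
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(w1 * I1 + w2 * I2)        # fused covariance
    m = P @ (w1 * (I1 @ m1) + w2 * (I2 @ m2))   # fused mean
    return m, P
```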
Related papers
- Kernel-Based Ensemble Gaussian Mixture Probability Hypothesis Density Filter [0.0]
The EnGM-PHD filter combines the Gaussian-mixture-based techniques of the GM-PHD filter with the particle-based techniques of the SMC-PHD filter. The results indicate that the EnGM-PHD filter achieves better multi-target filtering performance than both the GM-PHD and SMC-PHD filters.
arXiv Detail & Related papers (2025-04-30T19:00:02Z)
- Poisson multi-Bernoulli mixture filter for trajectory measurements [3.604879434384177]
The trajectory measurement PMBM (TM-PMBM) filter propagates a PMBM density on the set of target states. The filter provides a closed-form solution for multi-target filtering based on sets of trajectory measurements.
arXiv Detail & Related papers (2025-04-11T10:27:07Z)
- Flow matching achieves almost minimax optimal convergence [50.38891696297888]
Flow matching (FM) has gained significant attention as a simulation-free generative model.
This paper discusses the convergence properties of FM for large sample sizes under the $p$-Wasserstein distance.
We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models.
arXiv Detail & Related papers (2024-05-31T14:54:51Z)
- Unraveling the Smoothness Properties of Diffusion Models: A Gaussian Mixture Perspective [18.331374727331077]
We provide a theoretical understanding of the Lipschitz continuity and second-moment properties of the diffusion process.
Our results provide deeper theoretical insights into the dynamics of the diffusion process under common data distributions.
arXiv Detail & Related papers (2024-05-26T03:32:27Z)
- Discrete Probabilistic Inference as Control in Multi-path Environments [84.67055173040107]
We consider the problem of sampling from a discrete and structured distribution as a sequential decision problem.
We show that GFlowNets learn a policy that samples objects proportionally to their reward by enforcing a conservation of flows.
We also prove that some flow-matching objectives found in the GFlowNet literature are in fact equivalent to well-established MaxEnt RL algorithms with a corrected reward.
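As background (standard in the GFlowNet literature, assumed here rather than quoted from the paper), flow conservation requires incoming and outgoing flow to balance at every non-terminal state $s$, which makes the terminal sampling distribution proportional to the reward:

$$
\sum_{(s' \rightarrow s) \in \mathcal{A}} F(s' \rightarrow s) \;=\; \sum_{(s \rightarrow s'') \in \mathcal{A}} F(s \rightarrow s''),
\qquad P^{\top}(x) \;\propto\; R(x).
$$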
arXiv Detail & Related papers (2024-02-15T20:20:35Z)
- Machine-Learned Exclusion Limits without Binning [0.0]
We extend the Machine-Learned Likelihoods (MLL) method by including Kernel Density Estimators (KDE) to extract one-dimensional signal and background probability density functions.
We apply the method to two cases of interest at the LHC: a search for exotic Higgs bosons, and a $Z'$ boson decaying into lepton pairs.
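For reference, the one-dimensional kernel density estimate used in such pipelines takes the standard form (kernel $K$, bandwidth $h$; notation assumed here, not taken from the paper):

$$
\hat{f}_h(x) \;=\; \frac{1}{n h} \sum_{i=1}^{n} K\!\left(\frac{x - x_i}{h}\right).
$$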
arXiv Detail & Related papers (2022-11-09T11:04:50Z)
- Optimality Guarantees for Particle Belief Approximation of POMDPs [55.83001584645448]
Partially observable Markov decision processes (POMDPs) provide a flexible representation for real-world decision and control problems.
POMDPs are notoriously difficult to solve, especially when the state and observation spaces are continuous or hybrid.
We propose a theory characterizing the approximation error of the particle filtering techniques that these algorithms use.
arXiv Detail & Related papers (2022-10-10T21:11:55Z)
- Matching Normalizing Flows and Probability Paths on Manifolds [57.95251557443005]
Continuous Normalizing Flows (CNFs) are generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
We propose to train CNFs by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path.
We show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks.
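As background, a CNF transports samples along the flow of a learned vector field $v_t$, with the log-density evolving by the instantaneous change-of-variables formula (standard CNF definitions, assumed here rather than quoted from the paper):

$$
\frac{d}{dt}\,\phi_t(x) = v_t\big(\phi_t(x)\big), \quad \phi_0(x) = x,
\qquad
\frac{d}{dt}\,\log p_t\big(\phi_t(x)\big) = -\,\nabla \cdot v_t\big(\phi_t(x)\big).
$$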
arXiv Detail & Related papers (2022-07-11T08:50:19Z)
- A Unified Framework for Multi-distribution Density Ratio Estimation [101.67420298343512]
Binary density ratio estimation (DRE) provides the foundation for many state-of-the-art machine learning algorithms.
We develop a general framework from the perspective of Bregman divergence minimization.
We show that our framework leads to methods that strictly generalize their counterparts in binary DRE.
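For reference, the Bregman divergence generated by a differentiable convex function $f$ is defined as (standard definition, not paper-specific notation):

$$
B_f(a, b) \;=\; f(a) - f(b) - \nabla f(b)^{\top}(a - b).
$$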
arXiv Detail & Related papers (2021-12-07T01:23:20Z)
- Sampling from high-dimensional, multimodal distributions using automatically tuned, tempered Hamiltonian Monte Carlo [0.0]
Hamiltonian Monte Carlo (HMC) is widely used for sampling from high-dimensional target distributions with probability density known up to proportionality.
Traditional tempering methods, commonly used to address multimodality, can be difficult to tune, particularly in high dimensions.
We propose a method that combines a tempering strategy with Hamiltonian Monte Carlo, enabling efficient sampling from high-dimensional, strongly multimodal distributions.
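As background, tempering flattens the target by raising it to an inverse temperature $\beta \in (0, 1]$, lowering the barriers between modes (a standard construction, assumed here rather than taken from the paper):

$$
\pi_\beta(x) \;\propto\; \pi(x)^{\beta}, \qquad 0 < \beta \le 1,
$$

with $\beta = 1$ recovering the original target.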
arXiv Detail & Related papers (2021-11-12T18:48:36Z)
- Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in the 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
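For context, the unadjusted Langevin Monte Carlo iteration analysed in this line of work takes the standard form (step size $\eta$, potential $U = -\log \pi$; notation assumed, not quoted from the paper):

$$
x_{k+1} \;=\; x_k - \eta\, \nabla U(x_k) + \sqrt{2\eta}\,\xi_k, \qquad \xi_k \sim \mathcal{N}(0, I).
$$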
arXiv Detail & Related papers (2021-09-08T18:00:05Z)
- A Poisson multi-Bernoulli mixture filter for coexisting point and extended targets [5.949779668853555]
This paper proposes a Poisson multi-Bernoulli mixture (PMBM) filter for coexisting point and extended targets.
As a computationally efficient approximation of the PMBM filter, we also develop a Poisson multi-Bernoulli (PMB) filter for coexisting point and extended targets.
arXiv Detail & Related papers (2020-11-09T14:41:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.