The Ensemble Epanechnikov Mixture Filter
- URL: http://arxiv.org/abs/2408.11164v1
- Date: Tue, 20 Aug 2024 19:50:59 GMT
- Title: The Ensemble Epanechnikov Mixture Filter
- Authors: Andrey A. Popov, Renato Zanetti
- Abstract summary: We make use of the optimal Epanechnikov mixture kernel density estimate for the sequential filtering scenario through what we term the ensemble Epanechnikov mixture filter (EnEMF).
We show on a static example that the EnEMF is robust to growth in dimension, and also that the EnEMF has a significant reduction in error per particle on the 40-variable Lorenz '96 system.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the high-dimensional setting, Gaussian mixture kernel density estimates become increasingly suboptimal. In this work we aim to show that it is practical to instead use the optimal multivariate Epanechnikov kernel. We make use of this optimal Epanechnikov mixture kernel density estimate for the sequential filtering scenario through what we term the ensemble Epanechnikov mixture filter (EnEMF). We provide a practical implementation of the EnEMF that is as cost efficient as the comparable ensemble Gaussian mixture filter. We show on a static example that the EnEMF is robust to growth in dimension, and also that the EnEMF has a significant reduction in error per particle on the 40-variable Lorenz '96 system.
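The abstract contrasts Gaussian and Epanechnikov mixture kernel density estimates for ensemble filtering. As a rough, self-contained illustration only (not the paper's EnEMF update), the Python sketch below evaluates a generic multivariate Epanechnikov mixture kernel density estimate over an ensemble; the function name `epanechnikov_mixture_kde` and the Silverman-style bandwidth scaling are assumptions introduced here, not details taken from the paper.

```python
import numpy as np
from scipy.special import gamma


def epanechnikov_mixture_kde(x, ensemble, bandwidth_scale=None):
    """Evaluate a multivariate Epanechnikov mixture KDE at points x.

    ensemble        : (N, d) array of samples (e.g., a filter ensemble).
    x               : (M, d) array of evaluation points.
    bandwidth_scale : optional scalar multiplying the sample covariance to
                      form the bandwidth matrix H (a placeholder choice;
                      the paper derives its own optimal scaling).
    """
    N, d = ensemble.shape

    # Bandwidth matrix: scaled sample covariance (a simple, common choice).
    if bandwidth_scale is None:
        # Silverman-style rate N^(-2/(d+4)), used here only as a placeholder.
        bandwidth_scale = N ** (-2.0 / (d + 4))
    H = bandwidth_scale * np.cov(ensemble, rowvar=False) + 1e-12 * np.eye(d)
    L = np.linalg.cholesky(H)
    inv_L = np.linalg.inv(L)
    det_sqrt = np.prod(np.diag(L))  # |H|^(1/2)

    # Normalizing constant of the d-dimensional Epanechnikov kernel:
    # K(u) = c_d * (1 - ||u||^2) on the unit ball, c_d = (d + 2) / (2 * V_d),
    # where V_d is the volume of the unit d-ball.
    unit_ball_vol = np.pi ** (d / 2) / gamma(d / 2 + 1)
    c_d = (d + 2) / (2 * unit_ball_vol)

    # Whitened differences u = L^{-1} (x - ensemble member).
    diff = x[:, None, :] - ensemble[None, :, :]      # (M, N, d)
    u = np.einsum('ij,mnj->mni', inv_L, diff)        # (M, N, d)
    r2 = np.sum(u ** 2, axis=-1)                     # (M, N)
    kernel_vals = c_d * np.maximum(1.0 - r2, 0.0)    # zero outside unit ball
    return kernel_vals.mean(axis=1) / det_sqrt
```

A Gaussian mixture analogue would swap the truncated quadratic kernel for a multivariate normal density with the same bandwidth matrix, which is the baseline the abstract compares against via the ensemble Gaussian mixture filter.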
Related papers
- Spectral Mixture Kernels for Bayesian Optimization [3.8601741392210434]
We introduce a novel Gaussian Process-based BO method that incorporates spectral mixture kernels.
This method achieves a significant improvement in both efficiency and optimization performance.
We provide bounds on the information gain and cumulative regret associated with obtaining the optimum.
arXiv Detail & Related papers (2025-05-23T02:07:26Z)
- Kernel-Based Ensemble Gaussian Mixture Probability Hypothesis Density Filter [0.0]
The EnGM-PHD filter combines the Gaussian-mixture-based techniques of the GM-PHD filter with the particle-based techniques of the SMC-PHD filter.
The results indicate that the EnGM-PHD filter achieves better multi-target filtering performance than both the GM-PHD and SMC-PHD filters.
arXiv Detail & Related papers (2025-04-30T19:00:02Z)
- Adversarial Transform Particle Filters [11.330617592263744]
The particle filter (PF) and the ensemble Kalman filter (EnKF) are widely used for approximate inference in state-space models.
We propose the Adversarial Transform Particle Filter (ATPF), a novel filtering framework that combines the strengths of the PF and the EnKF through adversarial learning.
arXiv Detail & Related papers (2025-02-10T05:31:35Z)
- Adaptive Fuzzy C-Means with Graph Embedding [84.47075244116782]
Fuzzy clustering algorithms can be roughly categorized into two main groups: Fuzzy C-Means (FCM) based methods and mixture model based methods.
We propose a novel FCM-based clustering model that is capable of automatically learning an appropriate membership degree hyperparameter value.
arXiv Detail & Related papers (2024-05-22T08:15:50Z)
- Fast Semisupervised Unmixing Using Nonconvex Optimization [80.11512905623417]
We introduce a novel model for semi/library-based unmixing.
We demonstrate the efficacy of alternating methods for sparse unmixing.
arXiv Detail & Related papers (2024-01-23T10:07:41Z)
- Deep Gaussian Mixture Ensembles [9.673093148930874]
This work introduces a novel probabilistic deep learning technique called deep Gaussian mixture ensembles (DGMEs).
DGMEs are capable of approximating complex probability distributions, such as heavy-tailed or multimodal distributions.
Our experimental results demonstrate that DGMEs outperform state-of-the-art uncertainty quantifying deep learning models in handling complex predictive densities.
arXiv Detail & Related papers (2023-06-12T16:53:38Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- Training Compact CNNs for Image Classification using Dynamic-coded Filter Fusion [139.71852076031962]
We present a novel filter pruning method, dubbed dynamic-coded filter fusion (DCFF).
We derive compact CNNs in a computation-economical and regularization-free manner for efficient image classification.
Our DCFF derives a compact VGGNet-16 with only 72.77M FLOPs and 1.06M parameters while reaching top-1 accuracy of 93.47%.
arXiv Detail & Related papers (2021-07-14T18:07:38Z)
- Fairly Constricted Multi-Objective Particle Swarm Optimization [0.0]
We extend the state-of-the-art multi-objective optimization (MOO) solver SMPSO by incorporating exponentially-averaged momentum (EM) into it.
The proposed solver matches the performance of SMPSO across the ZDT, DTLZ and WFG problem suites and even outperforms it in certain instances.
arXiv Detail & Related papers (2021-04-10T14:39:59Z)
- Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z)
- Ensemble Kalman Variational Objectives: Nonlinear Latent Trajectory Inference with A Hybrid of Variational Inference and Ensemble Kalman Filter [0.0]
We propose the Ensemble Kalman Variational Objective (EnKO) to infer state-space models (SSMs).
Our proposed method can efficiently identify latent dynamics because of its particle diversity and unbiased gradient estimators.
We demonstrate that our EnKO outperforms SMC-based methods in terms of predictive ability and particle efficiency for three benchmark nonlinear system identification tasks.
arXiv Detail & Related papers (2020-10-17T07:01:06Z)
- Innovative And Additive Outlier Robust Kalman Filtering With A Robust Particle Filter [68.8204255655161]
We propose CE-BASS, a particle mixture Kalman filter which is robust to both innovative and additive outliers, and able to fully capture multi-modality in the distribution of the hidden state.
Furthermore, the particle sampling approach re-samples past states, which enables CE-BASS to handle innovative outliers that are not immediately visible in the observations, such as trend changes.
arXiv Detail & Related papers (2020-07-07T07:11:09Z)
- Gaussian Mixture Reduction with Composite Transportation Divergence [15.687740538194413]
We propose a novel optimization-based Gaussian mixture reduction (GMR) method built on the composite transportation divergence (CTD).
We develop a majorization-minimization algorithm for computing the reduced mixture and establish its theoretical convergence.
Our unified framework empowers users to select the most appropriate cost function in CTD to achieve superior performance.
arXiv Detail & Related papers (2020-02-19T19:52:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.