Learning Transfer Operators by Kernel Density Estimation
- URL: http://arxiv.org/abs/2210.03124v3
- Date: Thu, 27 Jul 2023 15:18:02 GMT
- Title: Learning Transfer Operators by Kernel Density Estimation
- Authors: Sudam Surasinghe, Jeremie Fish and Erik M. Bollt
- Abstract summary: We recast the problem within the framework of statistical density estimation.
We demonstrate the validity and effectiveness of this approach in estimating the eigenvectors of the Frobenius-Perron operator.
We suggest the possibility of incorporating other density estimation methods into this field.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inference of transfer operators from data is often formulated as a classical
problem that hinges on the Ulam method. The conventional description, known as
the Ulam-Galerkin method, involves projecting onto basis functions represented
as characteristic functions supported over a fine grid of rectangles. From this
perspective, the Ulam-Galerkin approach can be interpreted as density
estimation using the histogram method. In this study, we recast the problem
within the framework of statistical density estimation. This alternative
perspective allows for an explicit and rigorous analysis of bias and variance,
thereby facilitating a discussion on the mean square error. Through
comprehensive examples utilizing the logistic map and a Markov map, we
demonstrate the validity and effectiveness of this approach in estimating the
eigenvectors of the Frobenius-Perron operator. We compare the performance of
Histogram Density Estimation (HDE) and Kernel Density Estimation (KDE) methods
and find that KDE generally outperforms HDE in terms of accuracy. However, it
is important to note that KDE exhibits limitations around boundary points and
jumps. Based on our research findings, we suggest the possibility of
incorporating other density estimation methods into this field and propose
future investigations into the application of KDE-based estimation for
high-dimensional maps. These findings provide valuable insights for researchers
and practitioners working on estimating the Frobenius-Perron operator and
highlight the potential of density estimation techniques in this area of study.
Keywords: Transfer Operators; Frobenius-Perron operator; probability density
estimation; Ulam-Galerkin method; Kernel Density Estimation; Histogram Density
Estimation.
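The pipeline described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' code: it builds the Ulam-Galerkin (histogram) transition matrix for the fully chaotic logistic map x -> 4x(1-x), takes the leading left eigenvector as an estimate of the invariant density (known in closed form as 1/(pi*sqrt(x(1-x)))), and compares it with a plain Gaussian KDE of the same orbit. The bin count, orbit length, and bandwidth rule are arbitrary illustrative choices.

```python
import numpy as np

def logistic(x):
    """Fully chaotic logistic map x -> 4 x (1 - x) on [0, 1]."""
    return 4.0 * x * (1.0 - x)

def ulam_matrix(orbit, n_bins):
    """Ulam-Galerkin (histogram) estimate of the Frobenius-Perron matrix.

    P[i, j] estimates the probability that a point in bin i maps into bin j.
    """
    idx = np.minimum((orbit * n_bins).astype(int), n_bins - 1)
    P = np.zeros((n_bins, n_bins))
    for i, j in zip(idx[:-1], idx[1:]):
        P[i, j] += 1.0
    rows = P.sum(axis=1, keepdims=True)
    rows[rows == 0.0] = 1.0                # guard against empty bins
    return P / rows

def gaussian_kde(samples, grid, h):
    """Plain Gaussian kernel density estimate on `grid` with bandwidth `h`."""
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z ** 2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
orbit = np.empty(100_000)
orbit[0] = rng.uniform(0.1, 0.9)
for t in range(1, orbit.size):
    orbit[t] = logistic(orbit[t - 1])
orbit = orbit[100:]                        # discard a short transient

n_bins = 50
centers = (np.arange(n_bins) + 0.5) / n_bins
exact = 1.0 / (np.pi * np.sqrt(centers * (1.0 - centers)))  # known invariant density

# HDE route: the leading left eigenvector of the Ulam matrix approximates
# the invariant density of the map.
P = ulam_matrix(orbit, n_bins)
evals, evecs = np.linalg.eig(P.T)
lead = np.abs(evecs[:, np.argmax(evals.real)].real)
hde_density = lead / lead.sum() * n_bins   # normalise to integrate to 1

# KDE route: smooth estimate from a thinned sample with Silverman's rule.
sample = orbit[::10]
h = 1.06 * sample.std() * sample.size ** -0.2
kde_density = gaussian_kde(sample, centers, h)
```

In the interior of [0, 1] both estimates track the arcsine-shaped invariant density; near the endpoints, where the density diverges, the KDE estimate degrades, consistent with the boundary limitation noted in the abstract.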
Related papers
- Tracking Dynamic Gaussian Density with a Theoretically Optimal Sliding
Window Approach [16.45003200150227]
We study the exact mean integrated squared error (MISE) of "sliding window" Kernel Density Estimators for evolving densities.
We present empirical evidence with synthetic datasets to show that our weighting scheme improves the tracking performance.
arXiv Detail & Related papers (2024-03-11T23:21:26Z)
- Variational Weighting for Kernel Density Ratios [11.555375654882525]
Kernel density estimation (KDE) is integral to a range of generative and discriminative tasks in machine learning.
We derive an optimal weight function that reduces bias in standard kernel density estimates for density ratios, leading to improved estimates of prediction posteriors and information-theoretic measures.
arXiv Detail & Related papers (2023-11-06T10:12:19Z)
- Sobolev Space Regularised Pre Density Models [51.558848491038916]
We propose a new approach to non-parametric density estimation that is based on regularizing a Sobolev norm of the density.
The method is statistically consistent and makes the inductive bias of the model clear and interpretable.
arXiv Detail & Related papers (2023-07-25T18:47:53Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Fast Kernel Density Estimation with Density Matrices and Random Fourier Features [0.0]
Kernel density estimation (KDE) is one of the most widely used nonparametric density estimation methods.
The proposed method, DMKDE, uses density matrices, a quantum mechanical formalism, and random Fourier features, an explicit kernel approximation, to produce density estimates.
DMKDE is on par with its competitors for computing density estimates and shows clear advantages on high-dimensional data.
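The random-Fourier-feature mechanism behind DMKDE can be sketched generically. This is not the DMKDE implementation (the density-matrix part is omitted), and the bandwidth and feature count are arbitrary choices: features whose inner products approximate a Gaussian kernel reduce KDE evaluation to a single dot product per query point.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=300)         # synthetic 1-D sample
grid = np.linspace(-3.0, 3.0, 101)
sigma = 0.3                                   # kernel bandwidth (arbitrary choice)

# Exact Gaussian KDE on the grid, for reference.
z = (grid[:, None] - data[None, :]) / sigma
kde_exact = np.exp(-0.5 * z ** 2).mean(axis=1) / (sigma * np.sqrt(2.0 * np.pi))

# Random Fourier features: phi(x) . phi(y) approximates the Gaussian kernel
# exp(-(x - y)^2 / (2 sigma^2)) in expectation (Bochner construction).
D = 20_000
w = rng.normal(0.0, 1.0 / sigma, size=D)
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def features(x):
    return np.sqrt(2.0 / D) * np.cos(np.outer(x, w) + b)

# One pass over the data summarises it as a single mean feature vector;
# each query then costs one dot product instead of a sum over all samples.
mean_phi = features(data).mean(axis=0)
kde_rff = features(grid) @ mean_phi / (sigma * np.sqrt(2.0 * np.pi))
```

With a large feature count the approximation tracks the exact KDE closely while decoupling query cost from the sample size.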
arXiv Detail & Related papers (2022-08-02T02:11:10Z)
- Quantum Adaptive Fourier Features for Neural Density Estimation [0.0]
This paper presents a method for neural density estimation that can be seen as a type of kernel density estimation.
The method is based on density matrices, a formalism used in quantum mechanics, and adaptive Fourier features.
The method was evaluated in different synthetic and real datasets, and its performance compared against state-of-the-art neural density estimation methods.
arXiv Detail & Related papers (2022-08-01T01:39:11Z)
- TAKDE: Temporal Adaptive Kernel Density Estimator for Real-Time Dynamic Density Estimation [16.45003200150227]
We propose the temporal adaptive kernel density estimator (TAKDE).
TAKDE is theoretically optimal in terms of the worst-case AMISE.
We provide numerical experiments using synthetic and real-world datasets, showing that TAKDE outperforms other state-of-the-art dynamic density estimators.
arXiv Detail & Related papers (2022-03-15T23:38:32Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space.
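The invariance appealed to in the last sentence can be checked on a toy example. Here a hypothetical affine map z = 2x + 1 stands in for the invertible generative model: by the change-of-variables formula, each transformed density picks up the same Jacobian factor, which cancels in the ratio.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-3.0, 3.0, 61)
ratio_input = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 0.5, 1.2)

# Toy invertible "feature map" z = 2x + 1. Each transformed density picks up
# the factor 1 / |dz/dx| = 1/2, which cancels when the ratio is formed.
z = 2.0 * x + 1.0
x_back = (z - 1.0) / 2.0                      # inverse map
p_feat = normal_pdf(x_back, 0.0, 1.0) / 2.0
q_feat = normal_pdf(x_back, 0.5, 1.2) / 2.0
ratio_feature = p_feat / q_feat
```

The two ratio curves agree to machine precision, which is exactly why estimating the ratio in a well-chosen feature space loses nothing in input space.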
arXiv Detail & Related papers (2021-07-05T18:30:26Z)
- GBHT: Gradient Boosting Histogram Transform for Density Estimation [73.94900378709023]
We propose a density estimation algorithm called Gradient Boosting Histogram Transform (GBHT).
We make the first attempt to theoretically explain why boosting can enhance the performance of its base learners for density estimation problems.
arXiv Detail & Related papers (2021-06-10T13:40:28Z)
- Nonparametric Density Estimation from Markov Chains [68.8204255655161]
We introduce a new nonparametric density estimator inspired by Markov chains, generalizing the well-known Kernel Density Estimator.
Our estimator presents several benefits with respect to the usual ones and can be used straightforwardly as a foundation in all density-based algorithms.
arXiv Detail & Related papers (2020-09-08T18:33:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.