Filtration learning in exact multi-parameter persistent homology and classification of time-series data
- URL: http://arxiv.org/abs/2406.19587v2
- Date: Sun, 06 Oct 2024 06:53:39 GMT
- Title: Filtration learning in exact multi-parameter persistent homology and classification of time-series data
- Authors: Keunsu Kim, Jae-Hun Jung
- Abstract summary: We propose a framework for filtration learning of EMPH.
We derive the exact formula of the gradient of the loss function with respect to the filtration parameters.
- Score: 3.193388094899312
- Abstract: To analyze the topological properties of given discrete data, one needs to consider a continuous transform called a filtration. Persistent homology serves as a tool to track changes of homology along the filtration. The outcome of the topological analysis of data varies depending on the choice of filtration, making the selection of filtration crucial. Filtration learning is an attempt to find an optimal filtration that minimizes the loss function. Exact Multi-parameter Persistent Homology (EMPH) has recently been proposed, particularly for topological time-series analysis; it utilizes an exact formula for the rank invariant instead of computing it numerically. In this paper, we propose a framework for filtration learning of EMPH. We formulate an optimization problem and propose an algorithm for solving it. We then apply the proposed algorithm to several classification problems. In particular, we derive the exact formula of the gradient of the loss function with respect to the filtration parameters, which makes it possible to update the filtration directly without automatic differentiation, significantly enhancing the learning process.
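To make the pattern concrete, here is a minimal, self-contained sketch (not the authors' EMPH implementation) of the idea described above: a loss depends on filtration parameters, and the parameters are updated with an explicitly derived gradient rather than automatic differentiation. The Fourier-magnitude summary, the squared-error loss, and all names below are illustrative assumptions standing in for the actual rank-invariant vectorization.

```python
"""Toy sketch of filtration learning by direct gradient descent.

The 'filtration parameters' theta weight Fourier magnitudes of each time
series, and the gradient of the loss is written in closed form instead of
being obtained by automatic differentiation.  Everything here is a
simplifying assumption, not the EMPH pipeline itself.
"""
import numpy as np

rng = np.random.default_rng(0)

def fourier_magnitudes(x, n_freq=8):
    """First n_freq Fourier magnitudes of a 1-D time series, scaled by its length."""
    return np.abs(np.fft.rfft(x))[1:n_freq + 1] / len(x)

def loss_and_grad(theta, R, y):
    """Exact loss and gradient for the toy summary s = R @ theta (no autodiff)."""
    residual = R @ theta - y               # one summary value per series
    loss = np.mean(residual ** 2)
    grad = 2.0 * R.T @ residual / len(y)   # closed-form gradient of the mean squared error
    return loss, grad

# synthetic two-class time series with different dominant frequencies
t = np.linspace(0.0, 1.0, 128)
X = np.stack([np.sin(2 * np.pi * (1 + i % 2) * t) + 0.1 * rng.standard_normal(128)
              for i in range(64)])
y = np.array([i % 2 for i in range(64)], dtype=float)
R = np.stack([fourier_magnitudes(x) for x in X])

theta = 0.01 * rng.standard_normal(R.shape[1])
for _ in range(300):
    loss, grad = loss_and_grad(theta, R, y)
    theta -= 1.0 * grad                    # direct update of the filtration parameters
print("final training loss:", loss)
```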
Related papers
- Learning Optimal Filters Using Variational Inference [0.3749861135832072]
We present a framework for learning a parameterized analysis map - the map that takes a forecast distribution and observations to the filtering distribution.
We show that this methodology can be used to learn gain matrices for filtering linear and nonlinear dynamical systems.
Future work will apply this framework to learn new filtering algorithms.
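As a loose illustration of the gain-learning idea (not the variational-inference method of the cited paper), the sketch below fits a single scalar gain for a one-dimensional linear-Gaussian system by minimizing the filtering error on a simulated trajectory; the model, noise levels, and grid search are all assumptions.

```python
"""Sketch: learning a fixed filter gain K by minimizing filtering error."""
import numpy as np

rng = np.random.default_rng(1)
A, H = 0.95, 1.0          # assumed scalar dynamics and observation model
q, r = 0.1, 0.5           # process / observation noise standard deviations

def simulate(T=200):
    x, y = np.zeros(T), np.zeros(T)
    for t in range(1, T):
        x[t] = A * x[t - 1] + q * rng.standard_normal()
        y[t] = H * x[t] + r * rng.standard_normal()
    return x, y

def filter_mse(K, x, y):
    """Run x_hat <- A x_hat + K (y - H A x_hat) and return the mean squared error."""
    x_hat, err = 0.0, 0.0
    for t in range(1, len(y)):
        pred = A * x_hat
        x_hat = pred + K * (y[t] - H * pred)
        err += (x_hat - x[t]) ** 2
    return err / (len(y) - 1)

# crude 1-D search over the gain; a real method would instead optimize a
# parameterized analysis map, e.g. with gradients or variational inference
x, y = simulate()
gains = np.linspace(0.0, 1.0, 101)
best = min(gains, key=lambda K: filter_mse(K, x, y))
print("learned gain:", best)
```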
arXiv Detail & Related papers (2024-06-26T04:51:14Z) - Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z) - Implicit Maximum a Posteriori Filtering via Adaptive Optimization [4.767884267554628]
We frame the standard Bayesian filtering problem as optimization over a time-varying objective.
We show that our framework results in filters that are effective, robust, and scalable to high-dimensional systems.
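A minimal sketch of this viewpoint, with an assumed scalar model: each filtering step is replaced by a few gradient iterations on a MAP objective combining the observation likelihood and the transition prior. This only illustrates the optimization framing, not the adaptive-optimization filter of the cited paper.

```python
"""Sketch: Bayesian filtering posed as per-step MAP optimization.
At each time t the estimate minimizes
  J(x) = (y_t - h(x))^2 / (2 r^2) + (x - f(x_prev))^2 / (2 q^2)
via a few gradient steps.  Models and step sizes are assumptions.
"""
import numpy as np

rng = np.random.default_rng(2)
q, r = 0.1, 0.3

f = lambda x: 0.9 * x + 0.2 * np.sin(x)    # assumed nonlinear dynamics
h = lambda x: x                             # direct (noisy) observation

def map_step(x_prev, y, lr=0.01, iters=100):
    x = f(x_prev)                                        # start at the predicted state
    for _ in range(iters):
        # dJ/dx for the identity observation map (h'(x) = 1)
        grad = (h(x) - y) / r**2 + (x - f(x_prev)) / q**2
        x -= lr * grad
    return x

# run the optimization-based filter on a simulated trajectory
x_true, x_hat, estimates = 0.0, 0.0, []
for t in range(100):
    x_true = f(x_true) + q * rng.standard_normal()
    y = h(x_true) + r * rng.standard_normal()
    x_hat = map_step(x_hat, y)
    estimates.append(x_hat)
print("last estimate vs truth:", estimates[-1], x_true)
```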
arXiv Detail & Related papers (2023-11-17T15:30:44Z) - An Ensemble Score Filter for Tracking High-Dimensional Nonlinear Dynamical Systems [10.997994515823798]
We propose an ensemble score filter (EnSF) for solving high-dimensional nonlinear filtering problems.
Unlike existing diffusion models that train neural networks to approximate the score function, we develop a training-free score estimation.
EnSF provides surprisingly strong performance compared with the state-of-the-art Local Ensemble Transform Kalman Filter method.
arXiv Detail & Related papers (2023-09-02T16:48:02Z) - Adaptive Topological Feature via Persistent Homology: Filtration Learning for Point Clouds [13.098609653951893]
We propose a framework that learns a filtration adaptively with the use of neural networks.
We show a theoretical result on a finite-dimensional approximation of filtration functions, which justifies the proposed network architecture.
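The sketch below shows the basic ingredient of such an approach under simplifying assumptions: a small MLP assigns a learnable filtration value to every point of a point cloud. The persistence computation and the loss that would drive the learning are omitted, and the architecture is not the one proposed in the cited paper.

```python
"""Sketch: a learnable pointwise filtration function on point clouds."""
import numpy as np

rng = np.random.default_rng(3)

class PointwiseFiltration:
    """f_theta: R^2 -> R, applied to every point of a cloud."""
    def __init__(self, hidden=16):
        self.W1 = 0.5 * rng.standard_normal((2, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = 0.5 * rng.standard_normal(hidden)

    def __call__(self, points):
        h = np.tanh(points @ self.W1 + self.b1)   # (n_points, hidden)
        return h @ self.w2                         # one filtration value per point

# in a full pipeline these values would feed a sublevel-set persistence
# computation whose output enters the training loss
cloud = rng.standard_normal((100, 2))              # toy point cloud
values = PointwiseFiltration()(cloud)
print("filtration values:", values[:5])
```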
arXiv Detail & Related papers (2023-07-18T13:43:53Z) - Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
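The core object, a low-rank-plus-diagonal matrix of the form W W^T + diag(d), can be sketched as follows; the sizes, and the choice of which posterior matrix is factorized, are assumptions for illustration only.

```python
"""Sketch: a low-rank-plus-diagonal matrix, Sigma ~= W W^T + diag(d).
Storing (W, d) costs O(n k + n) instead of O(n^2), and matrix-vector
products cost O(n k).  Only the representation is shown; the cited paper
embeds such a factorization inside an extended Kalman filter update.
"""
import numpy as np

rng = np.random.default_rng(4)
n, k = 1000, 10                      # state dimension and rank (assumed sizes)
W = rng.standard_normal((n, k)) / np.sqrt(k)
d = np.full(n, 0.1)

def matvec(v):
    """(W W^T + diag(d)) @ v without forming the n-by-n matrix."""
    return W @ (W.T @ v) + d * v

v = rng.standard_normal(n)
print(np.allclose(matvec(v), (W @ W.T + np.diag(d)) @ v))
```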
arXiv Detail & Related papers (2023-05-31T03:48:49Z) - Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z) - Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z) - Fourier Series Expansion Based Filter Parametrization for Equivariant Convolutions [73.33133942934018]
A 2D filter parametrization technique plays an important role in designing equivariant convolutions.
A new equivariant convolution method based on the proposed filter parametrization, named F-Conv, is introduced.
F-Conv clearly outperforms previous filter-parametrization-based methods in image super-resolution tasks.
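For intuition, here is a toy Fourier-series filter parametrization under assumed conventions (basis size, normalization): the filter is a continuous function of the coordinates defined by learnable Fourier coefficients and sampled on a grid, so the same coefficients can be resampled under rotated coordinates, which is the property equivariant constructions such as F-Conv exploit. This is not the paper's exact F-Conv construction.

```python
"""Sketch: a 2-D convolution kernel parameterized by a truncated Fourier series."""
import numpy as np

def fourier_filter(a, b, size=5, period=2.0):
    """Sample f(x, y) = sum_{p,q} a_pq cos(phase) + b_pq sin(phase) on a grid in [-1, 1]^2."""
    K = a.shape[0]                                   # number of frequencies per axis
    coords = np.linspace(-1, 1, size)
    X, Y = np.meshgrid(coords, coords)
    filt = np.zeros((size, size))
    for p in range(K):
        for q in range(K):
            phase = 2 * np.pi * (p * X + q * Y) / period
            filt += a[p, q] * np.cos(phase) + b[p, q] * np.sin(phase)
    return filt

rng = np.random.default_rng(5)
a, b = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
print(fourier_filter(a, b).shape)        # (5, 5) learnable convolution kernel
```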
arXiv Detail & Related papers (2021-07-30T10:01:52Z) - Stability to Deformations of Manifold Filters and Manifold Neural Networks [89.53585099149973]
The paper defines and studies manifold (M) convolutional filters and neural networks (NNs).
The main technical contribution of the paper is to analyze the stability of manifold filters and MNNs to smooth deformations of the manifold.
arXiv Detail & Related papers (2021-06-07T15:41:03Z)