Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data
- URL: http://arxiv.org/abs/2502.16049v1
- Date: Sat, 22 Feb 2025 02:53:26 GMT
- Title: Quasi Zigzag Persistence: A Topological Framework for Analyzing Time-Varying Data
- Authors: Tamal K. Dey, Shreyas N. Samaga
- Abstract summary: We propose Quasi Zigzag Persistent Homology (QZPH) as a framework for analyzing time-varying data.
We introduce a stable topological invariant that captures both static and dynamic features at different scales.
We show that it improves machine learning models when applied to tasks such as sleep-stage detection.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose Quasi Zigzag Persistent Homology (QZPH) as a framework for analyzing time-varying data by integrating multiparameter persistence and zigzag persistence. To this end, we introduce a stable topological invariant that captures both static and dynamic features at different scales. We present an algorithm to compute this invariant efficiently. We show that it improves machine learning models when applied to tasks such as sleep-stage detection, demonstrating its effectiveness at capturing evolving patterns in time-varying datasets.
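As a rough, self-contained illustration of the kind of "static features per time slice" that zigzag-style pipelines track (this is not the QZPH algorithm, and the function names are hypothetical), the 0-dimensional features of a time-varying point cloud at a fixed scale `eps` can be counted with union-find:

```python
from itertools import combinations

def connected_components(points, eps):
    """Count clusters: points within distance eps are joined
    (single-linkage, i.e. the 0-th Betti number of the graph at scale eps)."""
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i, j in combinations(range(len(points)), 2):
        if sum((a - b) ** 2 for a, b in zip(points[i], points[j])) <= eps * eps:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
    return len({find(i) for i in range(len(points))})

def betti0_over_time(slices, eps):
    """0-dimensional feature count (number of clusters) per time slice."""
    return [connected_components(pts, eps) for pts in slices]

# Two clusters drifting together over three time steps.
slices = [
    [(0.0, 0.0), (0.1, 0.0), (3.0, 0.0), (3.1, 0.0)],
    [(0.0, 0.0), (0.1, 0.0), (1.5, 0.0), (1.6, 0.0)],
    [(0.0, 0.0), (0.1, 0.0), (0.4, 0.0), (0.5, 0.0)],
]
print(betti0_over_time(slices, eps=0.5))  # → [2, 2, 1]
```

QZPH goes further: it varies both the scale parameter and time (multiparameter persistence) and matches features across slices through zigzag inclusions, rather than counting each slice independently as this sketch does.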
Related papers
- Spatiodynamic inference using vision-based generative modelling [0.5461938536945723]
We develop a simulation-based inference framework that employs vision transformer-driven encoded variational representations.
The central idea is to construct a fine-grained, structured mesh of latent dynamics through systematic exploration of the parameter space.
By integrating generative modeling with mechanistic principles, our approach provides a unified inference framework.
arXiv Detail & Related papers (2025-07-29T22:10:50Z)
- Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [55.09326865401653]
We introduce FNF as the backbone and DBD as architecture to provide excellent learning capabilities and optimal learning pathways for spatial-temporal modeling.
We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
arXiv Detail & Related papers (2025-06-10T18:40:20Z)
- Persistent Topological Features in Large Language Models [0.6597195879147556]
We introduce topological descriptors that measure how topological features, $p$-dimensional holes, persist and evolve throughout the layers.
This offers a statistical perspective on how prompts are rearranged and their relative positions changed in the representation space.
As a showcase application, we use zigzag persistence to establish a criterion for layer pruning, achieving results comparable to state-of-the-art methods.
arXiv Detail & Related papers (2024-10-14T19:46:23Z)
- Neural Persistence Dynamics [8.197801260302642]
We consider the problem of learning the dynamics in the topology of time-evolving point clouds.
Our proposed model, Neural Persistence Dynamics, substantially outperforms the state-of-the-art across a diverse set of parameter regression tasks.
arXiv Detail & Related papers (2024-05-24T17:20:18Z)
- Equivariant Spatio-Temporal Attentive Graph Networks to Simulate Physical Dynamics [32.115887916401036]
We develop an equivariant version of spatio-temporal GNNs to represent and simulate the dynamics of physical systems.
We evaluate our model on three real datasets at the molecular, protein, and macro levels.
arXiv Detail & Related papers (2024-05-21T15:33:21Z)
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Latent variable model for high-dimensional point process with structured missingness [4.451479907610764]
Longitudinal data are important in numerous fields, such as healthcare, sociology and seismology.
Real-world datasets can be high-dimensional, contain structured missingness patterns, and measurement time points can be governed by an unknown process.
We propose a flexible and efficient latent-variable model that is capable of addressing all these limitations.
arXiv Detail & Related papers (2024-02-08T15:41:48Z)
- Topological Quality of Subsets via Persistence Matching Diagrams [0.196629787330046]
We measure the quality of a subset relative to the dataset it represents using topological data analysis techniques.
In particular, this approach enables us to explain why the chosen subset is likely to result in poor performance of a supervised learning model.
arXiv Detail & Related papers (2023-06-04T17:08:41Z)
- Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series [18.885471782270375]
NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables.
We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference.
Empirical results on multiple benchmark datasets show improved imputation and forecasting performance of NCDSSM over existing models.
arXiv Detail & Related papers (2023-01-26T18:45:04Z)
- Quantum Persistent Homology [0.9023847175654603]
Persistent homology is a powerful mathematical tool that summarizes useful information about the shape of data.
We develop an efficient quantum computation of persistent Betti numbers, which track topological features of data across different scales.
Our approach employs a persistent Dirac operator whose square yields the persistent Laplacian, and in turn the underlying persistent Betti numbers.
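The connection the paper exploits — that the kernel of a Laplacian-type operator encodes Betti numbers — can be checked in a tiny classical setting: for a graph, dim ker L of the combinatorial Laplacian L = D - A equals the 0-th Betti number (the number of connected components). A minimal sketch, with hypothetical function names and exact rational arithmetic (none of the persistent Dirac machinery is reproduced here):

```python
from fractions import Fraction

def graph_laplacian(n, edges):
    """Combinatorial Laplacian L = D - A of an undirected graph on n vertices."""
    L = [[Fraction(0)] * n for _ in range(n)]
    for i, j in edges:
        L[i][i] += 1
        L[j][j] += 1
        L[i][j] -= 1
        L[j][i] -= 1
    return L

def nullity(M):
    """dim ker M, via exact Gaussian elimination over the rationals."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    rank = 0
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        M[rank], M[pivot] = M[pivot], M[rank]
        pv = M[rank][c]
        M[rank] = [x / pv for x in M[rank]]
        for r in range(rows):
            if r != rank and M[r][c] != 0:
                f = M[r][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[rank])]
        rank += 1
    return cols - rank

# Graph with two connected components: {0, 1, 2} and {3, 4}.
L = graph_laplacian(5, [(0, 1), (1, 2), (3, 4)])
print(nullity(L))  # → 2, the number of components (0-th Betti number)
```

The persistent Laplacian generalizes this to pairs of complexes across scales; its kernel dimension gives the persistent Betti numbers that the quantum algorithm above targets.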
arXiv Detail & Related papers (2022-02-25T20:52:03Z)
- Moment evolution equations and moment matching for stochastic image EPDiff [68.97335984455059]
Models of image deformation allow study of time-continuous effects transforming images by deforming the image domain.
Applications include medical image analysis with both population trends and random subject specific variation.
We use moment approximations of the corresponding Ito diffusion to construct estimators for statistical inference of the parameters in the full model.
arXiv Detail & Related papers (2021-10-07T11:08:11Z)
- Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
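The core step behind any dynamic mode decomposition variant is fitting a linear operator A to snapshot pairs, Y = A X. A toy exact sketch for the noise-free 2x2 case (not the paper's stochastically forced ensemble method; function names are hypothetical):

```python
from fractions import Fraction

def fit_linear_operator_2x2(X, Y):
    """Recover A with Y = A X for 2x2 snapshot matrices: A = Y X^{-1}.
    Columns of X are states x_0, x_1; columns of Y are their successors x_1, x_2."""
    a, b = Fraction(X[0][0]), Fraction(X[0][1])
    c, d = Fraction(X[1][0]), Fraction(X[1][1])
    det = a * d - b * c  # assumes X is invertible
    Xinv = [[d / det, -b / det], [-c / det, a / det]]
    return [[sum(Y[i][k] * Xinv[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# States of the shear map A = [[1, 1], [0, 1]] starting from x_0 = (0, 1):
# x_1 = A x_0 = (1, 1), then x_2 = A x_1 = (2, 1).
X = [[0, 1], [1, 1]]   # columns x_0, x_1
Y = [[1, 2], [1, 1]]   # columns x_1, x_2
A = fit_linear_operator_2x2(X, Y)
print(A == [[1, 1], [0, 1]])  # → True: the shear map is recovered exactly
```

In practice X is tall and rank-deficient, so DMD uses a least-squares pseudoinverse after an SVD truncation; the ensemble method above additionally models a stochastic forcing term on top of the intrinsic linear dynamics.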
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)
- Variational Conditional Dependence Hidden Markov Models for Skeleton-Based Action Recognition [7.9603223299524535]
This paper revisits conventional sequential modeling approaches, aiming to address the problem of capturing time-varying temporal dependency patterns.
We propose a different formulation of HMMs, whereby the dependence on past frames is dynamically inferred from the data.
We derive a tractable inference algorithm based on the forward-backward algorithm.
arXiv Detail & Related papers (2020-02-13T23:18:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.