Persistent Homology of Coarse Grained State Space Networks
- URL: http://arxiv.org/abs/2206.02530v2
- Date: Fri, 4 Aug 2023 18:40:53 GMT
- Title: Persistent Homology of Coarse Grained State Space Networks
- Authors: Audun D. Myers, Max M. Chumley, Firas A. Khasawneh, Elizabeth Munch
- Abstract summary: We use persistent homology from topological data analysis to study the structure of complex transitional networks.
We show that the CGSSN captures rich information about the dynamic state of the underlying dynamical system.
- Score: 1.7434507809930746
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work is dedicated to the topological analysis of complex transitional
networks for dynamic state detection. Transitional networks are formed from
time series data and they leverage graph theory tools to reveal information
about the underlying dynamic system. However, traditional tools can fail to
summarize the complex topology present in such graphs. In this work, we
leverage persistent homology from topological data analysis to study the
structure of these networks. We contrast dynamic state detection from time
series using a coarse-grained state-space network (CGSSN) and topological data
analysis (TDA) to two state-of-the-art approaches: ordinal partition networks
(OPNs) combined with TDA and the standard application of persistent homology to
the time-delay embedding of the signal. We show that the CGSSN captures rich
information about the dynamic state of the underlying dynamical system as
evidenced by a significant improvement in dynamic state detection and noise
robustness in comparison to OPNs. We also show that because the computational
time of CGSSN is not linearly dependent on the signal's length, it is more
computationally efficient than applying TDA to the time-delay embedding of the
time series.
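As a rough illustration of the pipeline described above, the sketch below delay-embeds a scalar signal, coarse-grains it on a uniform grid to form a CGSSN (nodes are occupied hypercubes, edges are observed transitions), and feeds the network's shortest-path distance matrix to persistent homology. This is a minimal sketch under an assumed reading of the method, not the authors' implementation; the grid resolution, delay, embedding dimension, and the use of ripser are illustrative choices.
```python
# Minimal sketch (not the paper's code): CGSSN + persistent homology.
# Assumed construction: coarse-grain the delay embedding on a uniform grid;
# nodes = occupied cells, edges = observed transitions between cells.
import numpy as np
import networkx as nx
from ripser import ripser  # pip install ripser

def delay_embed(x, d=3, tau=10):
    """Time-delay embedding of a 1-D signal into R^d."""
    n = len(x) - (d - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(d)])

def cgssn(points, n_bins=12):
    """Coarse-grained state space network over a uniform grid."""
    lo, hi = points.min(axis=0), points.max(axis=0)
    cells = np.floor((points - lo) / (hi - lo + 1e-12) * n_bins).astype(int)
    labels = list(map(tuple, cells))
    G = nx.Graph()
    G.add_nodes_from(set(labels))
    G.add_edges_from((a, b) for a, b in zip(labels, labels[1:]) if a != b)
    # keep the largest component so all shortest-path distances are finite
    return G.subgraph(max(nx.connected_components(G), key=len))

t = np.linspace(0, 40 * np.pi, 4000)
x = np.sin(t) + 0.5 * np.sin(3 * t)          # toy periodic signal

G = cgssn(delay_embed(x))
D = np.asarray(nx.floyd_warshall_numpy(G))   # all-pairs hop distances
dgms = ripser(D, distance_matrix=True, maxdim=1)["dgms"]
print("H0 classes:", len(dgms[0]), "| H1 classes:", len(dgms[1]))
```
Note that the number of occupied grid cells, not the signal length, sets the size of the distance matrix, which is the intuition behind the computational-cost claim in the abstract.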
Related papers
- DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
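For intuition only: a continuous state space model maintains a latent state driven by inputs, typically discretized as h_t = exp(dt*A) h_{t-1} + dt*B u_t. The toy recurrence below shows that backbone in isolation; it is a generic SSM step under assumed diagonal dynamics, not DyG-Mamba's architecture, and all parameters are illustrative.
```python
# Generic discretized state space step, h_t = exp(dt*A) h_{t-1} + dt*B u_t;
# a toy backbone for intuition, not DyG-Mamba itself.
import numpy as np

def ssm_scan(u, A_diag, B, C, dt=0.1):
    A_bar = np.exp(dt * A_diag)          # diagonal A -> elementwise exp
    h = np.zeros_like(A_diag)
    ys = []
    for u_t in u:
        h = A_bar * h + dt * B * u_t     # state update
        ys.append(C @ h)                 # linear readout
    return np.array(ys)

u = np.sin(np.linspace(0, 10, 100))      # toy input sequence
A = -np.array([0.5, 1.0, 2.0])           # stable (negative) decay rates
y = ssm_scan(u, A, B=np.ones(3), C=np.ones(3))
print(y[-1])
```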
arXiv Detail & Related papers (2024-08-13T15:21:46Z) - Higher-order Spatio-temporal Physics-incorporated Graph Neural Network for Multivariate Time Series Imputation [9.450743095412896]
Missing values are an essential but challenging issue due to the complex latent spatio-temporal correlations and the dynamic nature of time series.
We propose a Higher-order Spatio-temporal Physics-incorporated Graph Neural Network (HSPGNN) to address this problem.
HSPGNN provides better dynamic analysis and explanation than traditional data-driven models.
arXiv Detail & Related papers (2024-05-16T16:35:43Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - A Network Classification Method based on Density Time Evolution Patterns
Extracted from Network Automata [0.0]
We propose alternate sources of information to use as descriptors for the classification, which we denominate the density time-evolution pattern (D-TEP) and the state density time-evolution pattern (SD-TEP).
Our results show a significant improvement over previous studies on five synthetic network databases and seven real-world databases.
arXiv Detail & Related papers (2022-11-18T15:27:26Z) - Space-Time Graph Neural Networks with Stochastic Graph Perturbations [100.31591011966603]
Space-time graph neural networks (ST-GNNs) learn efficient graph representations of time-varying data.
In this paper, we revisit the properties of ST-GNNs and prove that they are stable to graph perturbations.
Our analysis suggests that ST-GNNs are suitable for transfer learning on time-varying graphs.
arXiv Detail & Related papers (2022-10-28T16:59:51Z) - Bayesian Inference of Stochastic Dynamical Networks [0.0]
This paper presents a novel method for learning network topology and internal dynamics.
Our method achieves state-of-the-art performance compared with group sparse Bayesian learning (GSBL), BINGO, kernel-based methods, dynGENIE3, GENIE3, and ARNI.
arXiv Detail & Related papers (2022-06-02T03:22:34Z) - Topological Signal Processing using the Weighted Ordinal Partition
Network [1.9594639581421422]
Topological data analysis (TDA) encodes information about the shape and structure of data.
The idea of utilizing tools from TDA for signal processing tasks, known as topological signal processing (TSP), has gained much attention in recent years.
In this paper, we take the next step: building a pipeline to analyze the weighted OPN with TDA.
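To make the OPN pipeline concrete: each delay vector is replaced by its ordinal (permutation) pattern; patterns become nodes and consecutive patterns become weighted edges. The sketch below is a minimal, assumed reading of a weighted OPN, with illustrative pattern length d and delay tau; it is not the paper's implementation.
```python
# Hedged sketch of a weighted ordinal partition network (OPN): map each
# delay window to its permutation pattern; patterns are nodes, consecutive
# patterns are edges, and edge weights count observed transitions.
import numpy as np
import networkx as nx

def ordinal_partition_network(x, d=4, tau=5):
    n = len(x) - (d - 1) * tau
    perms = [tuple(np.argsort(x[i : i + d * tau : tau])) for i in range(n)]
    G = nx.DiGraph()
    for a, b in zip(perms, perms[1:]):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1   # weighted OPN: transition counts
        else:
            G.add_edge(a, b, weight=1)
    return G

x = np.sin(np.linspace(0, 20 * np.pi, 2000))
G = ordinal_partition_network(x)
print(G.number_of_nodes(), "patterns,", G.number_of_edges(), "transitions")
```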
arXiv Detail & Related papers (2022-04-27T18:01:18Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce the space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
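As a generic illustration of the model family (an assumption about the flavor of layer, not the paper's exact architecture): a space-time graph filter combines powers of a graph shift operator S with time shifts of the signal, y_t = sigma(sum over k, s of h[k, s] * S^k x_{t-s}).
```python
# Generic space-time graph filter sketch; the graph, filter taps h, and
# sizes below are illustrative, not the paper's configuration.
import numpy as np

def st_graph_filter(X, S, h):
    """X: (T, N) node signals over time; S: (N, N) graph shift operator;
    h: (K, M) filter taps over K graph hops and M time lags."""
    K, M = h.shape
    T, N = X.shape
    Sk = [np.linalg.matrix_power(S, k) for k in range(K)]  # S^0..S^{K-1}
    Y = np.zeros((T, N))
    for t in range(T):
        acc = np.zeros(N)
        for k in range(K):
            for m in range(M):
                if t - m >= 0:
                    acc += h[k, m] * (Sk[k] @ X[t - m])
        Y[t] = np.tanh(acc)                                # pointwise nonlinearity
    return Y

N, T = 5, 20
A = (np.random.rand(N, N) < 0.4).astype(float)
S = np.triu(A, 1) + np.triu(A, 1).T                        # symmetric adjacency
X = np.random.randn(T, N)
Y = st_graph_filter(X, S, h=np.full((3, 2), 0.2))
print(Y.shape)  # (20, 5)
```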
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - Z-GCNETs: Time Zigzags at Graph Convolutional Networks for Time Series
Forecasting [3.9195417834390907]
We introduce the concept of zigzag persistence into time-aware graph convolutional networks (GCNs).
We develop a new topological summary, zigzag persistence image, and derive its theoretical stability guarantees.
Our results indicate that Z-GCNET outperforms 13 state-of-the-art methods on 4 time series datasets.
arXiv Detail & Related papers (2021-05-10T04:01:04Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
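Concretely, the LTC paper models each unit's state with an input-dependent time constant, roughly dx/dt = -x/tau + f(x, I)(A - x), so how fast the state relaxes depends on the input. Below is a single-neuron Euler-integration sketch of that form; the gate f and all constants are illustrative assumptions, not the paper's trained parameters.
```python
# Hedged sketch: Euler integration of a single liquid time-constant (LTC)
# style unit, dx/dt = -x/tau + f(x, I)*(A - x); the gate f makes the
# effective time constant input-dependent.
import numpy as np

def f(x, I, w=1.5, b=0.0):
    return 1.0 / (1.0 + np.exp(-(w * I + b)))  # sigmoid gate on the input

def ltc_step(x, I, tau=1.0, A=1.0, dt=0.01):
    gate = f(x, I)
    dx = -x / tau + gate * (A - x)
    return x + dt * dx

x, xs = 0.0, []
for t in np.arange(0, 10, 0.01):
    x = ltc_step(x, I=np.sin(t))
    xs.append(x)
print("final state:", round(xs[-1], 4))
```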
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.