Dynamic Multi-Network Mining of Tensor Time Series
- URL: http://arxiv.org/abs/2402.11773v2
- Date: Thu, 22 Feb 2024 01:17:29 GMT
- Title: Dynamic Multi-Network Mining of Tensor Time Series
- Authors: Kohei Obata, Koki Kawabata, Yasuko Matsubara, Yasushi Sakurai
- Abstract summary: Subsequence clustering of time series is an essential task in data mining.
We present a new method, Dynamic Multi-network Mining (DMM).
Our method outperforms the state-of-the-art methods in terms of clustering accuracy.
- Score: 8.59982222642104
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Subsequence clustering of time series is an essential task in data mining,
and interpreting the resulting clusters is also crucial since we generally do
not have prior knowledge of the data. Thus, given a large collection of tensor
time series consisting of multiple modes, including timestamps, how can we
achieve subsequence clustering for tensor time series and provide interpretable
insights? In this paper, we propose a new method, Dynamic Multi-network Mining
(DMM), that converts a tensor time series into a set of segment groups of
various lengths (i.e., clusters) characterized by a dependency network
constrained with an l1-norm. Our method has the following properties. (a)
Interpretable: it characterizes the cluster with multiple networks, each of
which is a sparse dependency network of a corresponding non-temporal mode, and
thus provides visible and interpretable insights into the key relationships.
(b) Accurate: it discovers the clusters with distinct networks from tensor time
series according to the minimum description length (MDL). (c) Scalable: it
scales linearly in terms of the input data size when solving a non-convex
problem to optimize the number of segments and clusters, and thus it is
applicable to long-range and high-dimensional tensors. Extensive experiments
with synthetic datasets confirm that our method outperforms the
state-of-the-art methods in terms of clustering accuracy. We then use real
datasets to demonstrate that DMM is useful for providing interpretable insights
from tensor time series.
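As an illustrative aside (not the authors' released implementation), the sketch below shows one way to estimate an l1-sparse dependency network per non-temporal mode of a tensor time series segment, in the spirit of the cluster models described above. The function name `mode_networks`, the use of scikit-learn's GraphicalLasso, the unfolding scheme, and the `alpha` value are all assumptions made for illustration.

```python
# Illustrative sketch only, not the paper's implementation: estimate one
# l1-sparse dependency network (inverse covariance) per non-temporal mode
# of a 3-mode tensor time-series segment.
import numpy as np
from sklearn.covariance import GraphicalLasso  # l1-regularized precision estimation

def mode_networks(segment, alpha=0.1):
    """segment: array of shape (T, d1, d2) = (timestamps, mode-1, mode-2).
    Returns one sparse precision matrix per non-temporal mode."""
    networks = []
    for mode in (1, 2):
        d = segment.shape[mode]
        # Unfold: the chosen mode becomes the variable axis; all other
        # entries are treated as observations of those d variables.
        X = np.moveaxis(segment, mode, -1).reshape(-1, d)
        gl = GraphicalLasso(alpha=alpha).fit(X)
        networks.append(gl.precision_)  # sparse dependency network
    return networks

# Toy usage: a random segment with 100 timestamps and 5 x 4 variables.
rng = np.random.default_rng(0)
nets = mode_networks(rng.standard_normal((100, 5, 4)))
print([n.shape for n in nets])  # [(5, 5), (4, 4)]
```

In the method itself, such per-segment networks are compared under the MDL criterion to decide segment boundaries and cluster assignments; that search is outside the scope of this sketch.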
Related papers
- Correlating Time Series with Interpretable Convolutional Kernels [18.77493756204539]
This study addresses the problem of convolutional kernel learning in time series data.
We use tensor computations to reformulate the convolutional kernel learning problem in tensor form.
This study lays an insightful foundation for automatically learning convolutional kernels from time series data.
arXiv Detail & Related papers (2024-09-02T16:29:21Z) - MTS2Graph: Interpretable Multivariate Time Series Classification with Temporal Evolving Graphs [1.1756822700775666]
We introduce a new framework for interpreting time series data by extracting and clustering the input representative patterns.
We run experiments on eight datasets of the UCR/UEA archive, along with HAR and PAM datasets.
arXiv Detail & Related papers (2023-06-06T16:24:27Z) - Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in domains such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can then be used to characterize different types of time series (a minimal visibility-graph sketch of this mapping appears after this list).
arXiv Detail & Related papers (2021-10-11T13:46:28Z) - Learnable Dynamic Temporal Pooling for Time Series Classification [22.931314501371805]
We present a dynamic temporal pooling (DTP) technique that reduces the temporal size of hidden representations by aggregating the features at the segment-level.
For the partition of a whole series into multiple segments, we utilize dynamic time warping (DTW) to align each time point in a temporal order with the prototypical features of the segments.
The DTP layer combined with a fully-connected layer helps to extract further discriminative features considering their temporal position within an input time series.
arXiv Detail & Related papers (2021-04-02T08:58:44Z) - Graph Gamma Process Generalized Linear Dynamical Systems [60.467040479276704]
We introduce graph gamma process (GGP) linear dynamical systems to model real multivariate time series.
For temporal pattern discovery, the latent representation under the model is used to decompose the time series into a parsimonious set of multivariate sub-sequences.
We use the generated random graph, whose number of nonzero-degree nodes is finite, to define both the sparsity pattern and dimension of the latent state transition matrix.
arXiv Detail & Related papers (2020-07-25T04:16:34Z) - Supervised Feature Subset Selection and Feature Ranking for Multivariate Time Series without Feature Extraction [78.84356269545157]
We introduce supervised feature ranking and feature subset selection algorithms for MTS classification.
Unlike most existing supervised/unsupervised feature selection algorithms for MTS, our techniques do not require a feature extraction step to generate a one-dimensional feature vector from the time series.
arXiv Detail & Related papers (2020-05-01T07:46:29Z) - Data Curves Clustering Using Common Patterns Detection [0.0]
Analyzing and clustering time series, or more generally any kind of curves, can be critical for many human activities.
A new Curves Clustering Using Common Patterns (3CP) methodology is introduced.
arXiv Detail & Related papers (2020-01-05T18:36:38Z)
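As a concrete illustration of the "time series mapped to complex networks" idea mentioned in the Novel Features entry above, here is a minimal natural-visibility-graph construction. The brute-force O(n^2) loop and the function name `visibility_graph` are illustrative assumptions, not code from that paper.

```python
# Minimal natural-visibility-graph construction (brute force), mapping a
# time series onto a graph whose nodes are time points. Sketch only.
import numpy as np

def visibility_graph(y):
    """Return the edge list of the natural visibility graph of series y:
    points a < b are linked if every intermediate point lies strictly
    below the line of sight between (a, y[a]) and (b, y[b])."""
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

# Toy usage: the node-degree sequence becomes a simple network feature.
series = np.array([1.0, 3.0, 2.0, 5.0, 1.0, 4.0])
edges = visibility_graph(series)
degree = np.bincount(np.array(edges).ravel(), minlength=len(series))
print(edges, degree)
```

Graph statistics such as the degree sequence or clustering coefficient can then serve as features that characterize different types of time series.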