Generating Hierarchical Structures for Improved Time Series
Classification Using Stochastic Splitting Functions
- URL: http://arxiv.org/abs/2309.11963v1
- Date: Thu, 21 Sep 2023 10:34:50 GMT
- Title: Generating Hierarchical Structures for Improved Time Series
Classification Using Stochastic Splitting Functions
- Authors: Celal Alagoz
- Abstract summary: This study introduces a novel hierarchical divisive clustering approach with stochastic splitting functions (SSFs) to enhance classification performance in multi-class datasets through hierarchical classification (HC).
The method has the unique capability of generating a hierarchy without requiring explicit information, making it suitable for datasets lacking prior knowledge of hierarchy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This study introduces a novel hierarchical divisive clustering approach with
stochastic splitting functions (SSFs) to enhance classification performance in
multi-class datasets through hierarchical classification (HC). The method has
the unique capability of generating hierarchy without requiring explicit
information, making it suitable for datasets lacking prior knowledge of
hierarchy. By systematically dividing classes into two subsets based on their
discriminability according to the classifier, the proposed approach constructs
a binary tree representation of hierarchical classes. The approach is evaluated
on 46 multi-class time series datasets using popular classifiers (svm and
rocket) and SSFs (potr, srtr, and lsoo). The results reveal that the approach
significantly improves classification performance in approximately half and a
third of the datasets when using rocket and svm as the classifier,
respectively. The study also explores the relationship between dataset features
and HC performance. While the number of classes and flat classification (FC)
score show consistent significance, variations are observed with different
splitting functions. Overall, the proposed approach presents a promising
strategy for enhancing classification by generating hierarchical structure in
multi-class time series datasets. Future research directions involve exploring
different splitting functions, classifiers, and hierarchy structures, as well
as applying the approach to diverse domains beyond time series data. The source
code is made openly available to facilitate reproducibility and further
exploration of the method.
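The splitting procedure described in the abstract (recursively dividing the class set in two according to how well the classifier separates the candidate subsets) can be illustrated with a short Python sketch. This is an assumption-laden illustration, not the authors' released implementation: the random candidate splits and the cross-validated SVM score below stand in for the paper's SSFs (potr, srtr, lsoo) and classifiers, and all function names are hypothetical.

```python
# Minimal sketch of hierarchical divisive clustering driven by a stochastic
# splitting function (SSF). Hypothetical illustration only: the random
# candidate splits and the CV-accuracy score are assumptions standing in
# for the paper's SSFs, not its released code.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def random_binary_splits(classes, n_candidates=10, rng=None):
    """Yield random two-way partitions of the class set (the stochastic part)."""
    rng = rng or np.random.default_rng(0)
    classes = list(classes)
    for _ in range(n_candidates):
        mask = rng.integers(0, 2, size=len(classes)).astype(bool)
        if mask.all() or not mask.any():            # skip degenerate splits
            continue
        left = {c for c, m in zip(classes, mask) if m}
        yield left, set(classes) - left


def split_score(X, y, left, right):
    """Discriminability of a split: CV accuracy of the binary task 'left vs right'."""
    keep = np.isin(y, list(left | right))
    y_bin = np.isin(y[keep], list(left)).astype(int)
    return cross_val_score(SVC(), X[keep], y_bin, cv=3).mean()


def build_class_tree(X, y, classes=None):
    """Recursively split the class set into a binary tree (divisive clustering)."""
    classes = set(np.unique(y)) if classes is None else set(classes)
    if len(classes) <= 1:
        return {"classes": classes}                 # leaf: a single class
    if len(classes) == 2:                           # only one possible split
        a, b = classes
        left, right = {a}, {b}
    else:
        left, right = max(random_binary_splits(classes),
                          key=lambda lr: split_score(X, y, *lr))
    return {"classes": classes,
            "left": build_class_tree(X, y, left),
            "right": build_class_tree(X, y, right)}
```

A hierarchical classifier would then fit one binary model per internal node of this tree and route a test sample from the root to a leaf; the SSFs and classifiers actually evaluated in the paper (svm and rocket) would plug into the candidate-generation and scoring steps above.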
Related papers
- Convolutional autoencoder-based multimodal one-class classification [80.52334952912808]
One-class classification refers to approaches of learning using data from a single class only.
We propose a deep learning one-class classification method suitable for multimodal data.
arXiv Detail & Related papers (2023-09-25T12:31:18Z)
- Hierarchical clustering with dot products recovers hidden tree structure [53.68551192799585]
In this paper we offer a new perspective on the well-established agglomerative clustering algorithm, focusing on recovery of hierarchical structure.
We recommend a simple variant of the standard algorithm, in which clusters are merged by maximum average dot product and not, for example, by minimum distance or within-cluster variance (a small sketch of this merge rule appears after this list).
We demonstrate that the tree output by this algorithm provides a bona fide estimate of generative hierarchical structure in data, under a generic probabilistic graphical model.
arXiv Detail & Related papers (2023-05-24T11:05:12Z)
- MHCCL: Masked Hierarchical Cluster-wise Contrastive Learning for Multivariate Time Series [20.008535430484475]
The Masked Hierarchical Cluster-wise Contrastive Learning (MHCCL) model is presented.
It exploits semantic information obtained from the hierarchical structure consisting of multiple latent partitions for time series.
It is shown to be superior to state-of-the-art approaches for unsupervised time series representation learning.
arXiv Detail & Related papers (2022-12-02T12:42:53Z)
- Contrastive Multi-view Hyperbolic Hierarchical Clustering [33.050054725595736]
We propose Contrastive Multi-view Hyperbolic Hierarchical Clustering (CMHHC).
It consists of three components, i.e., multi-view alignment learning, aligned feature similarity learning, and continuous hyperbolic hierarchical clustering.
Experimental results on five real-world datasets demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-05-05T12:56:55Z)
- Self-Supervised Class Incremental Learning [51.62542103481908]
Existing Class Incremental Learning (CIL) methods are based on a supervised classification framework sensitive to data labels.
When updated with new class data, they suffer from catastrophic forgetting: the model can no longer clearly discern old class data from the new.
In this paper, we explore the performance of Self-Supervised representation learning in Class Incremental Learning (SSCIL) for the first time.
arXiv Detail & Related papers (2021-11-18T06:58:19Z)
- CAC: A Clustering Based Framework for Classification [20.372627144885158]
We design a simple, efficient, and generic framework called Classification Aware Clustering (CAC).
Our experiments on synthetic and real benchmark datasets demonstrate the efficacy of CAC over previous methods for combined clustering and classification.
arXiv Detail & Related papers (2021-02-23T18:59:39Z)
- Inducing a hierarchy for multi-class classification problems [11.58041597483471]
In applications where categorical labels follow a natural hierarchy, classification methods that exploit the label structure often outperform those that do not.
In this paper, we investigate a class of methods that induce a hierarchy that can similarly improve classification performance over flat classifiers.
We demonstrate the effectiveness of the class of methods both for discovering a latent hierarchy and for improving accuracy in principled simulation settings and three real data applications.
arXiv Detail & Related papers (2021-02-20T05:40:42Z)
- Binary Classification from Multiple Unlabeled Datasets via Surrogate Set Classification [94.55805516167369]
We propose a new approach for binary classification from $m$ U-sets for $m \ge 2$.
Our key idea is to consider an auxiliary classification task called surrogate set classification (SSC).
arXiv Detail & Related papers (2021-02-01T07:36:38Z)
- Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
- Structured Graph Learning for Clustering and Semi-supervised Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and an adaptive neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
- Leveraging Class Hierarchies with Metric-Guided Prototype Learning [5.070542698701158]
In many classification tasks, the set of target classes can be organized into a hierarchy.
This structure induces a semantic distance between classes, which can be summarised in the form of a cost matrix (a toy sketch of deriving such a matrix from a class tree appears after this list).
We propose to model the hierarchical class structure by integrating this metric in the supervision of a prototypical network.
arXiv Detail & Related papers (2020-07-06T20:22:08Z)
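As referenced in the "Hierarchical clustering with dot products recovers hidden tree structure" entry above, the recommended variant merges clusters by maximum average dot product. The following is a minimal, deliberately brute-force sketch of that merge rule only, written as an assumption-laden illustration rather than the authors' implementation; the function names are hypothetical.

```python
# Toy agglomerative clustering in which clusters are merged by maximum
# average pairwise dot product (rather than minimum distance or
# within-cluster variance). Illustration only; not the referenced
# paper's code.
import numpy as np


def average_dot_product(X, a, b):
    """Mean pairwise dot product between the points indexed by a and b."""
    return float(np.mean(X[a] @ X[b].T))


def agglomerate_by_dot_product(X):
    """Repeatedly merge the pair of clusters with the largest average dot
    product; return the merge history, which encodes a binary tree."""
    clusters = {i: [i] for i in range(len(X))}
    history = []
    while len(clusters) > 1:
        i, j = max(((p, q) for p in clusters for q in clusters if p < q),
                   key=lambda pq: average_dot_product(X, clusters[pq[0]],
                                                      clusters[pq[1]]))
        clusters[i] = clusters[i] + clusters.pop(j)   # merge cluster j into i
        history.append((i, j))
    return history


# Tiny usage example on random data:
print(agglomerate_by_dot_product(np.random.default_rng(0).normal(size=(6, 3))))
```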
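As referenced in the "Leveraging Class Hierarchies with Metric-Guided Prototype Learning" entry, a class hierarchy induces a semantic distance that can be summarised as a cost matrix. The toy sketch below derives such a matrix as tree path length between classes; this particular metric and the helper names are assumptions for illustration, not necessarily the paper's definition.

```python
# Toy derivation of a class-to-class cost matrix from a class hierarchy,
# using tree path length between classes as the semantic distance.
# Assumed metric for illustration only.
import numpy as np


def ancestors(parent, c):
    """Chain from class c up to the root (inclusive)."""
    chain = [c]
    while parent.get(c) is not None:
        c = parent[c]
        chain.append(c)
    return chain


def hierarchy_cost_matrix(parent, classes):
    """D[i, j] = number of tree edges on the path between classes i and j."""
    depth = {c: len(ancestors(parent, c)) - 1 for c in parent}
    D = np.zeros((len(classes), len(classes)))
    for i, a in enumerate(classes):
        for j, b in enumerate(classes):
            lca = max(set(ancestors(parent, a)) & set(ancestors(parent, b)),
                      key=depth.get)                  # lowest common ancestor
            D[i, j] = (depth[a] - depth[lca]) + (depth[b] - depth[lca])
    return D


# Example hierarchy: root -> {animal, vehicle}; animal -> {cat, dog}; vehicle -> {car}
parent = {"root": None, "animal": "root", "vehicle": "root",
          "cat": "animal", "dog": "animal", "car": "vehicle"}
print(hierarchy_cost_matrix(parent, ["cat", "dog", "car"]))
```

Such a matrix can then weight errors during training so that, for example, confusing "cat" with "dog" costs less than confusing "cat" with "car".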
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented here and is not responsible for any consequences arising from its use.