SPROCKET: Extending ROCKET to Distance-Based Time-Series Transformations With Prototypes
- URL: http://arxiv.org/abs/2512.08246v1
- Date: Tue, 09 Dec 2025 05:00:01 GMT
- Title: SPROCKET: Extending ROCKET to Distance-Based Time-Series Transformations With Prototypes
- Authors: Nicholas Harner
- Abstract summary: SPROCKET implements a new feature engineering strategy based on prototypes. On a majority of the UCR and UEA Time Series Classification archives, SPROCKET achieves performance comparable to existing convolutional algorithms.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Classical Time Series Classification algorithms are dominated by feature engineering strategies. One of the most prominent of these transforms is ROCKET, which achieves strong performance through random kernel features. We introduce SPROCKET (Selected Prototype Random Convolutional Kernel Transform), which implements a new feature engineering strategy based on prototypes. On a majority of the UCR and UEA Time Series Classification archives, SPROCKET achieves performance comparable to existing convolutional algorithms, and the average accuracy ranking of the new MR-HY-SP (MultiROCKET-HYDRA-SPROCKET) ensemble exceeds that of HYDRA-MR, the previous best convolutional ensemble. These experimental results demonstrate that prototype-based feature transformation can enhance both accuracy and robustness in time series classification.
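As a rough illustration of the ROCKET-style random kernel features the abstract builds on (a minimal sketch, not the authors' implementation; kernel lengths, pooling choices, and the `random_kernel_features` helper are assumptions for demonstration), each random kernel is convolved with the series and its output is pooled into summary statistics such as the global max and PPV (proportion of positive values):

```python
import numpy as np

def random_kernel_features(series, n_kernels=100, seed=0):
    """Toy ROCKET-style transform: convolve a 1-D series with random
    kernels and pool each output into two features (max and PPV)."""
    rng = np.random.default_rng(seed)
    features = []
    for _ in range(n_kernels):
        length = rng.choice([7, 9, 11])
        weights = rng.normal(0.0, 1.0, length)
        weights -= weights.mean()          # zero-mean weights, as in ROCKET
        bias = rng.uniform(-1.0, 1.0)
        conv = np.convolve(series, weights, mode="valid") + bias
        features.append(conv.max())        # global max pooling
        features.append((conv > 0).mean()) # PPV: proportion of positive values
    return np.array(features)

x = np.sin(np.linspace(0, 6 * np.pi, 150))
feats = random_kernel_features(x)
print(feats.shape)  # (200,) -- two features per kernel
```

The pooled feature vector is then fed to a simple linear classifier (a ridge classifier in the ROCKET papers); SPROCKET replaces the random-kernel stage with prototype-based distances while keeping the same overall pipeline shape.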
Related papers
- PRISM: Parallel Residual Iterative Sequence Model [52.26239951489612]
We propose PRISM (Parallel Residual Iterative Sequence Model) to resolve this tension. PRISM introduces a solver-inspired inductive bias that captures key structural properties of multi-step refinement in a parallelizable form. We prove that this formulation achieves Rank-$L$ accumulation, structurally expanding the update manifold beyond the single-step Rank-$1$ bottleneck.
arXiv Detail & Related papers (2026-02-11T12:39:41Z) - HIT-ROCKET: Hadamard-vector Inner-product Transformer for ROCKET [0.039089069256361735]
Time series classification holds broad application value in communications, information countermeasures, finance, and medicine. State-of-the-art (SOTA) methods exhibit high computational complexity, coupled with lengthy parameter tuning and training cycles. We propose a feature extraction approach based on the Hadamard convolutional transform, utilizing column or row vectors of Hadamard matrices as convolution kernels with extended lengths of varying sizes.
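The core idea above, using the ±1-valued rows of a Hadamard matrix as fixed convolution kernels, can be sketched as follows (a hypothetical illustration, not the HIT-ROCKET code; the Sylvester construction and the max/PPV pooling are assumptions carried over from the ROCKET family):

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def hadamard_kernel_features(series, order=8):
    """Convolve a series with each row of a Hadamard matrix
    and pool each output with max and PPV."""
    feats = []
    for kernel in hadamard(order):
        conv = np.convolve(series, kernel, mode="valid")
        feats.append(conv.max())
        feats.append((conv > 0).mean())
    return np.array(feats)

x = np.r_[np.zeros(20), np.ones(20), np.zeros(20)]
feats = hadamard_kernel_features(x)
print(feats.shape)  # (16,) -- two features per Hadamard row
```

Because the kernel entries are all ±1, the convolutions reduce to additions and subtractions, which is where the claimed speedup over randomly weighted kernels would come from.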
arXiv Detail & Related papers (2025-11-03T13:39:40Z) - Time series classification with random convolution kernels: pooling operators and input representations matter [0.0]
This article presents a new approach based on MiniRocket, called SelF-Rocket, for fast time series classification (TSC). It dynamically selects the best pair of input representation and pooling operator during the training process. It achieves state-of-the-art accuracy on the University of California Riverside (UCR) benchmark datasets.
arXiv Detail & Related papers (2024-09-02T09:42:17Z) - PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting [82.03373838627606]
Self-attention mechanism in Transformer architecture requires positional embeddings to encode temporal order in time series prediction.
We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences.
We present a model integrating PRE with a standard Transformer encoder, demonstrating state-of-the-art performance on various real-world datasets.
arXiv Detail & Related papers (2024-08-20T01:56:07Z) - A Refreshed Similarity-based Upsampler for Direct High-Ratio Feature Upsampling [54.05517338122698]
A popular similarity-based feature upsampling pipeline has been proposed, which utilizes a high-resolution feature as guidance. We propose an explicitly controllable query-key feature alignment from both semantic-aware and detail-aware perspectives. We develop a fine-grained neighbor selection strategy on HR features, which is simple yet effective for alleviating mosaic artifacts.
arXiv Detail & Related papers (2024-07-02T14:12:21Z) - Spectraformer: A Unified Random Feature Framework for Transformer [6.539275954677546]
We introduce Spectraformer, a unified framework for approximating and learning the kernel function in the attention mechanism of the Transformer. Our empirical results demonstrate, for the first time, that a random feature-based approach can achieve performance comparable to top-performing sparse and low-rank methods.
arXiv Detail & Related papers (2024-05-24T07:52:53Z) - POCKET: Pruning Random Convolution Kernels for Time Series Classification from a Feature Selection Perspective [8.359327841946852]
A time series classification model, POCKET, is designed to efficiently prune redundant kernels.
Experimental results on diverse time series datasets show that POCKET prunes up to 60% of kernels without a significant reduction in accuracy and performs 11$\times$ faster than its counterparts.
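One common way to prune kernels from a feature-selection perspective, sketched here as a toy example (not POCKET's actual algorithm, which uses a more principled group-wise formulation; the ridge fit, scoring rule, and `prune_kernels` helper are assumptions), is to fit a linear model on the pooled features and keep only the kernels whose features carry the largest weights:

```python
import numpy as np

def prune_kernels(X, y, n_kernels, keep_frac=0.4, lam=1.0):
    """Toy feature-selection pruning: fit a ridge model on kernel features
    (features grouped per kernel), score each kernel by the summed magnitude
    of its weights, and keep the top fraction of kernels."""
    d = X.shape[1]
    # closed-form ridge regression: w = (X^T X + lam I)^-1 X^T y
    w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
    scores = np.abs(w).reshape(n_kernels, -1).sum(axis=1)  # one score per kernel
    keep = np.argsort(scores)[::-1][: int(keep_frac * n_kernels)]
    return np.sort(keep)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))             # 50 series, 10 kernels x 2 features
y = X[:, 0] + 0.1 * rng.normal(size=50)   # label depends mainly on kernel 0
kept = prune_kernels(X, y, n_kernels=10)
print(kept)  # kernel 0 (the informative one) should be among those kept
```

Dropping a kernel removes all of its pooled features at once, so the transform itself gets cheaper at inference time, which is the source of the reported speedup.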
arXiv Detail & Related papers (2023-09-15T16:03:23Z) - Sorted Convolutional Network for Achieving Continuous Rotational Invariance [56.42518353373004]
We propose a Sorting Convolution (SC) inspired by some hand-crafted features of texture images.
SC achieves continuous rotational invariance without requiring additional learnable parameters or data augmentation.
Our results demonstrate that SC achieves the best performance in the aforementioned tasks.
arXiv Detail & Related papers (2023-05-23T18:37:07Z) - Deciphering RNA Secondary Structure Prediction: A Probabilistic K-Rook Matching Perspective [63.3632827588974]
We introduce RFold, a method that learns to predict the best-matching K-Rook solution for a given sequence.
RFold achieves competitive performance and about eight times faster inference efficiency than state-of-the-art approaches.
arXiv Detail & Related papers (2022-12-02T16:34:56Z) - Joint Spatial-Temporal and Appearance Modeling with Transformer for Multiple Object Tracking [59.79252390626194]
We propose a novel solution named TransSTAM, which leverages Transformer to model both the appearance features of each object and the spatial-temporal relationships among objects.
The proposed method is evaluated on multiple public benchmarks including MOT16, MOT17, and MOT20, and it achieves a clear performance improvement in both IDF1 and HOTA.
arXiv Detail & Related papers (2022-05-31T01:19:18Z) - MultiRocket: Effective summary statistics for convolutional outputs in time series classification [5.857382887020592]
We show that it is possible to significantly improve the accuracy of MiniRocket (and Rocket).
By expanding the set of features produced by the transform, we make MultiRocket (for MiniRocket with Multiple Features) the single most accurate method on the datasets in the UCR archive.
arXiv Detail & Related papers (2021-01-31T14:04:10Z) - MINIROCKET: A Very Fast (Almost) Deterministic Transform for Time Series Classification [5.519586522442065]
ROCKET achieves state-of-the-art accuracy with a fraction of the computational expense of most existing methods.
We reformulate ROCKET into a new method, MINIROCKET, making it up to 75 times faster on larger datasets.
It is possible to train and test a classifier on all 109 datasets from the UCR archive to state-of-the-art accuracy in less than 10 minutes.
arXiv Detail & Related papers (2020-12-16T08:24:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.