Evo-TFS: Evolutionary Time-Frequency Domain-Based Synthetic Minority Oversampling Approach to Imbalanced Time Series Classification
- URL: http://arxiv.org/abs/2601.01150v1
- Date: Sat, 03 Jan 2026 10:38:17 GMT
- Title: Evo-TFS: Evolutionary Time-Frequency Domain-Based Synthetic Minority Oversampling Approach to Imbalanced Time Series Classification
- Authors: Wenbin Pei, Ruohao Dai, Bing Xue, Mengjie Zhang, Qiang Zhang, Yiu-Ming Cheung
- Abstract summary: Evo-TFS is a novel evolutionary oversampling method that integrates both time- and frequency-domain characteristics. In Evo-TFS, strongly typed genetic programming is employed to evolve diverse, high-quality time series. Experiments conducted on imbalanced time series datasets demonstrate that Evo-TFS outperforms existing oversampling methods.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series classification is a fundamental machine learning task with broad real-world applications. Although many deep learning methods have proven effective at learning from time-series data for classification, they were originally developed under the assumption of balanced data distributions. When the data distribution is uneven, these methods tend to ignore the minority class, which is typically of higher practical significance. Oversampling methods address this by generating minority-class samples, but their reliance on linear interpolation often hampers both the preservation of temporal dynamics and the generation of diverse samples. In this paper, we therefore propose Evo-TFS, a novel evolutionary oversampling method that integrates both time- and frequency-domain characteristics. In Evo-TFS, strongly typed genetic programming is employed to evolve diverse, high-quality time series, guided by a fitness function that incorporates both time-domain and frequency-domain characteristics. Experiments on imbalanced time series datasets demonstrate that Evo-TFS outperforms existing oversampling methods, significantly enhancing the performance of both time-domain and frequency-domain classifiers.
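The contrast the abstract draws, linear interpolation versus a fitness that combines time- and frequency-domain terms, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the MSE and spectral-magnitude distances, and the equal weighting are all assumptions.

```python
import numpy as np

def interpolate_oversample(x1, x2, alpha=0.5):
    """SMOTE-style linear interpolation between two minority-class series.

    This is the baseline approach the abstract criticizes: the synthetic
    series lies on the straight line between x1 and x2, which can blur
    temporal dynamics and limits sample diversity.
    """
    return x1 + alpha * (x2 - x1)

def time_frequency_fitness(candidate, reference):
    """Toy fitness mixing a time-domain and a frequency-domain distance.

    Evo-TFS uses a fitness with both kinds of terms; the exact form here
    (MSE plus L1 distance between rFFT magnitudes, equally weighted) is
    an illustrative assumption.
    """
    time_term = np.mean((candidate - reference) ** 2)
    freq_term = np.mean(np.abs(np.abs(np.fft.rfft(candidate))
                               - np.abs(np.fft.rfft(reference))))
    return time_term + freq_term
```

A genetic-programming loop would then evolve candidate series and retain those with low fitness (close to real minority samples in both domains) while maintaining population diversity.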
Related papers
- FusAD: Time-Frequency Fusion with Adaptive Denoising for General Time Series Analysis [92.23551599659186]
Time series analysis plays a vital role in fields such as finance, healthcare, industry, and meteorology. FusAD is a unified analysis framework designed for diverse time series tasks.
arXiv Detail & Related papers (2025-12-16T04:34:27Z) - UniDiff: A Unified Diffusion Framework for Multimodal Time Series Forecasting [90.47915032778366]
We propose UniDiff, a unified diffusion framework for multimodal time series forecasting. At its core lies a unified and parallel fusion module, where a single cross-attention mechanism integrates structural information from timestamps and semantic context from texts. Experiments on real-world benchmark datasets across eight domains demonstrate that the proposed UniDiff model achieves state-of-the-art performance.
arXiv Detail & Related papers (2025-12-08T05:36:14Z) - A Unified Frequency Domain Decomposition Framework for Interpretable and Robust Time Series Forecasting [81.73338008264115]
Current approaches for time series forecasting, whether in the time or frequency domain, predominantly use deep learning models based on linear layers or transformers. We propose FIRE, a unified frequency domain decomposition framework that provides a mathematical abstraction for diverse types of time series. FIRE consistently outperforms state-of-the-art models on long-term forecasting benchmarks.
arXiv Detail & Related papers (2025-10-11T09:59:25Z) - FreRA: A Frequency-Refined Augmentation for Contrastive Learning on Time Series Classification [56.925103708982164]
We present a novel perspective from the frequency domain and identify three advantages for downstream classification: global, independent, and compact. We propose the lightweight yet effective Frequency Refined Augmentation (FreRA) tailored for time series contrastive learning on classification tasks. FreRA consistently outperforms ten leading baselines on time series classification, anomaly detection, and transfer learning tasks.
arXiv Detail & Related papers (2025-05-29T07:18:28Z) - Bridging Distribution Gaps in Time Series Foundation Model Pretraining with Prototype-Guided Normalization [29.082583523943157]
We propose a domain-aware adaptive normalization strategy within the Transformer architecture. We replace the traditional LayerNorm with a prototype-guided dynamic normalization mechanism (ProtoNorm). Our method significantly outperforms conventional pretraining techniques across both classification and forecasting tasks.
arXiv Detail & Related papers (2025-04-15T06:23:00Z) - VSFormer: Value and Shape-Aware Transformer with Prior-Enhanced Self-Attention for Multivariate Time Series Classification [47.92529531621406]
We propose a novel method, VSFormer, that incorporates both discriminative patterns (shape) and numerical information (value). In addition, we extract class-specific prior information derived from supervised information to enrich the positional encoding. Extensive experiments on all 30 UEA archived datasets demonstrate the superior performance of our method compared to SOTA models.
arXiv Detail & Related papers (2024-12-21T07:31:22Z) - CLeaRForecast: Contrastive Learning of High-Purity Representations for Time Series Forecasting [2.5816901096123863]
Time series forecasting (TSF) holds significant importance in modern society, spanning numerous domains.
Previous representation learning-based TSF algorithms typically embrace a contrastive learning paradigm featuring segregated trend-periodicity representations.
We propose CLeaRForecast, a novel contrastive learning framework to learn high-purity time series representations with proposed sample, feature, and architecture purifying methods.
arXiv Detail & Related papers (2023-12-10T04:37:43Z) - Towards Diverse and Coherent Augmentation for Time-Series Forecasting [22.213927377926804]
Time-series data augmentations mitigate the issue of insufficient training data for deep learning models.
We propose to combine Spectral and Time Augmentation for generating more diverse and coherent samples.
Experiments on five real-world time-series datasets demonstrate that STAug outperforms the base models without data augmentation.
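Combining spectral and time-domain augmentation, as this entry describes, can be sketched as below. The function names and the magnitude-perturbation scheme are illustrative assumptions, not STAug's actual algorithm:

```python
import numpy as np

def spectral_augment(x, scale=0.1, rng=None):
    """Perturb rFFT magnitudes, keep phases, and invert to the time domain."""
    rng = rng if rng is not None else np.random.default_rng(0)
    spec = np.fft.rfft(x)
    mag, phase = np.abs(spec), np.angle(spec)
    # Multiplicative Gaussian jitter on each frequency bin's magnitude.
    mag = mag * (1.0 + rng.normal(0.0, scale, size=mag.shape))
    return np.fft.irfft(mag * np.exp(1j * phase), n=len(x))

def time_mixup(x1, x2, lam=0.5):
    """Convex combination of two series in the time domain."""
    return lam * x1 + (1.0 - lam) * x2
```

Perturbing magnitudes while preserving phases keeps the temporal alignment of events intact, which is one plausible reason frequency-side augmentation produces coherent samples.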
arXiv Detail & Related papers (2023-03-24T19:40:34Z) - Neural Ordinary Differential Equation Model for Evolutionary Subspace Clustering and Its Applications [36.700813256689656]
We propose a neural ODE model for evolutionary subspace clustering to overcome this limitation.
We demonstrate that this method can not only interpolate data at any time step for the evolutionary subspace clustering task, but also achieve higher accuracy than other state-of-the-art methods.
arXiv Detail & Related papers (2021-07-22T07:02:03Z) - Towards Synthetic Multivariate Time Series Generation for Flare Forecasting [5.098461305284216]
One of the limiting factors in training data-driven, rare-event prediction algorithms is the scarcity of the events of interest.
In this study, we explore the usefulness of the conditional generative adversarial network (CGAN) as a means to perform data-informed oversampling.
arXiv Detail & Related papers (2021-05-16T22:23:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.