COTN: A Chaotic Oscillatory Transformer Network for Complex Volatile Systems under Extreme Conditions
- URL: http://arxiv.org/abs/2511.06273v1
- Date: Sun, 09 Nov 2025 08:17:19 GMT
- Title: COTN: A Chaotic Oscillatory Transformer Network for Complex Volatile Systems under Extreme Conditions
- Authors: Boyan Tang, Yilong Zeng, Xuanhao Ren, Peng Xiao, Yuhan Zhao, Raymond Lee, Jianghua Wu
- Abstract summary: Accurate prediction of financial and electricity markets, especially under extreme conditions, remains a significant challenge. We propose the Chaotic Oscillatory Transformer Network (COTN), which combines a Transformer architecture with a novel Lee Oscillator activation function. COTN incorporates an Autoencoder Self-Regressive (ASR) module to detect and isolate abnormal market patterns. Our approach outperforms state-of-the-art deep learning models like Informer by up to 17% and traditional statistical methods like GARCH by as much as 40%.
- Score: 4.606846373731374
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Accurate prediction of financial and electricity markets, especially under extreme conditions, remains a significant challenge due to their intrinsic nonlinearity, rapid fluctuations, and chaotic patterns. To address these limitations, we propose the Chaotic Oscillatory Transformer Network (COTN). COTN innovatively combines a Transformer architecture with a novel Lee Oscillator activation function, processed through Max-over-Time pooling and a lambda-gating mechanism. This design is specifically tailored to effectively capture chaotic dynamics and improve responsiveness during periods of heightened volatility, where conventional activation functions (e.g., ReLU, GELU) tend to saturate. Furthermore, COTN incorporates an Autoencoder Self-Regressive (ASR) module to detect and isolate abnormal market patterns, such as sudden price spikes or crashes, thereby preventing corruption of the core prediction process and enhancing robustness. Extensive experiments across electricity spot markets and financial markets demonstrate the practical applicability and resilience of COTN. Our approach outperforms state-of-the-art deep learning models like Informer by up to 17% and traditional statistical methods like GARCH by as much as 40%. These results underscore COTN's effectiveness in navigating real-world market uncertainty and complexity, offering a powerful tool for forecasting highly volatile systems under duress.
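The abstract describes the core COTN activation pipeline: a chaotic Lee Oscillator response, Max-over-Time pooling over its trajectory, and a lambda gate blending it with a conventional saturating activation. The paper's exact oscillator parameters and gating formula are not given here, so the following is a minimal illustrative sketch under assumed dynamics (`lee_oscillator`, `cotn_activation`, and all constants are hypothetical), not the authors' implementation:

```python
import numpy as np

def lee_oscillator(x, steps=60, a1=5.0, a2=5.0, b1=5.0, b2=5.0, k=50.0):
    """Sketch of a Lee-Oscillator-style chaotic response to a scalar input x.

    Coupled excitatory (u) / inhibitory (v) tanh dynamics are driven by x;
    the chaotic component is damped toward tanh(x) as |x| grows.
    All parameter values here are illustrative assumptions.
    """
    u, v = 0.1, 0.1
    trajectory = []
    for _ in range(steps):
        u_next = np.tanh(a1 * u - a2 * v + x)   # excitatory neuron
        v_next = np.tanh(b1 * u - b2 * v)       # inhibitory neuron
        # Chaotic term fades for large |x|, leaving a tanh-like envelope.
        z = (u_next - v_next) * np.exp(-k * x * x) + np.tanh(x)
        u, v = u_next, v_next
        trajectory.append(z)
    return np.array(trajectory)

def cotn_activation(x, lam=0.5):
    """Max-over-Time pooling over the oscillator trajectory,
    lambda-gated against a conventional tanh response."""
    pooled = lee_oscillator(x).max()
    return lam * pooled + (1.0 - lam) * np.tanh(x)
```

The intuition this sketch captures: near zero input the oscillator contributes rich, non-saturating dynamics (useful in volatile regimes), while for large inputs the activation degrades gracefully to a standard bounded nonlinearity.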
Related papers
- A Hybrid Autoencoder-Transformer Model for Robust Day-Ahead Electricity Price Forecasting under Extreme Conditions [2.2360725546624267]
This paper proposes a novel hybrid deep learning framework that integrates a Distilled Attention Transformer model and an Autoencoder Self-regression Model. Experiments on datasets sampled from California and Shandong Province demonstrate that our framework significantly outperforms state-of-the-art methods in prediction accuracy, robustness, and computational efficiency.
arXiv Detail & Related papers (2025-11-10T09:47:24Z)
- Encoder Decoder Generative Adversarial Network Model for Stock Market Prediction [0.0]
We propose a GRU-based Encoder-Decoder GAN (EDGAN) model that strikes a balance between expressive power and simplicity. Experiments on diverse stock datasets demonstrate that EDGAN achieves superior forecasting accuracy and training stability, even in volatile markets.
arXiv Detail & Related papers (2025-10-12T13:57:36Z)
- PowerGrow: Feasible Co-Growth of Structures and Dynamics for Power Grid Synthesis [75.14189839277928]
We present PowerGrow, a co-generative framework that significantly reduces computational overhead while maintaining operational validity. Experiments across benchmark settings show that PowerGrow outperforms prior diffusion models in fidelity and diversity. This demonstrates its ability to generate operationally valid and realistic power grid scenarios.
arXiv Detail & Related papers (2025-08-29T01:47:27Z)
- TensorHyper-VQC: A Tensor-Train-Guided Hypernetwork for Robust and Scalable Variational Quantum Computing [50.95799256262098]
We introduce TensorHyper-VQC, a novel tensor-train (TT)-guided hypernetwork framework for quantum machine learning. Our framework delegates the generation of quantum circuit parameters to a classical TT network, effectively decoupling optimization from quantum hardware. These results position TensorHyper-VQC as a scalable and noise-resilient framework for advancing practical quantum machine learning on near-term devices.
arXiv Detail & Related papers (2025-08-01T23:37:55Z)
- Optimizing Multi-Tier Supply Chain Ordering with LNN+XGBoost: Mitigating the Bullwhip Effect [0.0]
This study introduces a hybrid LNN and XGBoost model to optimize ordering strategies in multi-tier supply chains. By leveraging LNN's dynamic feature extraction and XGBoost's global optimization capabilities, the model aims to mitigate the bullwhip effect and enhance cumulative profitability.
arXiv Detail & Related papers (2025-07-28T23:24:54Z)
- Spark Transformer: Reactivating Sparsity in FFN and Attention [53.221448818147024]
We introduce Spark Transformer, a novel architecture that achieves a high level of activation sparsity in both the FFN and the attention mechanism. This sparsity translates to a 2.5x reduction in FLOPs, leading to decoding wall-time speedups of up to 1.79x on CPU and 1.40x on GPU.
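Spark Transformer's specific sparsification mechanism is not detailed in this summary; as a rough illustration of how activation sparsity cuts FLOPs, here is a generic top-k magnitude sparsifier (the function name and the choice of top-k selection are assumptions, not the paper's method):

```python
import numpy as np

def topk_sparsify(h, k):
    """Keep the k largest-magnitude activations per row, zero the rest.

    Downstream matmuls then only need the k active entries per row,
    which is where the FLOP reduction comes from.
    """
    # Indices of the k largest |h| values along the last axis.
    idx = np.argpartition(np.abs(h), -k, axis=-1)[..., -k:]
    mask = np.zeros(h.shape, dtype=bool)
    np.put_along_axis(mask, idx, True, axis=-1)
    return np.where(mask, h, 0.0)
```

For example, keeping 4 of 16 FFN activations per token yields 75% sparsity in the subsequent projection.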
arXiv Detail & Related papers (2025-06-07T03:51:13Z)
- Communication-Efficient Federated Learning by Quantized Variance Reduction for Heterogeneous Wireless Edge Networks [55.467288506826755]
Federated learning (FL) has been recognized as a viable solution for local-privacy-aware collaborative model training in wireless edge networks. Most existing communication-efficient FL algorithms fail to reduce the significant inter-device variance. We propose a novel communication-efficient FL algorithm, named FedQVR, which relies on a sophisticated variance-reduced scheme.
arXiv Detail & Related papers (2025-01-20T04:26:21Z)
- Towards Resource-Efficient Federated Learning in Industrial IoT for Multivariate Time Series Analysis [50.18156030818883]
Anomaly and missing data constitute a thorny problem in industrial applications.
Deep learning enabled anomaly detection has emerged as a critical direction.
The data collected on edge devices contain privacy-sensitive user information.
arXiv Detail & Related papers (2024-11-06T15:38:31Z)
- Retentive Neural Quantum States: Efficient Ansätze for Ab Initio Quantum Chemistry [10.423935999935315]
We explore the use of the retentive network (RetNet) as an ansatz for solving electronic ground state problems in quantum chemistry.
We show that RetNet overcomes the Transformer's inference-time complexity bottleneck by processing data in parallel during training, and recurrently during inference.
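The parallel-training / recurrent-inference duality mentioned above can be demonstrated with a simplified decay-weighted retention operator (this is a bare-bones sketch of the mechanism's structure, not the RetNet ansatz itself; the function names and the scalar decay `gamma` are assumptions):

```python
import numpy as np

def retention_parallel(Q, K, V, gamma):
    """Full-sequence form, as used during training: causal decay mask D."""
    T = Q.shape[0]
    n = np.arange(T)
    # D[i, j] = gamma**(i - j) for j <= i, else 0 (causal masking).
    D = np.where(n[:, None] >= n[None, :],
                 gamma ** (n[:, None] - n[None, :]), 0.0)
    return (Q @ K.T * D) @ V

def retention_recurrent(Q, K, V, gamma):
    """Step-by-step form, as used during inference: constant state per step."""
    S = np.zeros((Q.shape[1], V.shape[1]))   # running key-value state
    out = []
    for q, k, v in zip(Q, K, V):
        S = gamma * S + np.outer(k, v)       # decay old state, add new pair
        out.append(q @ S)
    return np.stack(out)
```

Both forms compute the same outputs; the recurrent one trades full-sequence parallelism for constant memory per generated step.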
arXiv Detail & Related papers (2024-11-06T13:24:34Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity through the self-attention mechanism, albeit at high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Addressing Tactic Volatility in Self-Adaptive Systems Using Evolved Recurrent Neural Networks and Uncertainty Reduction Tactics [6.942025710859187]
Self-adaptive systems frequently use tactics to perform adaptations.
Tactic volatility occurs in real-world systems and is defined as variable behavior in the attributes of a tactic.
We propose a Tactic Volatility Aware (TVA-E) process utilizing evolved Recurrent Neural Networks (eRNN) to provide accurate tactic predictions.
arXiv Detail & Related papers (2022-04-21T17:47:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.