Encoder Decoder Generative Adversarial Network Model for Stock Market Prediction
- URL: http://arxiv.org/abs/2510.10617v1
- Date: Sun, 12 Oct 2025 13:57:36 GMT
- Title: Encoder Decoder Generative Adversarial Network Model for Stock Market Prediction
- Authors: Bahadur Yadav, Sanjay Kumar Mohanty
- Abstract summary: We propose a GRU-based Encoder-Decoder GAN (EDGAN) model that strikes a balance between expressive power and simplicity. Experiments on diverse stock datasets demonstrate that EDGAN achieves superior forecasting accuracy and training stability, even in volatile markets.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Forecasting stock prices remains challenging due to the volatile and non-linear nature of financial markets. Despite the promise of deep learning, issues such as mode collapse, unstable training, and difficulty in capturing temporal and feature-level correlations have limited the application of GANs in this domain. We propose a GRU-based Encoder-Decoder GAN (EDGAN) model that strikes a balance between expressive power and simplicity. The model introduces key innovations such as a temporal decoder with residual connections for precise reconstruction, conditioning on static and dynamic covariates for contextual learning, and a windowing mechanism to capture temporal dynamics. Here, the generator uses a dense encoder-decoder framework with residual GRU blocks. Extensive experiments on diverse stock datasets demonstrate that EDGAN achieves superior forecasting accuracy and training stability, even in volatile markets. It consistently outperforms traditional GAN variants in forecasting accuracy and convergence stability under varying market conditions.
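The windowing mechanism mentioned in the abstract can be sketched as a sliding-window slicing of the price series, where each window becomes one training sample for the generator. The window length and stride below are illustrative choices, not values taken from the paper:

```python
import numpy as np

def make_windows(series, window=30, stride=1):
    """Slice a 1-D price series into overlapping windows.

    Each window becomes one conditioning/target sample for the
    generator; window length and stride here are illustrative
    choices, not values from the paper.
    """
    n = len(series)
    starts = range(0, n - window + 1, stride)
    return np.stack([series[s:s + window] for s in starts])

prices = np.linspace(100.0, 110.0, 40)   # toy price path
X = make_windows(prices, window=10, stride=5)
print(X.shape)   # (7, 10): 7 overlapping windows of length 10
```

In a GAN training loop each row of `X` would serve as a real sample (or conditioning context) for the discriminator and generator.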
Related papers
- StockBot 2.0: Vanilla LSTMs Outperform Transformer-based Forecasting for Stock Prices [0.0]
We present an enhanced StockBot architecture that systematically evaluates modern attention-based, convolutional, and recurrent time-series forecasting models. A carefully constructed vanilla LSTM consistently achieves superior predictive accuracy and more stable buy/sell decision-making.
arXiv Detail & Related papers (2026-01-01T04:09:51Z) - ASTIF: Adaptive Semantic-Temporal Integration for Cryptocurrency Price Forecasting [6.12055122337183]
ASTIF is a hybrid intelligent system that adapts its forecasting strategy in real time through confidence-based meta-learning. A confidence-aware meta-learner functions as an adaptive inference layer, modulating each predictor's contribution based on its real-time uncertainty. The research contributes a scalable, knowledge-based solution for fusing quantitative and qualitative data in non-stationary environments.
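The idea of modulating each predictor's contribution by its uncertainty can be illustrated with a generic confidence-weighted blend; this is a minimal sketch of confidence-based gating, not ASTIF's actual meta-learner:

```python
import numpy as np

def combine(preds, uncertainties, temperature=1.0):
    """Blend base predictions with softmax weights that shrink as a
    predictor's reported uncertainty grows. A generic sketch of
    confidence-based gating, not the paper's meta-learner."""
    preds = np.asarray(preds, dtype=float)
    u = np.asarray(uncertainties, dtype=float)
    logits = -u / temperature            # low uncertainty -> high logit
    w = np.exp(logits - logits.max())    # numerically stable softmax
    w /= w.sum()
    return float(np.dot(w, preds)), w

yhat, w = combine([101.0, 99.0], [0.1, 2.0])
# the low-uncertainty predictor dominates, so yhat is close to 101
```

The `temperature` knob controls how sharply the blend favours confident predictors; it is a hypothetical parameter added for the sketch.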
arXiv Detail & Related papers (2025-12-21T09:17:36Z) - COTN: A Chaotic Oscillatory Transformer Network for Complex Volatile Systems under Extreme Conditions [4.606846373731374]
Accurate prediction of financial and electricity markets, especially under extreme conditions, remains a significant challenge. We propose the Chaotic Oscillatory Transformer Network (COTN), which combines a Transformer architecture with a novel Lee activation function. COTN incorporates an Autoencoder Self-Regressive (ASR) module to detect and isolate abnormal market patterns. Our approach outperforms state-of-the-art deep learning models like Informer by up to 17% and traditional statistical methods like GARCH by as much as 17%.
arXiv Detail & Related papers (2025-11-09T08:17:19Z) - GARCH-Informed Neural Networks for Volatility Prediction in Financial Markets [0.0]
We present the GARCH-Informed Neural Network (GINN), a new hybrid deep learning model that captures and forecasts market volatility more accurately than either class of models is capable of on its own.
When compared to other time series models, GINN showed superior out-of-sample prediction performance in terms of the Coefficient of Determination ($R^2$), Mean Squared Error (MSE), and Mean Absolute Error (MAE).
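The three out-of-sample metrics named above have standard definitions, which can be computed directly; the toy numbers below are for illustration only:

```python
import numpy as np

def r2_mse_mae(y_true, y_pred):
    """Coefficient of determination R^2, mean squared error, and
    mean absolute error for an out-of-sample forecast."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    resid = y_true - y_pred
    mse = np.mean(resid ** 2)
    mae = np.mean(np.abs(resid))
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - np.sum(resid ** 2) / ss_tot
    return r2, mse, mae

r2, mse, mae = r2_mse_mae([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])
# r2 = 0.97, mse = 0.02, mae = 0.1333...
```

A perfect forecast gives $R^2 = 1$ and zero MSE/MAE; $R^2$ can be negative when the model is worse than predicting the mean.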
arXiv Detail & Related papers (2024-09-30T23:53:54Z) - Probabilistic Forecasting of Real-Time Electricity Market Signals via Interpretable Generative AI [41.99446024585741]
We present WIAE-GPF, a Weak Innovation AutoEncoder-based Generative Probabilistic Forecasting architecture.
A novel learning algorithm with structural convergence guarantees is proposed, ensuring that the generated forecast samples match the ground truth conditional probability distribution.
arXiv Detail & Related papers (2024-03-09T00:41:30Z) - Stockformer: A Price-Volume Factor Stock Selection Model Based on Wavelet Transform and Multi-Task Self-Attention Networks [3.7608255115473592]
This paper introduces Stockformer, a price-volume factor stock selection model that integrates wavelet transformation and a multitask self-attention network.
Stockformer decomposes stock returns into high and low frequencies, meticulously capturing long-term market trends and abrupt events.
Experimental results show that Stockformer outperforms existing advanced methods on multiple real stock market datasets.
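The high/low-frequency decomposition of returns can be illustrated with a simple low-pass/residual split; a centred moving average is used here as a simplified stand-in for the wavelet transform Stockformer actually employs, and the window length is an illustrative choice:

```python
import numpy as np

def split_frequencies(returns, window=5):
    """Split a return series into a slow (low-frequency) trend and a
    fast (high-frequency) residual. A centred moving average is a
    simplified stand-in for the paper's wavelet decomposition."""
    r = np.asarray(returns, dtype=float)
    kernel = np.ones(window) / window
    low = np.convolve(r, kernel, mode="same")  # smoothed trend
    high = r - low                             # abrupt-event residual
    return low, high

r = np.sin(np.linspace(0.0, 6.0, 60))
low, high = split_frequencies(r)
# by construction the two components reconstruct the series exactly
```

The low-frequency component tracks long-term trend while the residual captures abrupt moves, mirroring the division of labour described in the summary.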
arXiv Detail & Related papers (2023-11-23T04:33:47Z) - Diffusion Variational Autoencoder for Tackling Stochasticity in
Multi-Step Regression Stock Price Prediction [54.21695754082441]
Multi-step stock price prediction over a long-term horizon is crucial for forecasting its volatility.
Current solutions to multi-step stock price prediction are mostly designed for single-step, classification-based predictions.
We combine a deep hierarchical variational-autoencoder (VAE) and diffusion probabilistic techniques to do seq2seq stock prediction.
Our model is shown to outperform state-of-the-art solutions in terms of its prediction accuracy and variance.
arXiv Detail & Related papers (2023-08-18T16:21:15Z) - Learning to Predict Short-Term Volatility with Order Flow Image Representation [0.0]
The paper addresses the challenging problem of predicting the short-term realized volatility of the Bitcoin price using order flow information.
We propose a method that transforms order flow data over a fixed time interval (snapshots) into images.
Images are then used to train both a simple 3-layer Convolutional Neural Network (CNN) and more advanced ResNet-18 and ConvMixer.
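The snapshot-to-image encoding can be sketched as rescaling an order-book snapshot into a small 2-D grayscale array; this is a minimal illustration of the idea, not the paper's exact representation:

```python
import numpy as np

def snapshot_to_image(bid_sizes, ask_sizes):
    """Turn one order-book snapshot into a 2-D 'image': rows are
    price levels, columns are bid/ask depth, rescaled to [0, 1].
    A minimal illustration, not the paper's exact encoding."""
    img = np.column_stack([bid_sizes, ask_sizes]).astype(float)
    lo, hi = img.min(), img.max()
    if hi == lo:
        return np.zeros_like(img)
    return (img - lo) / (hi - lo)   # min-max normalise to [0, 1]

img = snapshot_to_image([10, 7, 3], [2, 6, 12])
print(img.shape)   # (3, 2), values in [0, 1]
```

Stacking many such snapshots over a time interval yields the multi-channel inputs a CNN such as ResNet-18 would consume.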
arXiv Detail & Related papers (2023-04-04T12:32:25Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity thanks to their self-attention mechanism, despite its high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - Mitigating Data Redundancy to Revitalize Transformer-based Long-Term Time Series Forecasting System [46.39662315849883]
We introduce CLMFormer, a novel framework that mitigates redundancy through curriculum learning and a memory-driven decoder. CLMFormer consistently improves Transformer-based models by up to 30%, demonstrating its effectiveness in long-horizon forecasting.
arXiv Detail & Related papers (2022-07-16T04:05:15Z) - Robustness and Accuracy Could Be Reconcilable by (Proper) Definition [109.62614226793833]
The trade-off between robustness and accuracy has been widely studied in the adversarial literature.
We find that it may stem from the improperly defined robust error, which imposes an inductive bias of local invariance.
The proposed SCORE (SelfCOnsistent Robust Error) facilitates the reconciliation between robustness and accuracy, while still handling the worst-case uncertainty.
arXiv Detail & Related papers (2022-02-21T10:36:09Z) - Low-Rank Temporal Attention-Augmented Bilinear Network for financial time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
arXiv Detail & Related papers (2021-07-05T10:15:23Z) - Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.