Gated Fusion Enhanced Multi-Scale Hierarchical Graph Convolutional Network for Stock Movement Prediction
- URL: http://arxiv.org/abs/2511.01570v1
- Date: Mon, 03 Nov 2025 13:39:23 GMT
- Title: Gated Fusion Enhanced Multi-Scale Hierarchical Graph Convolutional Network for Stock Movement Prediction
- Authors: Xiaosha Xue, Peibo Duan, Zhipeng Liu, Qi Chu, Changsheng Zhang, Bin Zhang
- Abstract summary: We introduce MS-HGFN (Multi-Scale Hierarchical Graph Fusion Network). The model features a hierarchical GNN module that forms dynamic graphs by learning patterns from intra-attributes and features from inter-attributes over different time scales. Experiments utilizing real-world datasets from U.S. and Chinese stock markets demonstrate that MS-HGFN outperforms both traditional and advanced models.
- Score: 12.806794559293289
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Accurately predicting stock market movements remains a formidable challenge due to the inherent volatility and complex interdependencies among stocks. Although multi-scale Graph Neural Networks (GNNs) hold potential for modeling these relationships, they frequently neglect two key points: the subtle intra-attribute patterns within each stock affecting inter-stock correlation, and the biased attention to coarse- and fine-grained features during multi-scale sampling. To overcome these challenges, we introduce MS-HGFN (Multi-Scale Hierarchical Graph Fusion Network). The model features a hierarchical GNN module that forms dynamic graphs by learning patterns from intra-attributes and features from inter-attributes over different time scales, thus comprehensively capturing spatio-temporal dependencies. Additionally, a top-down gating approach facilitates the integration of multi-scale spatio-temporal features, preserving critical coarse- and fine-grained features without too much interference. Experiments utilizing real-world datasets from U.S. and Chinese stock markets demonstrate that MS-HGFN outperforms both traditional and advanced models, yielding up to a 1.4% improvement in prediction accuracy and enhanced stability in return simulations. The code is available at https://anonymous.4open.science/r/MS-HGFN.
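The abstract describes a top-down gating mechanism that blends coarse- and fine-grained multi-scale features. The paper's exact formulation is not reproduced here; the sketch below shows the general gated-fusion idea with hypothetical shapes and a gate computed from the coarse-scale (top-down) representation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(coarse, fine, W, b):
    """Fuse coarse- and fine-grained feature vectors with a learned gate.

    The gate g is computed from the coarse-scale features (top-down),
    then blends the two scales elementwise:
        fused = g * coarse + (1 - g) * fine
    so each dimension of the output is a convex combination of the scales.
    """
    g = sigmoid(coarse @ W + b)          # per-dimension gate in (0, 1)
    return g * coarse + (1.0 - g) * fine

# Toy example: one stock, 4-dimensional features at two time scales.
rng = np.random.default_rng(0)
coarse = rng.standard_normal(4)          # e.g. weekly-scale representation
fine = rng.standard_normal(4)            # e.g. daily-scale representation
W = rng.standard_normal((4, 4)) * 0.1    # illustrative gate parameters
b = np.zeros(4)

fused = gated_fusion(coarse, fine, W, b)
print(fused.shape)  # (4,)
```

Because the gate is a convex weight per dimension, neither scale can be fully overwritten, which matches the abstract's claim of "preserving critical coarse- and fine-grained features without too much interference".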
Related papers
- Improving Long-Range Interactions in Graph Neural Simulators via Hamiltonian Dynamics [71.53370807809296]
Recent Graph Neural Simulators (GNSs) accelerate simulations by learning dynamics on graph-structured data. We propose Information-preserving Graph Neural Simulators (IGNS), a graph-based neural simulator built on the principles of Hamiltonian dynamics. IGNS consistently outperforms state-of-the-art GNSs, achieving higher accuracy and stability on challenging and complex dynamical systems.
arXiv Detail & Related papers (2025-11-11T12:53:56Z) - MaGNet: A Mamba Dual-Hypergraph Network for Stock Prediction via Temporal-Causal and Global Relational Learning [3.2859360081297715]
This work introduces MaGNet, a novel Mamba Dual-Hypergraph Network for stock prediction. MaGNet integrates a MAGE block, Feature-wise and Stock-wise 2D Spatiotemporal Attention modules, and a dual hypergraph framework. Experiments on six major stock indices demonstrate that MaGNet outperforms state-of-the-art methods in both predictive performance and investment returns.
arXiv Detail & Related papers (2025-10-29T20:47:16Z) - Structure Over Signal: A Globalized Approach to Multi-relational GNNs for Stock Prediction [0.0]
We propose OmniGNN, an attention-based multi-relational dynamic GNN for modeling macroeconomic shocks. Central to OmniGNN is a sector node acting as a global intermediary, enabling rapid shock propagation across the graph. Experiments show that OmniGNN outperforms existing stock prediction models on public datasets.
arXiv Detail & Related papers (2025-10-12T19:33:16Z) - ScaleGNN: Towards Scalable Graph Neural Networks via Adaptive High-order Neighboring Feature Fusion [73.85920403511706]
We propose ScaleGNN, a novel framework that adaptively fuses multi-hop node features for scalable and effective graph learning. We show that ScaleGNN consistently outperforms state-of-the-art GNNs in both predictive accuracy and computational efficiency.
arXiv Detail & Related papers (2025-04-22T14:05:11Z) - Spatiotemporal Graph Learning with Direct Volumetric Information Passing and Feature Enhancement [62.91536661584656]
We propose a dual-module framework, the Cell-embedded and Feature-enhanced Graph Neural Network (CeFeGNN), for spatiotemporal graph learning. We embed learnable cell attributions into the common node-edge message-passing process, which better captures the spatial dependency of regional features. Experiments on various PDE systems and one real-world dataset demonstrate that CeFeGNN achieves superior performance compared with other baselines.
arXiv Detail & Related papers (2024-09-26T16:22:08Z) - FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - Efficient Integration of Multi-Order Dynamics and Internal Dynamics in Stock Movement Prediction [20.879245331384794]
Recent deep neural network (DNN) methods capture multi-order dynamics using hypergraphs, but rely on the Fourier basis in the convolution.
We propose a framework for stock movement prediction to overcome the above issues.
Our framework outperforms state-of-the-art methods in terms of profit and stability.
arXiv Detail & Related papers (2022-11-11T01:58:18Z) - MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We introduce and justify two weaknesses of implicit GNNs: the constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their lack of ability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
arXiv Detail & Related papers (2022-10-15T18:18:55Z) - Long-term Spatio-temporal Forecasting via Dynamic Multiple-Graph Attention [20.52864145999387]
Long-term spatio-temporal forecasting (LSTF) makes use of long-term dependency between spatial and temporal domains, contextual information, and inherent patterns in the data.
We propose new graph models to represent the contextual information of each node and the long-term spatio-temporal data dependency structure.
Our proposed approaches significantly improve the performance of existing graph neural network models in LSTF prediction tasks.
arXiv Detail & Related papers (2022-04-23T06:51:37Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast multivariate time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
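The last entry's graph learning module extracts uni-directed relations among variables from learned node embeddings. The sketch below shows one common way to do this (an antisymmetric score followed by ReLU and top-k sparsification); the embedding sizes, `alpha`, and `k` are illustrative assumptions, not the paper's exact hyperparameters:

```python
import numpy as np

def learn_directed_adjacency(E1, E2, alpha=3.0, k=2):
    """Learn a uni-directed adjacency matrix from two node-embedding tables.

    The antisymmetric score E1 @ E2.T - E2 @ E1.T guarantees that if the
    (i, j) entry is positive, the (j, i) entry is negative, so the ReLU
    keeps at most one direction per node pair. Each row is then
    sparsified to its k strongest neighbors.
    """
    A = np.maximum(0.0, np.tanh(alpha * (E1 @ E2.T - E2 @ E1.T)))
    # Keep only the k largest entries per row, zero out the rest.
    mask = np.zeros_like(A)
    idx = np.argsort(-A, axis=1)[:, :k]
    np.put_along_axis(mask, idx, 1.0, axis=1)
    return A * mask

rng = np.random.default_rng(1)
n, d = 5, 8                       # 5 variables, embedding dimension 8
E1 = rng.standard_normal((n, d))
E2 = rng.standard_normal((n, d))
A = learn_directed_adjacency(E1, E2)
# Uni-directedness: no pair carries positive weight in both directions.
print(np.all(A * A.T == 0))  # True
```

The uni-directed constraint is what lets the learned graph encode causal-style, one-way influence between series rather than symmetric correlation.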
This list is automatically generated from the titles and abstracts of the papers in this site.