DGDNN: Decoupled Graph Diffusion Neural Network for Stock Movement
Prediction
- URL: http://arxiv.org/abs/2401.01846v1
- Date: Wed, 3 Jan 2024 17:36:27 GMT
- Title: DGDNN: Decoupled Graph Diffusion Neural Network for Stock Movement
Prediction
- Authors: Zinuo You, Zijian Shi, Hongbo Bo, John Cartlidge, Li Zhang, Yan Ge
- Abstract summary: We propose a novel graph learning approach implemented without expert knowledge to address these issues.
First, our approach automatically constructs dynamic stock graphs by entropy-driven edge generation from a signal processing perspective.
Last, a decoupled representation learning scheme is adopted to capture distinctive hierarchical intra-stock features.
- Score: 8.7861010791349
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Forecasting future stock trends remains challenging for academia and industry
due to stochastic inter-stock dynamics and hierarchical intra-stock dynamics
influencing stock prices. In recent years, graph neural networks have achieved
remarkable performance on this problem by formulating multiple stocks as
graph-structured data. However, most of these approaches rely on artificially
defined factors to construct static stock graphs, which fail to capture the
rapidly evolving intrinsic interdependencies between stocks. In addition, these
methods often ignore the hierarchical features of stocks and lose the
distinctive information within them. In this work, we propose a novel graph
learning approach that requires no expert knowledge to address these issues.
First, our approach automatically constructs dynamic stock graphs via
entropy-driven edge generation from a signal processing perspective. Then, we
learn task-optimal dependencies between stocks through a generalized graph
diffusion process on the constructed stock graphs. Finally, a decoupled
representation learning scheme is adopted to capture distinctive hierarchical
intra-stock features.
Experimental results demonstrate substantial improvements over state-of-the-art
baselines on real-world datasets. Moreover, the ablation study and sensitivity
study further illustrate the effectiveness of the proposed method in modeling
the time-evolving inter-stock and intra-stock dynamics.
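To make the two graph-construction steps above concrete, here is a minimal sketch of entropy-driven edge generation followed by a generalized graph diffusion expressed as a weighted sum of transition-matrix powers. The edge-weight formula (signal-energy ratio damped by an entropy difference), the histogram entropy estimate, and the fixed diffusion coefficients `theta` are illustrative assumptions, not the paper's exact formulation; the abstract indicates the task-optimal dependencies are learned through the diffusion process rather than fixed as done here.

```python
# Minimal sketch (not DGDNN's exact formulas): entropy-driven edge generation
# and a generalized graph diffusion over the resulting stock graph.
import numpy as np

def signal_entropy(x, bins=16):
    """Histogram estimate of the Shannon entropy of a 1-D return signal."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def build_stock_graph(returns, threshold=0.0):
    """Construct a weighted adjacency matrix from pairwise signal statistics.

    returns: (num_stocks, lookback) array of recent returns.
    The edge weight (energy ratio damped by entropy difference) is an assumed
    stand-in for the paper's entropy-driven edge generation.
    """
    n = returns.shape[0]
    energy = np.sum(returns ** 2, axis=1)                  # signal energy per stock
    entropy = np.array([signal_entropy(r) for r in returns])
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                w = (energy[i] / (energy[j] + 1e-8)) * np.exp(-abs(entropy[i] - entropy[j]))
                A[i, j] = w if w > threshold else 0.0
    return A

def generalized_diffusion(A, theta=(0.5, 0.3, 0.2)):
    """Generalized graph diffusion S = sum_k theta_k * T^k with a row-normalized
    transition matrix T; coefficients are fixed here for illustration only."""
    T = A / (A.sum(axis=1, keepdims=True) + 1e-8)
    S = np.zeros_like(T)
    Tk = np.eye(A.shape[0])
    for coeff in theta:
        S += coeff * Tk
        Tk = Tk @ T
    return S

# Toy example: 5 stocks with a 30-day lookback of synthetic returns.
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=(5, 30))
S = generalized_diffusion(build_stock_graph(returns))
print(S.round(3))  # (5, 5) diffused inter-stock dependency matrix
```

In the paper, the diffused dependency matrix then drives message passing between stocks, and a decoupled representation learning scheme captures the hierarchical intra-stock features; neither of those components is sketched here.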
Related papers
- Evaluating Financial Relational Graphs: Interpretation Before Prediction [4.421486904657393]
We introduce the SPNews dataset, collected based on S&P 500 Index stocks, to facilitate the construction of dynamic relationship graphs.
By using the relationship graph to explain historical financial phenomena, we assess its validity before constructing a graph neural network.
Our evaluation methods can effectively differentiate between various financial relationship graphs, yielding more interpretable results.
arXiv Detail & Related papers (2024-09-28T22:43:00Z)
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Graph Learning under Distribution Shifts: A Comprehensive Survey on Domain Adaptation, Out-of-distribution, and Continual Learning [53.81365215811222]
We provide a review and summary of the latest approaches, strategies, and insights that address distribution shifts within the context of graph learning.
We categorize existing graph learning methods into several essential scenarios, including graph domain adaptation learning, graph out-of-distribution learning, and graph continual learning.
We discuss the potential applications and future directions for graph learning under distribution shifts with a systematic analysis of the current state in this field.
arXiv Detail & Related papers (2024-02-26T07:52:40Z)
- Multi-relational Graph Diffusion Neural Network with Parallel Retention for Stock Trends Classification [6.383640665055313]
We propose a graph-based representation learning approach aimed at predicting future movements of multiple stocks.
Our approach consistently outperforms state-of-the-art baselines in forecasting next trading day stock trends across three test periods spanning seven years.
arXiv Detail & Related papers (2024-01-05T17:15:45Z)
- Signed Graph Neural Ordinary Differential Equation for Modeling Continuous-time Dynamics [13.912268915939656]
The prevailing approach of integrating graph neural networks with ordinary differential equations has demonstrated promising performance.
We introduce a novel approach, a signed graph neural ordinary differential equation, that addresses the limitation of existing methods in capturing signed information.
Our proposed solution boasts both flexibility and efficiency.
arXiv Detail & Related papers (2023-12-18T13:45:33Z)
- Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- Handling Distribution Shifts on Graphs: An Invariance Perspective [78.31180235269035]
We formulate the OOD problem on graphs and develop a new invariant learning approach, Explore-to-Extrapolate Risk Minimization (EERM).
EERM resorts to multiple context explorers that are adversarially trained to maximize the variance of risks from multiple virtual environments.
We prove the validity of our method by theoretically showing its guarantee of a valid OOD solution.
arXiv Detail & Related papers (2022-02-05T02:31:01Z)
- Temporal-Relational Hypergraph Tri-Attention Networks for Stock Trend Prediction [45.74513775015998]
We present a collaborative temporal-relational modeling framework for end-to-end stock trend prediction.
A novel hypergraph tri-attention network (HGTAN) is proposed to augment the hypergraph convolutional networks.
In this manner, HGTAN adaptively determines the importance of nodes, hyperedges, and hypergraphs during the information propagation among stocks.
arXiv Detail & Related papers (2021-07-22T02:16:09Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration observed when stacking multiple graph convolution layers to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)