Dynamic Graph Representation with Contrastive Learning for Financial Market Prediction: Integrating Temporal Evolution and Static Relations
- URL: http://arxiv.org/abs/2412.04034v1
- Date: Thu, 05 Dec 2024 10:15:56 GMT
- Title: Dynamic Graph Representation with Contrastive Learning for Financial Market Prediction: Integrating Temporal Evolution and Static Relations
- Authors: Yunhua Pei, Jin Zheng, John Cartlidge
- Abstract summary: Temporal Graph Learning (TGL) is crucial for capturing the evolving nature of stock markets. Traditional methods often ignore the interplay between dynamic temporal changes and static relational structures between stocks. We propose the Dynamic Graph Representation with Contrastive Learning framework, which integrates dynamic and static graph relations.
- Score: 5.892066196730197
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal Graph Learning (TGL) is crucial for capturing the evolving nature of stock markets. Traditional methods often ignore the interplay between dynamic temporal changes and static relational structures between stocks. To address this issue, we propose the Dynamic Graph Representation with Contrastive Learning (DGRCL) framework, which integrates dynamic and static graph relations to improve the accuracy of stock trend prediction. Our framework introduces two key components: the Embedding Enhancement (EE) module and the Contrastive Constrained Training (CCT) module. The EE module focuses on dynamically capturing the temporal evolution of stock data, while the CCT module enforces static constraints based on stock relations, refined within contrastive learning. This dual-relation approach allows for a more comprehensive understanding of stock market dynamics. Our experiments on two major U.S. stock market datasets, NASDAQ and NYSE, demonstrate that DGRCL significantly outperforms state-of-the-art TGL baselines. Ablation studies indicate the importance of both modules. Overall, DGRCL not only enhances prediction ability but also provides a robust framework for integrating temporal and relational data in dynamic graphs. Code and data are available for public access.
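The abstract describes a Contrastive Constrained Training (CCT) module that enforces static stock-relation constraints within contrastive learning. Since the paper's code is not reproduced here, the following is a minimal, hypothetical sketch of such a constraint: an InfoNCE-style loss that pulls together the embeddings of stocks linked in a static relation graph and pushes apart unrelated ones. The function name, the exact loss form, and the temperature value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def contrastive_constraint_loss(embeddings, relation, temperature=0.5):
    """Illustrative InfoNCE-style static-relation constraint (not the paper's code).

    embeddings: (n_stocks, d) array of stock representations.
    relation:   (n_stocks, n_stocks) 0/1 static relation matrix; linked
                stocks are treated as positive pairs.
    """
    # Temperature-scaled cosine similarity between all stock pairs
    norm = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = norm @ norm.T / temperature
    # Exclude self-similarity from every softmax denominator
    np.fill_diagonal(sim, -np.inf)
    exp_sim = np.exp(sim)

    losses = []
    for i in range(len(embeddings)):
        pos = relation[i].astype(bool)
        pos[i] = False
        if not pos.any():
            continue  # stock has no static neighbours: nothing to constrain
        # -log( mass on related stocks / mass on all other stocks )
        losses.append(-np.log(exp_sim[i, pos].sum() / exp_sim[i].sum()))
    return float(np.mean(losses))
```

In a full training loop, a term like this would be added to the prediction loss so that temporally learned embeddings stay consistent with the static relation graph.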
Related papers
- Towards Improving Long-Tail Entity Predictions in Temporal Knowledge Graphs through Global Similarity and Weighted Sampling [53.11315884128402]
Temporal Knowledge Graph (TKG) completion models traditionally assume access to the entire graph during training. We present an incremental training framework specifically designed for TKGs, aiming to address entities that are either not observed during training or have sparse connections. Our approach combines a model-agnostic enhancement layer with a weighted sampling strategy that can augment and improve any existing TKG completion method.
arXiv Detail & Related papers (2025-07-25T06:02:48Z) - A Study of Dynamic Stock Relationship Modeling and S&P500 Price Forecasting Based on Differential Graph Transformer [4.6028394466086535]
We propose a Differential Graph Transformer framework for dynamic relationship modeling and price prediction. Our DGT integrates sequential graph structure changes into multi-head self-attention. Causal temporal attention captures global/local dependencies in price sequences.
arXiv Detail & Related papers (2025-06-23T14:53:31Z) - Compile Scene Graphs with Reinforcement Learning [69.36723767339001]
Next-token prediction is the fundamental principle for training large language models (LLMs). We introduce R1-SGG, a multimodal LLM (M-LLM) initially trained via supervised fine-tuning (SFT) on the scene graph dataset. We design a set of graph-centric rewards, including three recall-based variants: Hard Recall, Hard Recall+Relax, and Soft Recall.
arXiv Detail & Related papers (2025-04-18T10:46:22Z) - Integrate Temporal Graph Learning into LLM-based Temporal Knowledge Graph Model [48.15492235240126]
Temporal Knowledge Graph Forecasting aims to predict future events based on the observed events in history.
Existing methods have integrated retrieved historical facts or static graph representations into Large Language Models (LLMs).
We propose TGL-LLM, a novel framework that integrates temporal graph learning into an LLM-based temporal knowledge graph model.
arXiv Detail & Related papers (2025-01-21T06:12:49Z) - Relational Learning in Pre-Trained Models: A Theory from Hypergraph Recovery Perspective [60.64922606733441]
We introduce a mathematical model that formalizes relational learning as hypergraph recovery to study the pre-training of Foundation Models (FMs).
In our framework, the world is represented as a hypergraph, with data abstracted as random samples from hyperedges. We theoretically examine the feasibility of a Pre-Trained Model (PTM) to recover this hypergraph and analyze the data efficiency in a minimax near-optimal style.
arXiv Detail & Related papers (2024-06-17T06:20:39Z) - MTRGL: Effective Temporal Correlation Discerning through Multi-modal Temporal Relational Graph Learning [2.879827542529434]
We introduce a novel framework, Multi-modal Temporal Relation Graph Learning (MTRGL).
MTRGL combines time series data and discrete features into a temporal graph and employs a memory-based temporal graph neural network.
Our experiments on real-world datasets confirm the superior performance of MTRGL.
arXiv Detail & Related papers (2024-01-25T14:21:14Z) - MDGNN: Multi-Relational Dynamic Graph Neural Network for Comprehensive and Dynamic Stock Investment Prediction [22.430266982219496]
A Multi-relational Dynamic Graph Neural Network (MDGNN) framework is proposed.
Our proposed MDGNN framework achieves the best performance on public datasets compared with state-of-the-art (SOTA) stock investment methods.
arXiv Detail & Related papers (2024-01-19T02:51:29Z) - Multi-relational Graph Diffusion Neural Network with Parallel Retention for Stock Trends Classification [6.383640665055313]
We propose a graph-based representation learning approach aimed at predicting future movements of multiple stocks.
Our approach consistently outperforms state-of-the-art baselines in forecasting next trading day stock trends across three test periods spanning seven years.
arXiv Detail & Related papers (2024-01-05T17:15:45Z) - DGDNN: Decoupled Graph Diffusion Neural Network for Stock Movement Prediction [8.7861010791349]
We propose a novel graph learning approach implemented without expert knowledge to address these issues.
First, our approach automatically constructs dynamic stock graphs by entropy-driven edge generation from a signal processing perspective.
Last, a decoupled representation learning scheme is adopted to capture distinctive hierarchical intra-stock features.
arXiv Detail & Related papers (2024-01-03T17:36:27Z) - Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose the Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z) - EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.