BreakGPT: Leveraging Large Language Models for Predicting Asset Price Surges
- URL: http://arxiv.org/abs/2411.06076v1
- Date: Sat, 09 Nov 2024 05:40:32 GMT
- Title: BreakGPT: Leveraging Large Language Models for Predicting Asset Price Surges
- Authors: Aleksandr Simonyan
- Abstract summary: This paper introduces BreakGPT, a novel large language model (LLM) architecture adapted specifically for time series forecasting and the prediction of sharp upward movements in asset prices.
We showcase BreakGPT as a promising solution for financial forecasting with minimal training and as a strong competitor for capturing both local and global temporal dependencies.
- Score: 55.2480439325792
- License:
- Abstract: This paper introduces BreakGPT, a novel large language model (LLM) architecture adapted specifically for time series forecasting and the prediction of sharp upward movements in asset prices. By leveraging both the capabilities of LLMs and Transformer-based models, this study evaluates BreakGPT and other Transformer-based models for their ability to address the unique challenges posed by highly volatile financial markets. The primary contribution of this work lies in demonstrating the effectiveness of combining time series representation learning with LLM prediction frameworks. We showcase BreakGPT as a promising solution for financial forecasting with minimal training and as a strong competitor for capturing both local and global temporal dependencies.
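The abstract does not give implementation details, but the prediction target it describes (sharp upward movements in asset prices) can be made concrete with a small labeling sketch. The horizon and threshold below are illustrative assumptions, not values from the paper; the snippet only shows how such a supervised target might be derived from a price series.

```python
import numpy as np
import pandas as pd

def label_upward_surges(prices: pd.Series, horizon: int = 12, threshold: float = 0.02) -> pd.Series:
    """Label 1 where the forward return over `horizon` steps exceeds `threshold`.

    `horizon` and `threshold` are illustrative choices, not values from the paper.
    """
    forward_return = prices.shift(-horizon) / prices - 1.0
    # NaN forward returns at the tail (no look-ahead data) compare as False -> label 0
    return (forward_return > threshold).astype(int)

# Example: synthetic close prices -> binary surge targets
prices = pd.Series(np.cumprod(1 + np.random.normal(0, 0.005, 500)) * 100)
targets = label_upward_surges(prices)
print(targets.value_counts())
```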
Related papers
- TimeCAP: Learning to Contextualize, Augment, and Predict Time Series Events with Large Language Model Agents [52.13094810313054]
TimeCAP is a time-series processing framework that creatively employs Large Language Models (LLMs) as contextualizers of time series data.
TimeCAP incorporates two independent LLM agents: one generates a textual summary capturing the context of the time series, while the other uses this enriched summary to make more informed predictions.
Experimental results on real-world datasets demonstrate that TimeCAP outperforms state-of-the-art methods for time series event prediction.
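As a rough illustration of the two-agent design described above, the sketch below wires a contextualizer call and a predictor call around a hypothetical text-in/text-out `llm` callable; the prompts are placeholders, not TimeCAP's.

```python
from typing import Callable, Sequence

def timecap_style_predict(window: Sequence[float], llm: Callable[[str], str]) -> str:
    """Two-stage prediction loosely following the TimeCAP idea: contextualize, then predict.

    `llm` is a hypothetical text-in/text-out callable (e.g., a wrapper around any chat API);
    the prompts below are illustrative, not the paper's.
    """
    values = ", ".join(f"{v:.2f}" for v in window)

    # Agent 1: produce a textual summary of the recent series.
    summary = llm(
        "Summarize the recent behaviour of this time series "
        f"(trend, volatility, anomalies): {values}"
    )

    # Agent 2: predict the next event using the enriched summary instead of raw numbers.
    prediction = llm(
        "Given this context, will the next observation rise sharply? "
        f"Answer 'yes' or 'no'.\nContext: {summary}"
    )
    return prediction
```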
arXiv Detail & Related papers (2025-02-17T04:17:27Z)
- Scalable Language Models with Posterior Inference of Latent Thought Vectors [52.63299874322121]
Latent-Thought Language Models (LTMs) incorporate explicit latent thought vectors that follow an explicit prior model in latent space.
LTMs possess additional scaling dimensions beyond traditional LLMs, yielding a structured design space.
LTMs significantly outperform conventional autoregressive models and discrete diffusion models in validation perplexity and zero-shot language modeling.
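The summary above is high level; as a loose illustration of conditioning generation on an explicit latent vector with its own prior, here is a toy decoder in PyTorch. It is not the LTM architecture and omits posterior inference entirely.

```python
from typing import Optional

import torch
import torch.nn as nn

class LatentConditionedDecoder(nn.Module):
    """Toy decoder conditioned on a latent 'thought' vector drawn from a Gaussian prior.

    This only illustrates the idea of an explicit latent vector with its own prior model;
    it is not the LTM architecture and has no posterior inference procedure.
    """

    def __init__(self, vocab_size: int = 1000, d_model: int = 64, d_latent: int = 16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.latent_proj = nn.Linear(d_latent, d_model)
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)
        self.d_latent = d_latent

    def forward(self, tokens: torch.Tensor, z: Optional[torch.Tensor] = None) -> torch.Tensor:
        if z is None:  # sample the latent from its prior, here a standard Gaussian
            z = torch.randn(tokens.size(0), self.d_latent)
        h = self.embed(tokens) + self.latent_proj(z).unsqueeze(1)  # inject latent at every step
        h, _ = self.rnn(h)
        return self.out(h)  # next-token logits

logits = LatentConditionedDecoder()(torch.randint(0, 1000, (2, 10)))
```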
arXiv Detail & Related papers (2025-02-03T17:50:34Z)
- Harnessing Earnings Reports for Stock Predictions: A QLoRA-Enhanced LLM Approach [6.112119533910774]
This paper introduces an advanced approach by employing Large Language Models (LLMs) instruction fine-tuned with a novel combination of instruction-based techniques and quantized low-rank adaptation (QLoRA) compression.
Our methodology integrates 'base factors', such as financial metric growth and earnings transcripts, with 'external factors', including the recent performance of market indices and analyst grades, to create a rich, supervised dataset.
This study not only demonstrates the power of integrating cutting-edge AI with fine-tuned financial data but also paves the way for future research in enhancing AI-driven financial analysis tools.
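A minimal sketch of how one such supervised record might be assembled from 'base' and 'external' factors is shown below; the field names and prompt wording are assumptions, and the instruction-based QLoRA fine-tuning step itself is not shown.

```python
def build_instruction_example(base: dict, external: dict, label: str) -> dict:
    """Assemble one supervised fine-tuning record from 'base' and 'external' factors.

    Field names and prompt wording are illustrative assumptions; the paper's exact
    schema is not given in the abstract.
    """
    instruction = (
        "Given the company's fundamentals and the market context, "
        "predict the post-earnings stock move (up/down)."
    )
    context = (
        f"Revenue growth: {base['revenue_growth']:.1%}; "
        f"EPS surprise: {base['eps_surprise']:.1%}; "
        f"Earnings-call summary: {base['call_summary']}\n"
        f"Market index 5-day return: {external['index_5d_return']:.1%}; "
        f"Consensus analyst grade: {external['analyst_grade']}"
    )
    return {"instruction": instruction, "input": context, "output": label}

example = build_instruction_example(
    base={"revenue_growth": 0.12, "eps_surprise": 0.04, "call_summary": "Guidance raised."},
    external={"index_5d_return": -0.01, "analyst_grade": "Buy"},
    label="up",
)
print(example)
```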
arXiv Detail & Related papers (2024-08-13T04:53:31Z)
- Large Investment Model [7.712869313074975]
Large Investment Model (LIM) is a novel research paradigm designed to enhance both performance and efficiency at scale.
LIM employs end-to-end learning and universal modeling to create an upstream foundation model capable of autonomously learning comprehensive signal patterns from diverse financial data.
arXiv Detail & Related papers (2024-08-12T05:15:13Z)
- A Combination Model Based on Sequential General Variational Mode Decomposition Method for Time Series Prediction [11.11205499754577]
We construct a new SGVMD-ARIMA combination model in a non-linear way to predict financial time series.
Within the prediction interval, our proposed combination model outperforms traditional decomposition-based prediction models used as a control group.
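A rough decompose-then-forecast sketch of this kind of combination model is shown below, using a crude moving-average split as a stand-in for SGVMD and a statsmodels ARIMA model per mode; the non-linear combination step from the paper is not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def decompose_stand_in(series: pd.Series, window: int = 20) -> list:
    """Crude stand-in for SGVMD: split into a smoothed trend mode and a residual mode.
    The actual sequential general variational mode decomposition is not implemented here.
    """
    trend = series.rolling(window, min_periods=1).mean()
    return [trend, series - trend]

def combined_forecast(series: pd.Series, steps: int = 5) -> np.ndarray:
    """Fit an ARIMA model per mode and sum the mode forecasts (a simple linear
    combination; the paper's non-linear combination is not reproduced)."""
    forecasts = []
    for mode in decompose_stand_in(series):
        fitted = ARIMA(mode, order=(1, 0, 1)).fit()
        forecasts.append(fitted.forecast(steps=steps))
    return np.sum(forecasts, axis=0)

prices = pd.Series(np.cumsum(np.random.normal(0, 1, 300)) + 100)
print(combined_forecast(prices))
```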
arXiv Detail & Related papers (2024-06-05T11:35:38Z)
- ECC Analyzer: Extract Trading Signal from Earnings Conference Calls using Large Language Model for Stock Performance Prediction [7.358590821647365]
This research introduces a novel framework: ECC Analyzer, which utilizes large language models (LLMs) to extract richer, more predictive content from ECCs.
We use the pre-trained large models to extract textual and audio features from ECCs and implement a hierarchical information extraction strategy to extract more fine-grained information.
Experimental results demonstrate that our model outperforms traditional analytical benchmarks.
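As an illustration of a hierarchical extraction strategy of this kind, the sketch below extracts notes from transcript chunks first and then aggregates them into a call-level assessment; the prompts and the two-level scheme are assumptions, and audio-feature extraction is omitted.

```python
from typing import Callable

def hierarchical_extract(transcript: str, llm: Callable[[str], str], chunk_size: int = 2000) -> str:
    """Two-level extraction in the spirit of the ECC Analyzer abstract: fine-grained
    signals per chunk, then an aggregated call-level summary. Prompts are illustrative
    assumptions; audio features are not handled.
    """
    chunks = [transcript[i:i + chunk_size] for i in range(0, len(transcript), chunk_size)]

    # Level 1: fine-grained extraction from each chunk of the call.
    chunk_notes = [
        llm(f"List guidance changes, risks, and tone in this earnings-call excerpt:\n{c}")
        for c in chunks
    ]

    # Level 2: aggregate chunk-level notes into a single trading-signal summary.
    return llm(
        "Combine these notes into one assessment of the expected stock reaction:\n"
        + "\n".join(chunk_notes)
    )
```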
arXiv Detail & Related papers (2024-04-29T07:11:39Z)
- Financial Time-Series Forecasting: Towards Synergizing Performance And Interpretability Within a Hybrid Machine Learning Approach [2.0213537170294793]
This paper proposes a comparative study of hybrid machine learning algorithms with a focus on enhancing model interpretability.
For interpretability, we carry out a systematic overview of time-series preprocessing techniques, including decomposition, the autocorrelation function, and triple exponential smoothing, which aim to uncover latent relations and complex patterns in financial time-series forecasting.
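The three preprocessing steps named above map onto standard statsmodels calls; the snippet below applies them to a synthetic series with illustrative parameter choices.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.stattools import acf
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly series standing in for a financial time series.
idx = pd.date_range("2015-01-01", periods=96, freq="MS")
y = pd.Series(
    np.linspace(100, 160, 96)
    + 5 * np.sin(np.arange(96) * 2 * np.pi / 12)
    + np.random.normal(0, 1, 96),
    index=idx,
)

# Decomposition: separate trend, seasonal, and residual components.
components = seasonal_decompose(y, model="additive", period=12)

# Autocorrelation function: how strongly past values relate to current ones.
autocorr = acf(y, nlags=24)

# Triple exponential smoothing (Holt-Winters): level, trend, and seasonality.
hw_forecast = ExponentialSmoothing(y, trend="add", seasonal="add", seasonal_periods=12).fit().forecast(12)
```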
arXiv Detail & Related papers (2023-12-31T16:38:32Z)
- Time-LLM: Time Series Forecasting by Reprogramming Large Language Models [110.20279343734548]
Time series forecasting holds significant importance in many real-world dynamic systems.
We present Time-LLM, a reprogramming framework to repurpose large language models for time series forecasting.
Time-LLM is a powerful time series learner that outperforms state-of-the-art, specialized forecasting models.
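As a loose sketch of the reprogramming idea (mapping time-series patches into a frozen backbone's embedding space and training only small adapters), here is a toy PyTorch module; the tiny Transformer encoder stands in for a frozen pretrained LLM, and Time-LLM's actual text-prototype cross-attention layer is not implemented.

```python
import torch
import torch.nn as nn

class ReprogrammingSketch(nn.Module):
    """Toy version of the reprogramming idea: trainable input/output adapters around a
    frozen backbone. The small Transformer encoder below is only a stand-in for a
    frozen pretrained LLM; this is not the Time-LLM implementation.
    """

    def __init__(self, patch_len: int = 16, d_model: int = 128, horizon: int = 24):
        super().__init__()
        self.patch_len = patch_len
        self.patch_embed = nn.Linear(patch_len, d_model)          # trainable input adapter
        self.backbone = nn.TransformerEncoder(                    # stand-in for a frozen LLM
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2
        )
        for p in self.backbone.parameters():
            p.requires_grad = False                               # backbone stays frozen
        self.head = nn.Linear(d_model, horizon)                   # trainable output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length); split into non-overlapping patches of size patch_len
        patches = x.unfold(1, self.patch_len, self.patch_len)     # (batch, n_patches, patch_len)
        h = self.backbone(self.patch_embed(patches))
        return self.head(h[:, -1])                                # forecast from the last patch

forecast = ReprogrammingSketch()(torch.randn(8, 96))              # -> shape (8, 24)
```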
arXiv Detail & Related papers (2023-10-03T01:31:25Z)
- Can ChatGPT Forecast Stock Price Movements? Return Predictability and Large Language Models [51.3422222472898]
We document the capability of large language models (LLMs) like ChatGPT to predict stock price movements using news headlines.
We develop a theoretical model incorporating information capacity constraints, underreaction, limits-to-arbitrage, and LLMs.
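A minimal sketch of headline scoring in this spirit is shown below, using a hypothetical `llm` callable; the prompt paraphrases the kind of question studied, and the +1/-1/0 mapping is an illustrative choice rather than the paper's exact procedure.

```python
from typing import Callable

def headline_score(headline: str, company: str, llm: Callable[[str], str]) -> int:
    """Map an LLM's judgement of a news headline to a signed score (+1 good, -1 bad, 0 unclear).

    `llm` is a hypothetical text-in/text-out wrapper around any chat model; the prompt is a
    paraphrase of the kind of question the paper studies, not its exact wording.
    """
    answer = llm(
        f"Is this headline good news, bad news, or unclear for the stock price of {company}? "
        f"Answer with one word.\nHeadline: {headline}"
    ).strip().lower()
    first = answer.split()[0].strip(".,!") if answer else ""
    return {"good": 1, "bad": -1}.get(first, 0)
```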
arXiv Detail & Related papers (2023-04-15T19:22:37Z)
- Low-Rank Temporal Attention-Augmented Bilinear Network for financial time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
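The parameter-saving idea can be illustrated with a low-rank factorization of one factor of a bilinear map, as sketched below; this is not the full attention-augmented TABL model, and the shapes and rank are illustrative.

```python
import torch
import torch.nn as nn

class LowRankBilinear(nn.Module):
    """Sketch of low-rank parameter reduction: a bilinear map Y = W1 X W2 where the larger
    factor W1 is replaced by a rank-r product U @ V. Illustrative only; the temporal
    attention mechanism of TABL is not included.
    """

    def __init__(self, d_in: int = 144, t_in: int = 10, d_out: int = 40, t_out: int = 5, rank: int = 8):
        super().__init__()
        self.U = nn.Parameter(torch.randn(d_out, rank) * 0.02)   # d_out x r
        self.V = nn.Parameter(torch.randn(rank, d_in) * 0.02)    # r x d_in  (U @ V ~ W1)
        self.W2 = nn.Parameter(torch.randn(t_in, t_out) * 0.02)  # temporal mixing

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, d_in, t_in) -> (batch, d_out, t_out)
        return self.U @ (self.V @ x) @ self.W2

y = LowRankBilinear()(torch.randn(32, 144, 10))  # -> shape (32, 40, 5)
```

For the shapes used above, the rank-8 factorization stores 40*8 + 8*144 = 1,472 parameters in place of the 40*144 = 5,760 of a full W1, which is the kind of reduction in trainable parameters the paper targets.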
arXiv Detail & Related papers (2021-07-05T10:15:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.