Natural Language Processing and Multimodal Stock Price Prediction
- URL: http://arxiv.org/abs/2401.01487v1
- Date: Wed, 3 Jan 2024 01:21:30 GMT
- Title: Natural Language Processing and Multimodal Stock Price Prediction
- Authors: Kevin Taylor and Jerry Ng
- Abstract summary: This paper utilizes stock percentage change as training data, in contrast to the traditional use of raw currency values.
The choice of percentage change aims to provide models with context regarding the significance of price fluctuations.
The study employs specialized BERT natural language processing models to predict stock price trends.
- Score: 0.8702432681310401
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the realm of financial decision-making, predicting stock prices is
pivotal. Artificial intelligence techniques such as long short-term memory
networks (LSTMs), support-vector machines (SVMs), and natural language
processing (NLP) models are commonly employed to predict said prices. This
paper utilizes stock percentage change as training data, in contrast to the
traditional use of raw currency values, with a focus on analyzing publicly
released news articles. The choice of percentage change aims to provide models
with context regarding the significance of price fluctuations and overall price
change impact on a given stock. The study employs specialized BERT natural
language processing models to predict stock price trends, with a particular
emphasis on various data modalities. The results showcase the capabilities of
such strategies with a small natural language processing model to accurately
predict overall stock trends, and highlight the effectiveness of certain data
features and sector-specific data.
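As a rough illustration of the preprocessing described in the abstract, the sketch below converts raw closing prices into percentage-change trend labels and pairs them with news headlines for a small BERT-style classifier. This is a minimal sketch using pandas and Hugging Face transformers; the prices, headlines, and the distilbert-base-uncased checkpoint are illustrative assumptions, not the paper's actual data or model.

```python
# Minimal sketch (not the authors' code): percentage-change labels + news text
# for a small BERT-style trend classifier.
import pandas as pd
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical inputs: daily closing prices and same-day news headlines.
prices = pd.Series(
    [187.3, 189.1, 185.0, 186.2],
    index=pd.to_datetime(["2024-01-02", "2024-01-03", "2024-01-04", "2024-01-05"]),
)
headlines = [
    "Chipmaker beats earnings expectations",
    "Regulators open antitrust probe",
    "New product launch announced",
]

# Percentage change rather than raw currency values; drop the leading NaN.
pct_change = prices.pct_change().dropna() * 100

# Simple trend labels: 1 if the stock moved up that day, 0 otherwise.
labels = (pct_change > 0).astype(int).tolist()

# A small BERT-style model with two trend classes (assumed stand-in checkpoint).
model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize the headlines and run a forward pass with the trend labels as targets.
batch = tokenizer(headlines, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=torch.tensor(labels))
print(outputs.loss, outputs.logits.argmax(dim=-1))
```

Binarizing the percentage change into up/down labels mirrors the trend-prediction framing of the abstract; a regression head on the raw percentage change would be the obvious alternative under the same setup.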
Related papers
- Context is Key: A Benchmark for Forecasting with Essential Textual Information [87.3175915185287]
"Context is Key" (CiK) is a time series forecasting benchmark that pairs numerical data with diverse types of carefully crafted textual context.
We evaluate a range of approaches, including statistical models, time series foundation models, and LLM-based forecasters.
Our experiments highlight the importance of incorporating contextual information, demonstrate surprising performance when using LLM-based forecasting models, and also reveal some of their critical shortcomings.
arXiv Detail & Related papers (2024-10-24T17:56:08Z)
- Harnessing Earnings Reports for Stock Predictions: A QLoRA-Enhanced LLM Approach [6.112119533910774]
This paper introduces an advanced approach by employing Large Language Models (LLMs) instruction fine-tuned with a novel combination of instruction-based techniques and quantized low-rank adaptation (QLoRA) compression.
Our methodology integrates 'base factors', such as financial metric growth and earnings transcripts, with 'external factors', including recent market indices performances and analyst grades, to create a rich, supervised dataset.
This study not only demonstrates the power of integrating cutting-edge AI with fine-tuned financial data but also paves the way for future research in enhancing AI-driven financial analysis tools.
arXiv Detail & Related papers (2024-08-13T04:53:31Z)
- American Option Pricing using Self-Attention GRU and Shapley Value Interpretation [0.0]
We propose a machine learning method for forecasting the prices of SPY (ETF) options based on a gated recurrent unit (GRU) and a self-attention mechanism.
We built four different machine learning models, including multilayer perceptron (MLP), long short-term memory (LSTM), self-attention LSTM, and self-attention GRU.
arXiv Detail & Related papers (2023-10-19T06:05:46Z)
- Predicting Financial Market Trends using Time Series Analysis and Natural Language Processing [0.0]
This study assessed the viability of Twitter sentiment as a tool for predicting the stock prices of major corporations such as Tesla and Apple.
Our findings indicate that positivity, negativity, and subjectivity are the primary determinants of fluctuations in stock prices.
arXiv Detail & Related papers (2023-08-31T21:20:58Z)
- Diffusion Variational Autoencoder for Tackling Stochasticity in Multi-Step Regression Stock Price Prediction [54.21695754082441]
Multi-step stock price prediction over a long-term horizon is crucial for forecasting its volatility.
Current solutions to multi-step stock price prediction are mostly designed for single-step, classification-based predictions.
We combine a deep hierarchical variational autoencoder (VAE) with diffusion probabilistic techniques to perform sequence-to-sequence (seq2seq) stock prediction.
Our model is shown to outperform state-of-the-art solutions in terms of its prediction accuracy and variance.
arXiv Detail & Related papers (2023-08-18T16:21:15Z)
- Can ChatGPT Forecast Stock Price Movements? Return Predictability and Large Language Models [51.3422222472898]
We document the capability of large language models (LLMs) like ChatGPT to predict stock price movements using news headlines.
We develop a theoretical model incorporating information capacity constraints, underreaction, limits-to-arbitrage, and LLMs.
arXiv Detail & Related papers (2023-04-15T19:22:37Z)
- ASPEST: Bridging the Gap Between Active Learning and Selective Prediction [56.001808843574395]
Selective prediction aims to learn a reliable model that abstains from making predictions when uncertain.
Active learning aims to lower the overall labeling effort, and hence human dependence, by querying the most informative examples.
In this work, we introduce a new learning paradigm, active selective prediction, which aims to query more informative samples from the shifted target domain.
arXiv Detail & Related papers (2023-04-07T23:51:07Z)
- Forecasting Cryptocurrency Returns from Sentiment Signals: An Analysis of BERT Classifiers and Weak Supervision [6.624726878647541]
We introduce weak learning, a recently proposed NLP approach that addresses the problem of unlabeled text data.
We confirm that finetuning using weak labels enhances the predictive value of text-based features and raises forecast accuracy in the context of predicting cryptocurrency returns.
More fundamentally, the modeling paradigm we present, weak labeling domain-specific text and finetuning pretrained NLP models, is universally applicable in (financial) forecasting.
arXiv Detail & Related papers (2022-04-06T07:45:05Z)
- Bayesian Bilinear Neural Network for Predicting the Mid-price Dynamics in Limit-Order Book Markets [84.90242084523565]
Traditional time-series econometric methods often appear incapable of capturing the true complexity of the multi-level interactions driving the price dynamics.
By adopting a state-of-the-art second-order optimization algorithm, we train a Bayesian bilinear neural network with temporal attention.
Using predictive distributions to analyze the errors and uncertainties associated with the estimated parameters and model forecasts, we thoroughly compare our Bayesian model with traditional ML alternatives.
arXiv Detail & Related papers (2022-03-07T18:59:54Z)
- Design and Analysis of Robust Deep Learning Models for Stock Price Prediction [0.0]
Building predictive models for robust and accurate prediction of stock prices and stock price movement is a challenging research problem to solve.
This chapter proposes a collection of predictive regression models built on deep learning architecture for robust and precise prediction of the future prices of a stock listed in the diversified sectors in the National Stock Exchange (NSE) of India.
arXiv Detail & Related papers (2021-06-17T17:15:02Z)
- Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation [97.22768624862111]
We analyze NMT models by explicitly evaluating the relative contributions of the source and the target to the generation process.
We find that models trained with more data tend to rely more on source information and to have sharper token contributions.
arXiv Detail & Related papers (2020-10-21T11:37:27Z)