A Hype-Adjusted Probability Measure for NLP Stock Return Forecasting
- URL: http://arxiv.org/abs/2412.07587v6
- Date: Fri, 07 Feb 2025 19:46:11 GMT
- Title: A Hype-Adjusted Probability Measure for NLP Stock Return Forecasting
- Authors: Zheng Cao, Helyette Geman
- Abstract summary: This article introduces a Hype-Adjusted Probability Measure in the context of a new Natural Language Processing (NLP) approach for stock return and volatility forecasting. A novel sentiment score equation is proposed to represent the impact of intraday news on forecasting next-period stock return and volatility for selected U.S. semiconductor tickers.
- Score: 6.658767709779308
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This article introduces a Hype-Adjusted Probability Measure in the context of a new Natural Language Processing (NLP) approach for stock return and volatility forecasting. A novel sentiment score equation is proposed to represent the impact of intraday news on forecasting next-period stock return and volatility for selected U.S. semiconductor tickers, a very vibrant industry sector. This work improves forecast accuracy by addressing news bias, memory, and weight, and by incorporating shifts in sentiment direction. More importantly, it extends the change of Probability Measure, a remarkable tool developed in Asset Pricing, to NLP forecasting by constructing a Hype-Adjusted Probability Measure, obtained from a redistribution of the weights in the probability space and meant to correct for excessive or insufficient news coverage.
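The abstract does not reproduce the paper's sentiment score equation or the formal construction of the measure change, so the following Python sketch is only a hypothetical illustration of the corrective idea: intraday sentiment is aggregated with a weight that shrinks when a ticker's news volume is abnormally heavy (hype) and grows when coverage is abnormally thin. The function name `hype_adjusted_score`, the inputs `news_volume` and `baseline_volume`, and the damping exponent `gamma` are assumptions made for illustration, not the authors' specification.

```python
import numpy as np

def hype_adjusted_score(sentiments, news_volume, baseline_volume, gamma=0.5):
    """Hypothetical hype-adjusted aggregation of intraday news sentiment.

    sentiments      : per-article sentiment scores, e.g. in [-1, 1]
    news_volume     : number of articles observed for the ticker today
    baseline_volume : typical (e.g. trailing-average) daily article count
    gamma           : damping exponent; gamma = 0 disables the adjustment
    """
    sentiments = np.asarray(sentiments, dtype=float)
    if sentiments.size == 0:
        return 0.0
    raw = sentiments.mean()
    # Shrink the signal on abnormally heavy news days ("hype") and amplify it
    # on abnormally quiet ones, loosely mimicking a reweighting of the
    # probability mass assigned to news observations.
    hype_ratio = news_volume / max(baseline_volume, 1e-8)
    return raw * hype_ratio ** (-gamma)

# The same average sentiment counts for less on a heavy-coverage day ...
print(hype_adjusted_score([0.4, 0.6, 0.5], news_volume=30, baseline_volume=10))
# ... and for more on a quiet day.
print(hype_adjusted_score([0.4, 0.6, 0.5], news_volume=5, baseline_volume=10))
```

In the paper's framing, such a reweighting plays the role of a change of probability measure over news events; the sketch only captures the corrective intent, not the measure-theoretic construction.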
Related papers
- The Uncertainty of Machine Learning Predictions in Asset Pricing [2.0249250133493195]
We show that neural network forecasts of expected returns share the same distribution as classic nonparametric methods.
We incorporate these forecast confidence intervals into an uncertainty-averse investment framework.
arXiv Detail & Related papers (2025-03-01T16:32:00Z)
- Future-Guided Learning: A Predictive Approach To Enhance Time-Series Forecasting [4.866362841501992]
We introduce Future-Guided Learning, an approach that enhances time-series event forecasting.
Our approach involves two models: a detection model that analyzes future data to identify critical events and a forecasting model that predicts these events based on present data.
When discrepancies arise between the forecasting and detection models, the forecasting model undergoes more substantial updates.
arXiv Detail & Related papers (2024-10-19T21:22:55Z)
- Using dynamic loss weighting to boost improvements in forecast stability [0.9332308328407303]
Rolling origin forecast instability refers to variability in forecasts for a specific period induced by updating the forecast.
It was shown that more stable forecasts can be obtained without harming accuracy by minimizing a composite loss function.
We show that existing dynamic loss weighting methods can achieve this objective and provide insights into why this might be the case (a schematic sketch of such a composite accuracy-plus-stability loss follows this list).
arXiv Detail & Related papers (2024-09-26T20:21:46Z)
- Pre-Finetuning with Impact Duration Awareness for Stock Movement Prediction [25.67779910446609]
This paper introduces a novel dataset, the Impact Duration Estimation dataset (IDED), specifically designed to estimate impact duration based on investor opinions.
Our research establishes that pre-finetuning language models with IDED can enhance performance in text-based stock movement predictions.
arXiv Detail & Related papers (2024-09-25T23:06:55Z)
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z)
- Combining predictive distributions of electricity prices: Does minimizing the CRPS lead to optimal decisions in day-ahead bidding? [0.0]
We study whether using CRPS learning, a novel weighting technique, leads to optimal decisions in day-ahead bidding (the underlying sample-based CRPS score is sketched after this list).
We find that increasing the diversity of an ensemble can have a positive impact on accuracy.
The higher computational cost of using CRPS learning compared to an equal-weighted aggregation of distributions is not offset by higher profits.
arXiv Detail & Related papers (2023-08-29T17:10:38Z)
- Diffusion Variational Autoencoder for Tackling Stochasticity in Multi-Step Regression Stock Price Prediction [54.21695754082441]
Multi-step stock price prediction over a long-term horizon is crucial for forecasting a stock's volatility.
Current solutions to multi-step stock price prediction are mostly designed for single-step, classification-based predictions.
We combine a deep hierarchical variational autoencoder (VAE) with diffusion probabilistic techniques to perform sequence-to-sequence (seq2seq) stock prediction.
Our model is shown to outperform state-of-the-art solutions in terms of prediction accuracy and variance.
arXiv Detail & Related papers (2023-08-18T16:21:15Z)
- Variational Prediction [95.00085314353436]
We present a technique for learning a variational approximation to the posterior predictive distribution using a variational bound.
This approach can provide good predictive distributions without test time marginalization costs.
arXiv Detail & Related papers (2023-07-14T18:19:31Z)
- Prediction-Oriented Bayesian Active Learning [51.426960808684655]
Expected predictive information gain (EPIG) is an acquisition function that measures information gain in the space of predictions rather than parameters.
EPIG leads to stronger predictive performance compared with BALD across a range of datasets and models.
arXiv Detail & Related papers (2023-04-17T10:59:57Z)
- Deep Learning Enhanced Realized GARCH [6.211385208178938]
We propose a new approach to volatility modeling by combining deep learning (LSTM) and realized volatility measures.
This LSTM-enhanced realized GARCH framework incorporates and distills modeling advances from financial econometrics, high-frequency trading data, and deep learning.
arXiv Detail & Related papers (2023-02-16T00:20:43Z)
- Volatility forecasting using Deep Learning and sentiment analysis [0.0]
This paper presents a composite model that merges a deep learning approach with sentiment analysis for predicting market volatility.
We then describe a composite forecasting model, a Long Short-Term Memory (LSTM) neural network that uses historical sentiment and the previous day's volatility to make forecasts (a minimal sketch of this kind of pipeline appears after this list).
arXiv Detail & Related papers (2022-10-22T14:54:33Z)
- DeepVol: Volatility Forecasting from High-Frequency Data with Dilated Causal Convolutions [53.37679435230207]
We propose DeepVol, a model based on Dilated Causal Convolutions that uses high-frequency data to forecast day-ahead volatility.
Our empirical results suggest that the proposed deep learning-based approach effectively learns global features from high-frequency data.
arXiv Detail & Related papers (2022-09-23T16:13:47Z)
- Forecasting Cryptocurrency Returns from Sentiment Signals: An Analysis of BERT Classifiers and Weak Supervision [6.624726878647541]
We introduce weak learning, a recently proposed NLP approach to address the problem that text data is unlabeled.
We confirm that finetuning using weak labels enhances the predictive value of text-based features and raises forecast accuracy in the context of predicting cryptocurrency returns.
More fundamentally, the modeling paradigm we present, weak labeling domain-specific text and finetuning pretrained NLP models, is universally applicable in (financial) forecasting.
arXiv Detail & Related papers (2022-04-06T07:45:05Z)
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
- A Sentiment Analysis Approach to the Prediction of Market Volatility [62.997667081978825]
We explored the relationship between sentiment extracted from financial news and tweets and movements of the FTSE100.
The sentiment captured from news headlines could be used as a signal to predict market returns; the same does not apply to volatility.
We developed an accurate classifier for the prediction of market volatility in response to the arrival of new information.
arXiv Detail & Related papers (2020-12-10T01:15:48Z)
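The entry above on dynamic loss weighting describes minimizing a composite loss that trades off forecast accuracy against rolling-origin instability, with the weights adapted during training. The snippet below is only a schematic Python rendering of that idea under simple assumptions (squared error for both terms, a toy weight-update heuristic); the function names and the update rule are illustrative, not the scheme from the paper.

```python
import numpy as np

def composite_loss(y_true, y_pred_new, y_pred_prev_origin, stability_weight):
    """Accuracy term plus a stability term that penalises revisions of the
    forecast for the same target period relative to the previous origin."""
    accuracy = np.mean((y_true - y_pred_new) ** 2)
    stability = np.mean((y_pred_new - y_pred_prev_origin) ** 2)
    return accuracy + stability_weight * stability, accuracy, stability

def update_stability_weight(weight, acc_hist, stab_hist, step=0.05):
    """Toy dynamic weighting rule: raise the stability weight when the
    stability term is improving more slowly than the accuracy term."""
    if len(acc_hist) < 2:
        return weight
    acc_rate = acc_hist[-1] / max(acc_hist[-2], 1e-12)
    stab_rate = stab_hist[-1] / max(stab_hist[-2], 1e-12)
    return max(0.0, weight + step * np.sign(stab_rate - acc_rate))

# One training-step worth of bookkeeping with made-up numbers.
total, acc, stab = composite_loss(
    y_true=np.array([1.0, 2.0]), y_pred_new=np.array([1.1, 1.8]),
    y_pred_prev_origin=np.array([0.9, 2.2]), stability_weight=0.3)
```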
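The electricity-price entry above evaluates CRPS learning, which weights component predictive distributions by their CRPS performance; that weighting scheme is not reproduced here. As a reference point, the snippet below gives the standard sample-based CRPS estimator, CRPS(F, y) ≈ mean|X − y| − 0.5·mean|X − X′| for ensemble members X, X′, which is the score such schemes ultimately try to minimize.

```python
import numpy as np

def crps_ensemble(samples, y):
    """Sample-based CRPS estimate for a scalar outcome y:
    CRPS(F, y) = E|X - y| - 0.5 * E|X - X'| with X, X' ~ F independent."""
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - y))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

# A sharper, better-centred predictive ensemble earns the lower (better) CRPS.
rng = np.random.default_rng(0)
y = 50.0
print(crps_ensemble(rng.normal(50, 2, size=500), y))
print(crps_ensemble(rng.normal(55, 8, size=500), y))
```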
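Several entries above, such as "Volatility forecasting using Deep Learning and sentiment analysis", combine daily sentiment with past volatility in an LSTM. The summaries give no architecture details, so the following PyTorch sketch is only a generic rendering of that kind of pipeline: a window of (sentiment, volatility) pairs mapped to a one-step-ahead volatility forecast. The class name, window length, hidden size, and feature set are assumptions, not any paper's specification.

```python
import torch
import torch.nn as nn

class SentimentVolLSTM(nn.Module):
    """Generic LSTM mapping a window of (sentiment_t, vol_t) pairs to a
    next-period volatility forecast; not the architecture of any paper above."""
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                  # x: (batch, window, 2)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # forecast from the last hidden state

# Toy usage with random tensors standing in for real sentiment/volatility windows.
model = SentimentVolLSTM()
x = torch.randn(8, 20, 2)                  # 8 samples, 20-day window, 2 features
loss = nn.functional.mse_loss(model(x), torch.rand(8, 1))
loss.backward()
```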
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences arising from its use.