Feature selection and regression methods for stock price prediction
using technical indicators
- URL: http://arxiv.org/abs/2310.09903v4
- Date: Mon, 6 Nov 2023 15:50:48 GMT
- Title: Feature selection and regression methods for stock price prediction
using technical indicators
- Authors: Fatemeh Moodi, Amir Jahangard-Rafsanjani, Sajad Zarifzadeh
- Abstract summary: This study uses technical indicators together with feature selection and regression methods to predict the stock market closing price.
A suitable combination of the suggested indicators with regression methods results in high accuracy in predicting the closing price.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Because many factors, including technical indicators, influence stock
price prediction, feature selection is important for choosing the best indicators.
This study uses technical indicators together with feature selection and regression
methods to predict the stock market closing price. The aim of this research is to
predict the stock market price with the least error. With the proposed method, data
created by a 3-day time window were converted into suitable input for the regression
methods. In this paper, 10 regressors and 123 technical indicators were examined on
the last 13 years of Apple stock data. The results were assessed with 5 error-based
evaluation criteria. Based on the results of the proposed method, MLPSF performed
56.47% better than MLP. SVRSF improved by 67.42% over SVR. LRSF improved by 76.7%
over LR. RISF improved by 72.82% over Ridge regression. DTRSB improved by 24.23%
over DTR. KNNSB improved by 15.52% over KNN regression. RFSB improved by 6% over RF.
GBRSF improved by 7% over GBR. Finally, ADASF and ADASB each improved by 4% over ADA
regression. Ridge and LinearRegression gave the best results for stock price
prediction. Based on the results, the best indicators for predicting the stock price
are Squeeze_pro, Percentage Price Oscillator, Thermo, Decay, Archer On-Balance
Volume, Bollinger Bands, Squeeze, and the Ichimoku indicator. According to the
results, a suitable combination of the suggested indicators with regression methods
results in high accuracy in predicting the closing price.
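The abstract describes the pipeline only at a high level, so the sketch below illustrates one plausible realization: a couple of hand-computed indicators (PPO and Bollinger %B as stand-ins for the 123 indicators), a 3-day time window, forward sequential feature selection wrapped around Ridge (one of the 10 regressors), and error-based evaluation. It is a minimal sketch assuming scikit-learn/pandas and a synthetic price series in place of the Apple data, not the authors' implementation.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic closing-price series standing in for the 13 years of Apple data.
rng = np.random.default_rng(0)
close = pd.Series(100 + rng.normal(0, 1, 1000).cumsum(), name="close")

# A few example technical indicators (stand-ins for the 123 used in the paper).
feats = pd.DataFrame({"close": close})
ema12, ema26 = close.ewm(span=12).mean(), close.ewm(span=26).mean()
feats["ppo"] = 100 * (ema12 - ema26) / ema26                 # Percentage Price Oscillator
ma20, sd20 = close.rolling(20).mean(), close.rolling(20).std()
feats["bb_pctb"] = (close - (ma20 - 2 * sd20)) / (4 * sd20)  # Bollinger Bands %B
feats["sma10"] = close.rolling(10).mean()

# 3-day time window: the last 3 days of every indicator become the regression input.
window = 3
lagged = pd.concat([feats.shift(k).add_suffix(f"_lag{k}") for k in range(window)], axis=1)
data = lagged.copy()
data["target"] = close.shift(-1)      # next day's closing price
data = data.dropna()
X, y = data.drop(columns="target"), data["target"]
split = int(0.8 * len(data))          # chronological train/test split
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

# Forward sequential feature selection wrapped around Ridge (one of the 10 regressors).
selector = SequentialFeatureSelector(Ridge(), n_features_to_select=5, direction="forward")
model = make_pipeline(StandardScaler(), selector, Ridge())
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

# Error-based evaluation criteria (three of the five as examples).
print("MAE :", mean_absolute_error(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("R2  :", r2_score(y_te, pred))
```

Swapping `direction="forward"` for `"backward"` mirrors the SF/SB naming used in the reported results.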
Related papers
- Electricity Price Prediction Using Multi-Kernel Gaussian Process Regression Combined with Kernel-Based Support Vector Regression [0.0]
The paper presents a new hybrid model for predicting German electricity prices.
The algorithm is based on combining Gaussian Process Regression (GPR) and Support Vector Regression (SVR).
arXiv Detail & Related papers (2024-11-28T10:32:50Z)
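The entry above names the two components but not how they are combined, so the following sketch simply averages a Gaussian Process regressor and an SVR on synthetic data; the multi-kernel design and the paper's actual combination rule are not reproduced, and scikit-learn is an assumed library choice.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 200).reshape(-1, 1)           # stand-in driver (e.g. hour of day)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)   # synthetic "electricity price"

split = 160                                           # chronological train/test split
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X[:split], y[:split])
svr = SVR(kernel="rbf", C=10.0).fit(X[:split], y[:split])

# Illustrative combination: an equal-weight average of the two regressors' forecasts.
pred = 0.5 * gpr.predict(X[split:]) + 0.5 * svr.predict(X[split:])
print("test MSE:", np.mean((pred - y[split:]) ** 2))
```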
- Gated recurrent neural network with TPE Bayesian optimization for enhancing stock index prediction accuracy [0.0]
The aim is to improve the prediction accuracy of the next day's closing price of the NIFTY 50 index, a prominent Indian stock market index.
A combination of eight influential factors is carefully chosen from fundamental stock data, technical indicators, crude oil price, and macroeconomic data to train the models.
arXiv Detail & Related papers (2024-06-02T06:39:01Z)
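Since the entry only names the ingredients (a GRU forecaster tuned with TPE Bayesian optimization), the sketch below shows how the two are typically wired together with Optuna and PyTorch on synthetic sequences; the NIFTY 50 features, the search space, and the training budget are all illustrative assumptions.

```python
import numpy as np
import optuna
import torch
import torch.nn as nn

def make_data(n=500, window=8, features=8):
    """Synthetic stand-in for the eight influential input factors."""
    rng = np.random.default_rng(0)
    series = rng.normal(size=(n, features)).cumsum(axis=0)
    X = np.stack([series[i:i + window] for i in range(n - window)])
    y = series[window:, 0]                              # next-step value of the first factor
    X = torch.tensor(X, dtype=torch.float32)
    y = torch.tensor(y, dtype=torch.float32)
    split = int(0.8 * len(X))
    return X[:split], y[:split], X[split:], y[split:]

class GRUForecaster(nn.Module):
    def __init__(self, n_features, hidden):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.gru(x)
        return self.head(out[:, -1, :]).squeeze(-1)

def objective(trial):
    hidden = trial.suggest_int("hidden", 8, 64)
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)
    X_tr, y_tr, X_va, y_va = make_data()
    model = GRUForecaster(X_tr.shape[-1], hidden)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(50):                                  # short training budget for the sketch
        opt.zero_grad()
        loss = loss_fn(model(X_tr), y_tr)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return loss_fn(model(X_va), y_va).item()         # validation MSE minimized by TPE

study = optuna.create_study(direction="minimize", sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=20)
print(study.best_params)
```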
- Diffusion Variational Autoencoder for Tackling Stochasticity in Multi-Step Regression Stock Price Prediction [54.21695754082441]
Multi-step stock price prediction over a long-term horizon is crucial for forecasting its volatility.
Current solutions to multi-step stock price prediction are mostly designed for single-step, classification-based predictions.
We combine a deep hierarchical variational-autoencoder (VAE) and diffusion probabilistic techniques to do seq2seq stock prediction.
Our model is shown to outperform state-of-the-art solutions in terms of its prediction accuracy and variance.
arXiv Detail & Related papers (2023-08-18T16:21:15Z)
- Using a Deep Learning Model to Simulate Human Stock Trader's Methods of Chart Analysis [0.276240219662896]
The proposed scheme looks at stock prices of the previous 600 days and predicts whether the stock price will rise or fall 10% or 20% within the next D days.
Applied to the Korean market, the proposed method gave a return of 75.36% with a Sharpe ratio of 1.57, exceeding the market return and Sharpe ratio by 36% and 0.61, respectively.
On the US market it gives a total return of 27.17% with a Sharpe ratio of 0.61, outperforming benchmarks such as the NASDAQ, S&P 500, and Dow Jones indices by 17.69% and 0.27, respectively.
arXiv Detail & Related papers (2023-04-28T14:27:18Z)
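The chart-analysis entry above frames prediction as whether the price will rise or fall by 10% or 20% within the next D days; the minimal pandas sketch below illustrates that labelling rule on a synthetic series. The 600-day chart input and the network itself are omitted, and the horizon and threshold values are illustrative.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
close = pd.Series(100 * np.exp(rng.normal(0, 0.02, 2000).cumsum()))

D, threshold = 20, 0.10                            # horizon in days and move size
future_max = close.rolling(D).max().shift(-D)      # highest close over the next D days
future_min = close.rolling(D).min().shift(-D)      # lowest close over the next D days

label = pd.Series("hold", index=close.index)
label[future_max / close - 1 >= threshold] = "rise_10pct"
label[future_min / close - 1 <= -threshold] = "fall_10pct"  # fall overrides rise if both occur
print(label.value_counts())
```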
- Performance Evaluation of Regression Models in Predicting the Cost of Medical Insurance [0.0]
Three regression models in machine learning, namely Linear Regression, Gradient Boosting, and Support Vector Machine, were used.
Performance is evaluated using RMSE (Root Mean Square Error), R² (R-squared), and k-fold cross-validation.
arXiv Detail & Related papers (2023-04-25T06:33:49Z)
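As a concrete view of the evaluation protocol just described (RMSE, R-squared, and k-fold cross-validation over the three regressors), here is a minimal scikit-learn sketch; the synthetic data stands in for the medical-insurance data set, which is not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=500, n_features=6, noise=10.0, random_state=0)
cv = KFold(n_splits=10, shuffle=True, random_state=0)

for name, model in [("Linear Regression", LinearRegression()),
                    ("Gradient Boosting", GradientBoostingRegressor(random_state=0)),
                    ("SVR", SVR())]:
    rmse = -cross_val_score(model, X, y, cv=cv, scoring="neg_root_mean_squared_error").mean()
    r2 = cross_val_score(model, X, y, cv=cv, scoring="r2").mean()
    print(f"{name:18s} RMSE={rmse:8.2f}  R2={r2:.3f}")
```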
- Feature Selection with Annealing for Forecasting Financial Time Series [2.44755919161855]
This study provides a comprehensive method for forecasting financial time series based on tactical input output feature mapping techniques using machine learning (ML) models.
Experiments indicate that the FSA algorithm increased the performance of ML models, regardless of problem type.
arXiv Detail & Related papers (2023-03-03T21:33:38Z)
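Feature selection with annealing (FSA) is usually described as alternating gradient updates with a schedule that gradually drops the smallest-magnitude coefficients; the simplified sketch below illustrates that idea for plain linear regression. The schedule, step size, and data are illustrative assumptions rather than the paper's exact FSA variant.

```python
import numpy as np

def fsa_linear(X, y, k=5, iters=100, lr=0.1, mu=5.0):
    """Fit a linear model while annealing the retained feature set down to k features."""
    n, p = X.shape
    w = np.zeros(p)
    keep = np.arange(p)                       # indices of currently retained features
    for e in range(1, iters + 1):
        grad = X[:, keep].T @ (X[:, keep] @ w[keep] - y) / n
        w[keep] -= lr * grad
        # Annealing schedule: the number of retained features shrinks from p toward k.
        m_e = k + int((p - k) * max(0.0, (iters - 2 * e) / (2 * e * mu + iters)))
        top = np.argsort(-np.abs(w[keep]))[:m_e]
        dropped = np.setdiff1d(keep, keep[top])
        w[dropped] = 0.0
        keep = np.sort(keep[top])
    return w, keep

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 100))
true_w = np.zeros(100)
true_w[:5] = [3.0, -2.0, 1.5, -1.0, 0.5]      # only the first five features matter
y = X @ true_w + 0.1 * rng.normal(size=300)

w, selected = fsa_linear(X, y, k=5)
print("selected features:", selected)
```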
- Doubly Robust Distributionally Robust Off-Policy Evaluation and Learning [59.02006924867438]
Off-policy evaluation and learning (OPE/L) use offline observational data to make better decisions.
Recent work proposed distributionally robust OPE/L (DROPE/L) to remedy this, but the proposal relies on inverse-propensity weighting.
We propose the first DR algorithms for DROPE/L with KL-divergence uncertainty sets.
arXiv Detail & Related papers (2022-02-19T20:00:44Z)
- Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates [68.09049111171862]
This work focuses on quantifying, reducing and analyzing regression errors in the NLP model updates.
We formulate the regression-free model updates into a constrained optimization problem.
We empirically analyze how model ensemble reduces regression.
arXiv Detail & Related papers (2021-05-07T03:33:00Z)
- SLOE: A Faster Method for Statistical Inference in High-Dimensional Logistic Regression [68.66245730450915]
We develop an improved method for debiasing predictions and estimating frequentist uncertainty for practical datasets.
Our main contribution is SLOE, an estimator of the signal strength with convergence guarantees that reduces the computation time of estimation and inference by orders of magnitude.
arXiv Detail & Related papers (2021-03-23T17:48:56Z)
- Feature Learning for Stock Price Prediction Shows a Significant Role of Analyst Rating [0.38073142980733]
A set of 5 technical indicators and 23 fundamental indicators was identified to establish the possibility of generating excess returns on the stock market.
From any given day, we were able to predict the direction of change in price by 1% up to 10 days in the future.
The predictions had an overall accuracy of 83.62% with a precision of 85% for buy signals and a recall of 100% for sell signals.
arXiv Detail & Related papers (2021-03-13T03:56:29Z)
- A Sentiment Analysis Approach to the Prediction of Market Volatility [62.997667081978825]
We have explored the relationship between sentiment extracted from financial news and tweets and FTSE100 movements.
The sentiment captured from news headlines could be used as a signal to predict market returns; the same does not apply for volatility.
We developed an accurate classifier for the prediction of market volatility in response to the arrival of new information.
arXiv Detail & Related papers (2020-12-10T01:15:48Z)
- Deep Stock Predictions [58.720142291102135]
We consider the design of a trading strategy that performs portfolio optimization using Long Short Term Memory (LSTM) neural networks.
We then customize the loss function used to train the LSTM to increase the profit earned.
We find the LSTM model with the customized loss function to have improved performance in the trading bot over a regressive baseline such as ARIMA.
arXiv Detail & Related papers (2020-06-08T23:37:47Z)
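The entry above customizes the LSTM's loss to favour profit rather than raw accuracy. Since the exact customization is not spelled out here, the sketch below uses an illustrative stand-in that adds a penalty whenever the predicted and realized returns disagree in sign, with PyTorch as the assumed framework and synthetic returns as the data.

```python
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, 600).astype(np.float32)   # synthetic daily returns
window = 20
X = torch.tensor(np.stack([returns[i:i + window] for i in range(len(returns) - window)]))
y = torch.tensor(returns[window:])
X = X.unsqueeze(-1)                                      # shape: (batch, seq_len, 1)

class LSTMForecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)

def profit_aware_loss(pred, target, alpha=0.5):
    """MSE plus a hinge-style penalty for getting the direction of the return wrong."""
    mse = torch.mean((pred - target) ** 2)
    wrong_direction = torch.relu(-pred * target)         # positive only when signs disagree
    return mse + alpha * wrong_direction.mean()

model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(100):
    opt.zero_grad()
    loss = profit_aware_loss(model(X), y)
    loss.backward()
    opt.step()
print("final training loss:", loss.item())
```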
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.