Application of Convolutional Neural Networks with Quasi-Reversibility
Method Results for Option Forecasting
- URL: http://arxiv.org/abs/2208.14385v1
- Date: Thu, 25 Aug 2022 04:08:59 GMT
- Title: Application of Convolutional Neural Networks with Quasi-Reversibility
Method Results for Option Forecasting
- Authors: Zheng Cao, Wenyu Du and Kirill V. Golubnichiy
- Abstract summary: We create and evaluate new empirical mathematical models for the Black-Scholes equation to analyze data for 92,846 companies.
We solve the Black-Scholes (BS) equation forwards in time as an ill-posed inverse problem, using the Quasi-Reversibility Method (QRM), to predict option prices one trading day ahead.
The current stage of research combines QRM with Convolutional Neural Networks (CNN), which learn information across a large number of data points simultaneously.
- Score: 11.730033307068405
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a novel way to apply mathematical finance and machine
learning (ML) to forecast stock options prices. Following results from the
paper Quasi-Reversibility Method and Neural Network Machine Learning to
Solution of Black-Scholes Equations (which appeared in AMS Contemporary
Mathematics), we create and evaluate new empirical mathematical models
for the Black-Scholes equation to analyze data for 92,846 companies. We solve
the Black-Scholes (BS) equation forwards in time as an ill-posed inverse
problem, using the Quasi-Reversibility Method (QRM), to predict option prices
one trading day into the future. For each company, we have 13 input elements,
including daily stock and option prices, volatility, and the QRM minimizer.
Because the market is too complex for any single model to capture perfectly, we apply ML to train algorithms
to make the best prediction. The current stage of research combines QRM with
Convolutional Neural Networks (CNN), which learn information across a large
number of data points simultaneously. We implement CNN to generate new results
by validating and testing on sample market data. We test different ways of
applying CNN and compare our CNN models with previous models to see if
achieving a higher profit rate is possible.
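For context, the forward problem the abstract refers to can be stated in terms of the standard Black-Scholes PDE; the regularized functional shown second is only a schematic of how QRM-type methods stabilize the ill-posed forward-in-time solve, not the exact functional used in the cited paper.

```latex
% Standard Black-Scholes PDE for an option price u(s,t), stock price s,
% volatility \sigma, and risk-free rate r:
\frac{\partial u}{\partial t}
  + \frac{\sigma^{2} s^{2}}{2}\,\frac{\partial^{2} u}{\partial s^{2}}
  + r s\,\frac{\partial u}{\partial s} - r u = 0.

% Schematic QRM/Tikhonov-style functional for the ill-posed forward solve
% (the paper's actual functional may differ in its terms and constraints):
J_{\beta}(u) = \lVert L u \rVert_{L_{2}}^{2} + \beta\,\lVert u \rVert_{H^{2}}^{2},
\qquad
L u := \frac{\partial u}{\partial t}
  + \frac{\sigma^{2} s^{2}}{2}\,\frac{\partial^{2} u}{\partial s^{2}}.
```

A minimal sketch of how a CNN might consume the 13 per-company elements to regress the next-day option price is given below; the layer sizes are illustrative assumptions and this is not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): a small 1-D CNN mapping the 13
# per-company input elements (stock/option prices, volatility, QRM minimizer, ...)
# to a next-day option-price estimate. Layer sizes are assumptions.
import torch
import torch.nn as nn

class OptionCNN(nn.Module):
    def __init__(self, n_features: int = 13):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1),  # treat the 13 elements as a 1-D signal
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.head = nn.Linear(32 * n_features, 1)  # regression head: next-day option price

    def forward(self, x):                 # x: (batch, 13)
        z = self.conv(x.unsqueeze(1))     # -> (batch, 32, 13)
        return self.head(z.flatten(1))    # -> (batch, 1)

model = OptionCNN()
pred = model(torch.randn(8, 13))          # 8 companies' feature vectors -> 8 predictions
```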
Related papers
- Attribute-to-Delete: Machine Unlearning via Datamodel Matching [65.13151619119782]
Machine unlearning -- efficiently removing the influence of a small "forget set" of training data from a pre-trained machine learning model -- has recently attracted interest.
Recent research shows that existing machine unlearning techniques do not hold up under challenging evaluation settings.
arXiv Detail & Related papers (2024-10-30T17:20:10Z) - Pricing American Options using Machine Learning Algorithms [0.0]
This study investigates the application of machine learning algorithms to pricing American options using Monte Carlo simulations.
Traditional models, such as the Black-Scholes-Merton framework, often fail to adequately address the complexities of American options.
Machine learning is applied by leveraging Monte Carlo methods in conjunction with the Least Squares Method (a generic least-squares Monte Carlo sketch follows below).
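The combination of Monte Carlo simulation with least-squares regression is commonly implemented as the Longstaff-Schwartz algorithm; the sketch below is a generic illustration of that idea for an American put, with all parameter values assumed rather than taken from the paper.

```python
# Generic least-squares Monte Carlo (Longstaff-Schwartz) sketch for an American put.
# All parameter values are illustrative assumptions.
import numpy as np

S0, K, r, sigma, T, steps, n_paths = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 20_000
dt = T / steps
rng = np.random.default_rng(0)

# Simulate geometric Brownian motion paths.
z = rng.standard_normal((n_paths, steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))
S = np.hstack([np.full((n_paths, 1), S0), S])

payoff = np.maximum(K - S[:, -1], 0.0)          # value if held to maturity
for t in range(steps - 1, 0, -1):
    itm = K - S[:, t] > 0                       # regress only on in-the-money paths
    if itm.any():
        X = S[itm, t]
        y = payoff[itm] * np.exp(-r * dt)       # discounted continuation cashflows
        coeffs = np.polyfit(X, y, 2)            # quadratic basis for continuation value
        continuation = np.polyval(coeffs, X)
        exercise = K - X
        payoff[itm] = np.where(exercise > continuation, exercise, y)
        payoff[~itm] *= np.exp(-r * dt)
    else:
        payoff *= np.exp(-r * dt)

price = np.exp(-r * dt) * payoff.mean()
print(f"Estimated American put price: {price:.3f}")
```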
arXiv Detail & Related papers (2024-09-05T02:52:11Z) - GraphCNNpred: A stock market indices prediction using a Graph based deep learning system [0.0]
We give a graph neural network based convolutional neural network (CNN) model that can be applied to diverse sources of data, in an attempt to extract features for predicting the trends of the S&P 500, NASDAQ, DJI, NYSE, and RUSSELL indices.
Experiments show that the associated models improve prediction performance over the baseline algorithms on all indices by about 4% to 15% in terms of F-measure.
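The GraphCNNpred architecture itself is not reproduced here; the snippet below only illustrates the basic graph-convolution building block such models are typically assembled from, with the toy graph and feature sizes being assumptions.

```python
# A single graph-convolution layer of the common form H' = ReLU(D^-1/2 (A+I) D^-1/2 H W),
# shown on toy data; not the GraphCNNpred model itself.
import numpy as np

def gcn_layer(A, H, W):
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

rng = np.random.default_rng(0)
A = (rng.random((5, 5)) > 0.5).astype(float)      # toy graph over 5 market nodes
A = np.maximum(A, A.T)                            # symmetrize
H = rng.standard_normal((5, 8))                   # 8-dimensional node features
W = rng.standard_normal((8, 4))                   # weights (random here, learned in practice)
print(gcn_layer(A, H, W).shape)                   # -> (5, 4)
```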
arXiv Detail & Related papers (2024-07-04T09:14:24Z) - A Study on Stock Forecasting Using Deep Learning and Statistical Models [3.437407981636465]
This paper reviews several deep learning algorithms for stock price forecasting. We use a record of S&P 500 index data for training and testing.
It discusses various models, including the autoregressive integrated moving average (ARIMA) model, the recurrent neural network model, the long short-term memory (LSTM) model, the convolutional neural network model, and the fully convolutional neural network model.
arXiv Detail & Related papers (2024-02-08T16:45:01Z) - Predicting Stock Market Time-Series Data using CNN-LSTM Neural Network
Model [0.0]
Predicting the stock market performance of a company is difficult because stock prices change constantly rather than staying fixed.
To track the patterns and features of the data, a CNN-LSTM neural network can be built (a minimal sketch follows below).
The accuracy of the CNN-LSTM NN model is found to be high even when trained on real-time stock market data.
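The sketch below shows the generic CNN-LSTM pattern (convolution for local feature extraction, LSTM for temporal dependencies); the window length, feature count, and layer sizes are assumptions, not the paper's configuration.

```python
# Hypothetical CNN-LSTM sketch for sequence regression on daily price windows;
# the 30-day window, 5 features, and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, n_features: int = 5, hidden: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),  # local pattern extraction
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(32, hidden, batch_first=True)          # temporal dependencies
        self.head = nn.Linear(hidden, 1)                           # next-day price estimate

    def forward(self, x):                    # x: (batch, time, features)
        z = self.conv(x.transpose(1, 2))     # -> (batch, 32, time)
        out, _ = self.lstm(z.transpose(1, 2))
        return self.head(out[:, -1])         # use the last time step

model = CNNLSTM()
pred = model(torch.randn(16, 30, 5))         # 16 samples of 30-day, 5-feature windows
```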
arXiv Detail & Related papers (2023-05-21T08:00:23Z) - nanoLM: an Affordable LLM Pre-training Benchmark via Accurate Loss Prediction across Scales [65.01417261415833]
We present an approach to predict the pre-training loss based on our observations that Maximal Update Parametrization (muP) enables accurate fitting of scaling laws.
With around 14% of the one-time pre-training cost, we can accurately forecast the loss for models up to 52B.
Our goal with nanoLM is to empower researchers with limited resources to reach meaningful conclusions on large models.
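Forecasting the loss of a large model from small runs amounts to fitting and extrapolating a scaling law; the sketch below shows only that generic curve-fitting step with made-up data points, not the muP-based nanoLM procedure itself.

```python
# Generic scaling-law extrapolation sketch: fit L(N) = a * N^(-b) + c on small models
# and forecast the loss of a larger one. Data points are fabricated for illustration.
import numpy as np
from scipy.optimize import curve_fit

def loss_law(N, a, b, c):
    return a * N ** (-b) + c

# Hypothetical (model size in parameters, validation loss) pairs from small runs.
sizes = np.array([1e7, 3e7, 1e8, 3e8, 1e9])
losses = np.array([4.10, 3.72, 3.35, 3.05, 2.80])

params, _ = curve_fit(loss_law, sizes, losses, p0=(10.0, 0.1, 2.0), maxfev=10_000)
print("forecast loss at 52B params:", loss_law(52e9, *params))
```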
arXiv Detail & Related papers (2023-04-14T00:45:01Z) - Augmented Bilinear Network for Incremental Multi-Stock Time-Series
Classification [83.23129279407271]
We propose a method to efficiently retain the knowledge available in a neural network pre-trained on a set of securities.
In our method, the prior knowledge encoded in a pre-trained neural network is maintained by keeping existing connections fixed.
This knowledge is adjusted for the new securities by a set of augmented connections, which are optimized using the new data.
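A minimal sketch of the "keep existing connections fixed, train only augmented connections" idea is given below; it is a generic illustration, not the paper's bilinear architecture, and the layer sizes are assumptions.

```python
# Generic sketch: frozen pre-trained backbone plus a small trainable augmentation
# optimized on data from new securities. Not the Augmented Bilinear Network itself.
import torch
import torch.nn as nn

pretrained = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
for p in pretrained.parameters():
    p.requires_grad = False                      # prior knowledge stays fixed

augment = nn.Linear(10, 1)                       # new connections for the new securities

def forward(x):
    return pretrained(x) + augment(x)            # frozen backbone + trainable augmentation

opt = torch.optim.Adam(augment.parameters(), lr=1e-3)   # only augmented weights are optimized
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss = nn.functional.mse_loss(forward(x), y)
loss.backward()
opt.step()
```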
arXiv Detail & Related papers (2022-07-23T18:54:10Z) - Datamodels: Predicting Predictions from Training Data [86.66720175866415]
We present a conceptual framework, datamodeling, for analyzing the behavior of a model class in terms of the training data.
We show that even simple linear datamodels can successfully predict model outputs.
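The core idea of a linear datamodel can be illustrated by regressing a model output onto 0/1 indicators of which training examples were included in each run; the data below are synthetic stand-ins, not results from the paper.

```python
# Toy linear datamodel: regress a model output on inclusion indicators of training
# examples across many (synthetic) training runs.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_runs = 50, 500
masks = rng.integers(0, 2, size=(n_runs, n_train)).astype(float)   # random training subsets

true_influence = rng.standard_normal(n_train) * 0.1                # hidden ground truth
outputs = masks @ true_influence + 0.01 * rng.standard_normal(n_runs)  # output per run

# Fit the linear datamodel by least squares (regularized variants are common in practice).
theta, *_ = np.linalg.lstsq(masks, outputs, rcond=None)
print("recovered vs. true influence correlation:",
      np.corrcoef(theta, true_influence)[0, 1])
```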
arXiv Detail & Related papers (2022-02-01T18:15:24Z) - Neural Capacitance: A New Perspective of Neural Network Selection via
Edge Dynamics [85.31710759801705]
Current practice requires expensive computational costs in model training for performance prediction.
We propose a novel framework for neural network selection by analyzing the governing dynamics over synaptic connections (edges) during training.
Our framework is built on the fact that back-propagation during neural network training is equivalent to the dynamical evolution of synaptic connections.
arXiv Detail & Related papers (2022-01-11T20:53:15Z) - Low-bit Quantization of Recurrent Neural Network Language Models Using
Alternating Direction Methods of Multipliers [67.688697838109]
This paper presents a novel method to train quantized RNNLMs from scratch using the alternating direction method of multipliers (ADMM).
Experiments on two tasks suggest the proposed ADMM quantization achieved a model size compression factor of up to 31 times over the full precision baseline RNNLMs.
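The general shape of ADMM-based quantization can be sketched as alternating a penalized gradient step on the loss, a projection onto the quantized value set, and a dual update; the toy problem and ternary levels below are assumptions, not the paper's RNNLM training setup.

```python
# Minimal ADMM-style quantization sketch on a toy least-squares problem:
# minimize f(W) subject to W taking low-bit values. Not the paper's RNNLM code.
import numpy as np

rng = np.random.default_rng(0)
X, y = rng.standard_normal((200, 10)), rng.standard_normal(200)
W = np.zeros(10)                      # full-precision weights
G = np.zeros(10)                      # quantized copy
lam = np.zeros(10)                    # scaled dual variable
rho, lr, levels = 1.0, 0.01, np.array([-0.5, 0.0, 0.5])   # ternary levels (assumed)

def project(w):                       # snap each weight to the nearest quantization level
    return levels[np.argmin(np.abs(w[:, None] - levels[None, :]), axis=1)]

for _ in range(500):
    grad = X.T @ (X @ W - y) / len(y) + rho * (W - G + lam)   # W-step: penalized loss gradient
    W -= lr * grad
    G = project(W + lam)                                      # G-step: projection onto levels
    lam += W - G                                              # dual update

print("quantized weights:", G)
```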
arXiv Detail & Related papers (2021-11-29T09:30:06Z) - Learning Multiple Stock Trading Patterns with Temporal Routing Adaptor
and Optimal Transport [8.617532047238461]
We propose a novel architecture, Temporal Routing Adaptor (TRA), to empower existing stock prediction models with the ability to model multiple stock trading patterns.
TRA is a lightweight module that consists of a set of independent predictors for learning multiple patterns, as well as a router that dispatches samples to different predictors (a generic sketch follows below).
We show that the proposed method can improve the information coefficient (IC) from 0.053 to 0.059 and from 0.051 to 0.056, respectively.
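The sketch below shows only the generic "set of predictors plus a learned router" pattern with soft routing; the optimal-transport regularization and memory used by the real TRA module are omitted, and all sizes are assumptions.

```python
# Generic "set of predictors + router" sketch in the spirit of TRA (soft routing shown;
# the actual module uses optimal transport and hard dispatching).
import torch
import torch.nn as nn

class RoutedPredictors(nn.Module):
    def __init__(self, in_dim: int = 16, n_predictors: int = 3):
        super().__init__()
        self.predictors = nn.ModuleList(nn.Linear(in_dim, 1) for _ in range(n_predictors))
        self.router = nn.Linear(in_dim, n_predictors)     # scores each predictor per sample

    def forward(self, x):                                  # x: (batch, in_dim)
        preds = torch.stack([p(x) for p in self.predictors], dim=-1)  # (batch, 1, K)
        weights = torch.softmax(self.router(x), dim=-1).unsqueeze(1)  # (batch, 1, K)
        return (preds * weights).sum(-1)                   # (batch, 1)

model = RoutedPredictors()
out = model(torch.randn(32, 16))                           # 32 samples routed over 3 predictors
```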
arXiv Detail & Related papers (2021-06-24T12:19:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.