Wasserstein GAN: Deep Generation applied on Bitcoins financial time series
- URL: http://arxiv.org/abs/2107.06008v1
- Date: Tue, 13 Jul 2021 11:59:05 GMT
- Title: Wasserstein GAN: Deep Generation applied on Bitcoins financial time series
- Authors: Rikli Samuel, Bigler Daniel Nico, Pfenninger Moritz, Osterrieder Joerg
- Abstract summary: We introduce in this paper a deep neural network called the WGAN-GP, a data-driven model that focuses on sample generation.
The WGAN-GP is supposed to learn the underlying structure of the input data, which, in our case, is Bitcoin.
The generated synthetic time series are visually indistinguishable from the real data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modeling financial time series is challenging due to their high volatility
and to unexpected events in the market. Most financial models and algorithms that try
to compensate for the lack of historical financial time series struggle to perform well
and are highly vulnerable to overfitting. As an alternative, we introduce in this paper
a deep neural network called the WGAN-GP, a data-driven model that focuses on sample
generation. The WGAN-GP consists of a generator and a discriminator function, both of
which use an LSTM architecture. The WGAN-GP is meant to learn the underlying structure
of the input data, which in our case is the Bitcoin price series. Bitcoin is unique in
its behavior; its prices fluctuate in a way that makes guessing the price trend nearly
impossible. Through adversarial training, the WGAN-GP should learn the underlying
structure of Bitcoin and generate samples that closely resemble the Bitcoin
distribution. The generated synthetic time series are visually indistinguishable from
the real data, but the numerical results show that the generated data are close to,
yet still distinguishable from, the real data distribution. The model mostly shows
stable learning behavior; however, there is room for optimization, which could be
achieved by adjusting the hyperparameters.
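To make the setup described in the abstract more concrete, the following is a minimal sketch of a WGAN-GP with an LSTM generator and an LSTM critic (discriminator), trained with the Wasserstein loss plus a gradient penalty. It is an illustration under assumptions, not the paper's actual implementation: the choice of PyTorch, the window length, layer sizes, penalty weight, and optimizer handling are all placeholders.

```python
# Minimal WGAN-GP sketch with LSTM generator/critic for a 1-D price/return series.
# Window length, hidden sizes, LAMBDA_GP and the training loop are illustrative
# assumptions, not the paper's reported hyperparameters.
import torch
import torch.nn as nn

SEQ_LEN, NOISE_DIM, HIDDEN, LAMBDA_GP = 30, 16, 64, 10.0

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(NOISE_DIM, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, 1)

    def forward(self, z):                      # z: (batch, SEQ_LEN, NOISE_DIM)
        h, _ = self.lstm(z)
        return self.out(h)                     # (batch, SEQ_LEN, 1) synthetic series

class Critic(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(1, HIDDEN, batch_first=True)
        self.score = nn.Linear(HIDDEN, 1)

    def forward(self, x):                      # x: (batch, SEQ_LEN, 1)
        h, _ = self.lstm(x)
        return self.score(h[:, -1])            # one "realness" score per sequence

def gradient_penalty(critic, real, fake):
    """WGAN-GP term: push the critic's gradient norm towards 1 on interpolates."""
    eps = torch.rand(real.size(0), 1, 1, device=real.device)
    inter = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grads = torch.autograd.grad(critic(inter).sum(), inter, create_graph=True)[0]
    return ((grads.flatten(1).norm(2, dim=1) - 1) ** 2).mean()

def train_step(gen, critic, opt_g, opt_c, real):
    # Critic step: minimize E[D(fake)] - E[D(real)] + gradient penalty.
    z = torch.randn(real.size(0), SEQ_LEN, NOISE_DIM, device=real.device)
    fake = gen(z).detach()
    loss_c = critic(fake).mean() - critic(real).mean() \
             + LAMBDA_GP * gradient_penalty(critic, real, fake)
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()

    # Generator step: make the critic score generated sequences higher.
    z = torch.randn(real.size(0), SEQ_LEN, NOISE_DIM, device=real.device)
    loss_g = -critic(gen(z)).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_c.item(), loss_g.item()
```

In practice the critic is usually updated several times per generator step, and Adam with small learning rates is the common optimizer choice for WGAN-GP; the paper's exact training schedule and hyperparameters are not restated here.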
Related papers
- Generalization of Graph Neural Networks is Robust to Model Mismatch [84.01980526069075]
Graph neural networks (GNNs) have demonstrated their effectiveness in various tasks supported by their generalization capabilities.
In this paper, we examine GNNs that operate on geometric graphs generated from manifold models.
Our analysis reveals that GNN generalization remains robust in the presence of such model mismatch.
arXiv Detail & Related papers (2024-08-25T16:00:44Z)
- Comparative Study of Bitcoin Price Prediction [0.0]
We employ five-fold cross-validation to enhance generalization and utilize L2 regularization to reduce overfitting and noise.
Our study demonstrates that GRU models offer better accuracy than LSTM models for predicting Bitcoin's price.
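For orientation only, here is a minimal sketch of the kind of pipeline this summary describes: a GRU regressor with an L2 penalty (via weight decay) evaluated with five-fold cross-validation. Everything here, including the use of PyTorch and scikit-learn, the feature layout, and all hyperparameters, is an assumption for illustration rather than a detail taken from that paper.

```python
# Hypothetical GRU regressor with L2 regularization (weight decay) and 5-fold CV.
# Feature construction and hyperparameters are illustrative assumptions only.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.model_selection import KFold

class GRURegressor(nn.Module):
    def __init__(self, n_features=5, hidden=32):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                       # x: (batch, window, n_features)
        h, _ = self.gru(x)
        return self.head(h[:, -1]).squeeze(-1)  # next-step price (or return) estimate

def cross_validate(X, y, epochs=50):
    """Five-fold CV; weight_decay supplies the L2 penalty mentioned above."""
    fold_mse = []
    for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        model = GRURegressor(n_features=X.shape[-1])
        opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
        x_tr = torch.tensor(X[train_idx], dtype=torch.float32)
        y_tr = torch.tensor(y[train_idx], dtype=torch.float32)
        x_va = torch.tensor(X[val_idx], dtype=torch.float32)
        y_va = torch.tensor(y[val_idx], dtype=torch.float32)
        for _ in range(epochs):
            opt.zero_grad()
            F.mse_loss(model(x_tr), y_tr).backward()
            opt.step()
        with torch.no_grad():
            fold_mse.append(F.mse_loss(model(x_va), y_va).item())
    return float(np.mean(fold_mse))
```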
arXiv Detail & Related papers (2024-05-13T18:10:34Z)
- A Data-driven Deep Learning Approach for Bitcoin Price Forecasting [10.120972108960425]
We propose a shallow Bidirectional-LSTM (Bi-LSTM) model to forecast bitcoin closing prices in a daily time frame.
We compare the performance with that of other forecasting methods, and show that with the help of the proposed feature engineering method, a shallow deep neural network outperforms other popular price forecasting models.
arXiv Detail & Related papers (2023-10-27T10:35:47Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
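As a rough illustration of what a linear latent prior can look like (not KoVAE's actual code), the sketch below models latent dynamics as z_{t+1} ≈ A z_t with Gaussian noise; the dimensions, the noise model, and the class name are assumptions made for this example.

```python
# Illustrative Koopman-style linear latent prior: z_{t+1} ~ N(A z_t, sigma^2 I).
# Dimensions and the Gaussian noise model are assumptions for the sketch.
import torch
import torch.nn as nn

class LinearLatentPrior(nn.Module):
    def __init__(self, latent_dim=8):
        super().__init__()
        self.A = nn.Linear(latent_dim, latent_dim, bias=False)  # learned linear (Koopman-like) map
        self.log_sigma = nn.Parameter(torch.zeros(latent_dim))

    def log_prob(self, z):                     # z: (batch, T, latent_dim) from an encoder
        pred = self.A(z[:, :-1])               # predicted next latents under linear dynamics
        dist = torch.distributions.Normal(pred, self.log_sigma.exp())
        return dist.log_prob(z[:, 1:]).sum(dim=(1, 2))  # sequence log-likelihood under the prior
```

In a sequence VAE, a term like this would take the place of the usual i.i.d. standard-normal prior inside the ELBO.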
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Motif-aware temporal GCN for fraud detection in signed cryptocurrency trust networks [8.82136716762572]
Graph convolutional networks (GCNs) are used for processing data that can be represented as graphs.
In this study, we consider the evolving nature of cryptocurrency networks and use local structural information as well as balance theory to guide the training process.
Experimental results on bitcoin-alpha and bitcoin-otc datasets show that the proposed model outperforms those in the literature.
arXiv Detail & Related papers (2022-11-22T02:03:27Z)
- Graph Generative Model for Benchmarking Graph Neural Networks [73.11514658000547]
We introduce a novel graph generative model that learns and reproduces the distribution of real-world graphs in a privacy-controlled way.
Our model can successfully generate privacy-controlled, synthetic substitutes of large-scale real-world graphs that can be effectively used to benchmark GNN models.
arXiv Detail & Related papers (2022-07-10T06:42:02Z)
- Ethereum Fraud Detection with Heterogeneous Graph Neural Networks [3.5819974193845328]
Graph analysis algorithms and machine learning techniques detect suspicious transactions that lead to phishing in large transaction networks.
Many graph neural network (GNN) models have been proposed to apply deep learning techniques to graph structures.
We compare the performance of GNN models on an actual transaction network dataset with reported phishing labels.
arXiv Detail & Related papers (2022-03-23T12:35:59Z)
- Discovering Invariant Rationales for Graph Neural Networks [104.61908788639052]
Intrinsic interpretability of graph neural networks (GNNs) means finding a small subset of the input graph's features that guides the model prediction.
We propose a new strategy of discovering invariant rationale (DIR) to construct intrinsically interpretable GNNs.
arXiv Detail & Related papers (2022-01-30T16:43:40Z)
- Low-Rank Temporal Attention-Augmented Bilinear Network for financial time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
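To illustrate the generic parameter-saving idea behind such low-rank approximations (this is not the paper's exact TABL layer), a dense d_in x d_out weight matrix can be replaced by two thin rank-r factors, reducing the parameter count from d_in*d_out to roughly r*(d_in+d_out):

```python
# Generic low-rank factorization of a linear layer: W ≈ U @ V with rank r.
# This illustrates the parameter-reduction idea only, not the TABL architecture.
import torch.nn as nn

class LowRankLinear(nn.Module):
    def __init__(self, d_in, d_out, rank):
        super().__init__()
        self.U = nn.Linear(d_in, rank, bias=False)  # d_in * rank parameters
        self.V = nn.Linear(rank, d_out)             # rank * d_out (+ bias) parameters

    def forward(self, x):
        return self.V(self.U(x))                    # same mapping class as a rank-r dense layer

# Example: a 200 -> 200 dense layer has 40k weights; a rank-10 factorization uses ~4k.
layer = LowRankLinear(200, 200, rank=10)
```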
arXiv Detail & Related papers (2021-07-05T10:15:23Z)
- Efficient Robustness Certificates for Discrete Data: Sparsity-Aware Randomized Smoothing for Graphs, Images and More [85.52940587312256]
We propose a model-agnostic certificate based on the randomized smoothing framework which subsumes earlier work and is tight, efficient, and sparsity-aware.
We show the effectiveness of our approach on a wide variety of models, datasets, and tasks -- specifically highlighting its use for Graph Neural Networks.
arXiv Detail & Related papers (2020-08-29T10:09:02Z)
- Forecasting Bitcoin closing price series using linear regression and neural networks models [4.17510581764131]
We study how to forecast the daily closing price series of Bitcoin using price and volume data from prior days.
We followed different approaches in parallel, implementing both statistical techniques and machine learning algorithms.
arXiv Detail & Related papers (2020-01-04T21:04:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.