Quantifying Cryptocurrency Unpredictability: A Comprehensive Study of Complexity and Forecasting
- URL: http://arxiv.org/abs/2502.09079v1
- Date: Thu, 13 Feb 2025 08:53:13 GMT
- Title: Quantifying Cryptocurrency Unpredictability: A Comprehensive Study of Complexity and Forecasting
- Authors: Francesco Puoti, Fabrizio Pittorino, Manuel Roveri
- Abstract summary: We explore the cryptocurrency time-series forecasting task, focusing on the USD exchange rates of Litecoin, Binance Coin, Bitcoin, Ethereum, and XRP.
Results reveal that cryptocurrency time-series exhibit characteristics closely resembling those of Brownian noise.
The application of a wide range of statistical, machine and deep learning models for time-series forecasting demonstrates the low predictability of cryptocurrencies.
- Score: 3.724847012963521
- Abstract: This paper offers a thorough examination of the univariate predictability of cryptocurrency time-series. By exploiting a combination of complexity measures and model predictions, we explore the cryptocurrency time-series forecasting task, focusing on the USD exchange rates of Litecoin, Binance Coin, Bitcoin, Ethereum, and XRP. On one hand, to assess the complexity and randomness of these time-series, a comparative analysis has been performed using Brownian and colored noises as a benchmark. The results obtained from the Complexity-Entropy causality plane and power density spectrum analysis reveal that cryptocurrency time-series exhibit characteristics closely resembling those of Brownian noise when analyzed in a univariate context. On the other hand, the application of a wide range of statistical, machine learning, and deep learning models for time-series forecasting demonstrates the low predictability of cryptocurrencies. Notably, our analysis reveals that simpler models, such as Naive models, consistently outperform the more complex machine and deep learning ones in forecasting accuracy across different forecast horizons and time windows. The combined study of complexity and forecasting accuracy highlights the difficulty of predicting the cryptocurrency market. These findings provide valuable insights into the inherent characteristics of cryptocurrency data and highlight the need to reassess the challenges associated with predicting cryptocurrency price movements.
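The Complexity-Entropy causality plane mentioned above is built on Bandt-Pompe permutation entropy. A minimal sketch of the normalized permutation entropy (the plane's entropy axis) follows; the `order` and `delay` defaults are common illustrative choices, not necessarily the paper's settings:

```python
from itertools import permutations
from math import log

def permutation_entropy(series, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy, in [0, 1]."""
    counts = {p: 0 for p in permutations(range(order))}
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = [series[i + j * delay] for j in range(order)]
        # Ordinal pattern: the argsort of the window values
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    probs = [c / n for c in counts.values() if c > 0]
    entropy = -sum(p * log(p) for p in probs)
    return entropy / log(len(counts))  # normalize by log(order!)
```

A monotone series produces a single ordinal pattern and entropy 0, while white noise approaches 1; the position of a series between these extremes, together with a statistical complexity measure, locates it on the plane.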
Related papers
- CryptoMamba: Leveraging State Space Models for Accurate Bitcoin Price Prediction [28.15955243872829]
We propose CryptoMamba, a novel Mamba-based State Space Model (SSM) architecture designed to capture long-range dependencies in financial time-series data.
Our experiments show that CryptoMamba not only provides more accurate predictions but also offers enhanced generalizability across different market conditions.
Our findings suggest a substantial advantage for SSMs in stock and cryptocurrency price forecasting tasks.
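Mamba builds on the discrete linear state-space recurrence below. This is a generic sketch of that building block, not the CryptoMamba architecture itself, with illustrative matrices:

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run a discrete linear state-space model over a scalar input sequence:
    x[t+1] = A @ x[t] + B * u[t],  y[t] = C @ x[t+1]."""
    x = np.zeros(A.shape[0])
    ys = []
    for u_t in u:
        x = A @ x + B * u_t   # state update carries long-range memory
        ys.append(float(C @ x))
    return ys
```

For example, a one-dimensional state with decay 0.5 turns the output into an exponentially weighted sum of past inputs, which is how SSMs capture long-range dependencies.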
arXiv Detail & Related papers (2025-01-02T02:16:56Z) - Multi-Source Hard and Soft Information Fusion Approach for Accurate Cryptocurrency Price Movement Prediction [5.885853464728419]
We introduce a novel approach termed hard and soft information fusion (HSIF) to enhance the accuracy of cryptocurrency price movement forecasts.
Our model achieves about 96.8% accuracy in predicting price movement.
Incorporating soft information enables our model to capture the influence of social sentiment on price fluctuations.
arXiv Detail & Related papers (2024-09-27T16:32:57Z) - Review of deep learning models for crypto price prediction: implementation and evaluation [5.240745112593501]
We review the literature on deep learning for cryptocurrency price forecasting and evaluate novel deep learning models for cryptocurrency price prediction.
Our deep learning models include variants of long short-term memory (LSTM) recurrent neural networks, variants of convolutional neural networks (CNNs) and the Transformer model.
We also carry out volatility analysis on the four cryptocurrencies which reveals significant fluctuations in their prices throughout the COVID-19 pandemic.
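A volatility analysis like the one described is commonly computed as the annualized rolling standard deviation of log returns. A sketch, with a window length and annualization factor that are conventional choices rather than the paper's:

```python
import math

def rolling_volatility(prices, window=30, periods_per_year=365):
    """Annualized rolling (population) std of log returns."""
    rets = [math.log(prices[i] / prices[i - 1]) for i in range(1, len(prices))]
    vols = []
    for i in range(window, len(rets) + 1):
        w = rets[i - window:i]
        mean = sum(w) / window
        var = sum((r - mean) ** 2 for r in w) / window
        vols.append(math.sqrt(var) * math.sqrt(periods_per_year))
    return vols
```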
arXiv Detail & Related papers (2024-05-19T03:15:27Z) - Financial Time-Series Forecasting: Towards Synergizing Performance And Interpretability Within a Hybrid Machine Learning Approach [2.0213537170294793]
This paper proposes a comparative study of hybrid machine learning algorithms, with a focus on enhancing model interpretability.
For interpretability, we carry out a systematic overview of preprocessing techniques for time-series statistics, including decomposition, the autocorrelation function, and triple exponential smoothing, which aim to uncover latent relations and complex patterns in financial time-series forecasting.
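Of the preprocessing tools listed, the sample autocorrelation function can be sketched in a few lines; this is a standard textbook estimator, not the paper's implementation:

```python
def acf(series, max_lag=10):
    """Sample autocorrelation at lags 0..max_lag (biased estimator)."""
    n = len(series)
    mean = sum(series) / n
    c0 = sum((x - mean) ** 2 for x in series) / n  # lag-0 autocovariance
    out = []
    for lag in range(max_lag + 1):
        c = sum((series[t] - mean) * (series[t + lag] - mean)
                for t in range(n - lag)) / n
        out.append(c / c0)
    return out
```

Slowly decaying autocorrelations indicate persistence (as in random-walk-like prices), while a strictly alternating series shows a large negative lag-1 value.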
arXiv Detail & Related papers (2023-12-31T16:38:32Z) - Hawkes-based cryptocurrency forecasting via Limit Order Book data [1.6236898718152877]
We present a novel prediction algorithm using limit order book (LOB) data rooted in the Hawkes model.
Our approach offers a precise forecast of return signs by leveraging predictions of future financial interactions.
The efficacy of our approach is validated through Monte Carlo simulations across 50 scenarios.
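The Hawkes model at the root of this approach defines a conditional intensity: a baseline rate plus exponentially decaying self-excitation from past events. A minimal sketch with illustrative parameter values (the branching ratio alpha/beta < 1 keeps the process stable):

```python
import math

def hawkes_intensity(t, event_times, mu=0.5, alpha=0.8, beta=1.2):
    """lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))."""
    return mu + alpha * sum(math.exp(-beta * (t - t_i))
                            for t_i in event_times if t_i < t)
```

Each past event temporarily raises the arrival rate of new events, which is how Hawkes models capture the clustering of order-book activity.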
arXiv Detail & Related papers (2023-12-21T16:31:07Z) - Diffusion Variational Autoencoder for Tackling Stochasticity in Multi-Step Regression Stock Price Prediction [54.21695754082441]
Multi-step stock price prediction over a long-term horizon is crucial for forecasting its volatility.
Current solutions to multi-step stock price prediction are mostly designed for single-step, classification-based predictions.
We combine a deep hierarchical variational autoencoder (VAE) and diffusion probabilistic techniques to perform sequence-to-sequence stock prediction.
Our model is shown to outperform state-of-the-art solutions in terms of its prediction accuracy and variance.
arXiv Detail & Related papers (2023-08-18T16:21:15Z) - Predicting the State of Synchronization of Financial Time Series using
Cross Recurrence Plots [75.20174445166997]
This study introduces a new method for predicting the future state of synchronization of the dynamics of two financial time series.
We adopt a deep learning framework to address the prediction of the synchronization state.
We find that predicting the state of synchronization of two time series is in general rather difficult, but attainable with very satisfactory performance for certain pairs of stocks.
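A cross recurrence plot is a binary matrix marking the times at which the states of two series come within a threshold of each other. A simplified scalar-state sketch (the threshold value is illustrative; embedded state vectors are used in practice):

```python
def cross_recurrence(x, y, eps=0.1):
    """CR[i][j] = 1 if |x[i] - y[j]| <= eps (scalar-state simplification)."""
    return [[1 if abs(xi - yj) <= eps else 0 for yj in y] for xi in x]
```

Diagonal line structures in this matrix indicate stretches where the two series evolve in synchrony.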
arXiv Detail & Related papers (2022-10-26T10:22:28Z) - Bayesian Bilinear Neural Network for Predicting the Mid-price Dynamics
in Limit-Order Book Markets [84.90242084523565]
Traditional time-series econometric methods often appear incapable of capturing the true complexity of the multi-level interactions driving the price dynamics.
By adopting a state-of-the-art second-order optimization algorithm, we train a Bayesian bilinear neural network with temporal attention.
Using predictive distributions to analyze the errors and uncertainties associated with the estimated parameters and model forecasts, we thoroughly compare our Bayesian model with traditional ML alternatives.
arXiv Detail & Related papers (2022-03-07T18:59:54Z) - Low-Rank Temporal Attention-Augmented Bilinear Network for financial
time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
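The low-rank idea is to replace a dense m-by-n weight matrix W with a product of two rank-r factors, cutting the parameter count from m*n to r*(m+n). A generic truncated-SVD sketch of this factorization, not the paper's specific tensor decomposition:

```python
import numpy as np

def low_rank_factors(W, r):
    """Best rank-r factorization W ~ U @ V via truncated SVD."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    # Fold the singular values into the left factor
    return U[:, :r] * s[:r], Vt[:r, :]
```

For a 3x2 rank-1 matrix, the two factors hold 3 + 2 = 5 parameters instead of 6, and the gap widens rapidly for larger layers.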
arXiv Detail & Related papers (2021-07-05T10:15:23Z) - Synergetic Learning of Heterogeneous Temporal Sequences for
Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point processes models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z) - Understanding Neural Abstractive Summarization Models via Uncertainty [54.37665950633147]
Seq2seq abstractive summarization models generate text in a free-form manner.
We study the entropy, or uncertainty, of the model's token-level predictions.
We show that uncertainty is a useful perspective for analyzing summarization and text generation models more broadly.
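Token-level uncertainty here is the Shannon entropy of the model's predictive distribution over the vocabulary at each decoding step. A minimal sketch of that quantity:

```python
from math import log

def token_entropy(probs):
    """Shannon entropy (in nats) of one token's predictive distribution."""
    return -sum(p * log(p) for p in probs if p > 0)
```

A peaked distribution (the model is confident) yields entropy near 0, while a uniform distribution over k tokens yields the maximum, log(k).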
arXiv Detail & Related papers (2020-10-15T16:57:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.