Forecasting VIX using interpretable Kolmogorov-Arnold networks
- URL: http://arxiv.org/abs/2502.00980v1
- Date: Mon, 03 Feb 2025 01:24:02 GMT
- Title: Forecasting VIX using interpretable Kolmogorov-Arnold networks
- Authors: So-Yoon Cho, Sungchul Lee, Hyun-Gyoon Kim
- Abstract summary: This paper presents the use of Kolmogorov-Arnold Networks (KANs) for forecasting the CBOE Volatility Index (VIX)
Unlike traditional MLP-based neural networks, which are often criticized for their black-box nature, KAN offers an interpretable approach via learnable spline-based activation functions and symbolification.
- Abstract: This paper presents the use of Kolmogorov-Arnold Networks (KANs) for forecasting the CBOE Volatility Index (VIX). Unlike traditional MLP-based neural networks, which are often criticized for their black-box nature, KAN offers an interpretable approach via learnable spline-based activation functions and symbolification. Based on a parsimonious architecture with symbolic functions, KAN expresses a forecast of the VIX as a closed-form expression in terms of explanatory variables, and provides interpretable insights into key characteristics of the VIX, including mean reversion and the leverage effect. Through in-depth empirical analysis across multiple datasets and periods, we show that KANs achieve competitive forecasting performance while requiring significantly fewer parameters than MLP-based neural network models. Our findings demonstrate the capacity and potential of KAN as an interpretable financial time-series forecasting method.
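The spline-based mechanism the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it uses piecewise-linear interpolation (`np.interp`) in place of the B-splines used in practice, a fixed knot grid on [-1, 1], and no training loop; the class and parameter names are illustrative. The key point it shows is that each edge carries its own learnable univariate function, so the layer is a sum of univariate functions rather than a weight matrix.

```python
import numpy as np

class KANLayer:
    """Sketch of one KAN layer: output y_j = sum_i phi_ij(x_i),
    where each phi_ij is a learnable univariate spline (here:
    piecewise-linear, parameterized by its values at shared knots)."""

    def __init__(self, n_in, n_out, n_knots=8, seed=0):
        rng = np.random.default_rng(seed)
        self.knots = np.linspace(-1.0, 1.0, n_knots)  # shared knot grid
        # One coefficient vector per edge (i, j): phi_ij's values at the knots.
        self.coef = rng.normal(0.0, 0.1, size=(n_in, n_out, n_knots))

    def __call__(self, x):
        # x: (batch, n_in) -> (batch, n_out)
        batch, n_in = x.shape
        n_out = self.coef.shape[1]
        y = np.zeros((batch, n_out))
        for i in range(n_in):
            for j in range(n_out):
                # Apply the edge's univariate function elementwise.
                y[:, j] += np.interp(x[:, i], self.knots, self.coef[i, j])
        return y

layer = KANLayer(n_in=3, n_out=2)
out = layer(np.zeros((4, 3)))
print(out.shape)  # (4, 2)
```

Because each `phi_ij` is a one-dimensional curve, it can be plotted or fitted against a symbolic candidate (log, exp, polynomial, etc.), which is the route to the closed-form forecasts the abstract mentions.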
Related papers
- NBMLSS: probabilistic forecasting of electricity prices via Neural Basis Models for Location Scale and Shape [44.99833362998488]
We deploy a Neural Basis Model for Location, Scale and Shape, which blends the principled interpretability of GAMLSS with a computationally scalable shared basis decomposition.
Experiments have been conducted on multiple market regions, achieving probabilistic forecasting performance comparable to that of distributional neural networks.
arXiv Detail & Related papers (2024-11-21T08:17:53Z) - Kolmogorov-Arnold Networks for Time Series: Bridging Predictive Power and Interpretability [6.4314326272535896]
Kolmogorov-Arnold Networks (KAN) is a groundbreaking model recently proposed by the MIT team.
T-KAN is designed to detect concept drift within time series and can explain the nonlinear relationships between predictions and previous time steps.
MT-KAN, on the other hand, improves predictive performance by effectively uncovering and leveraging the complex relationships among variables.
arXiv Detail & Related papers (2024-06-04T17:14:31Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - Smooth Kolmogorov Arnold networks enabling structural knowledge representation [0.0]
Kolmogorov-Arnold Networks (KANs) offer an efficient and interpretable alternative to traditional multi-layer perceptron (MLP) architectures.
By leveraging inherent structural knowledge, KANs may reduce the data required for training and mitigate the risk of generating hallucinated predictions.
arXiv Detail & Related papers (2024-05-18T15:27:14Z) - Kolmogorov-Arnold Networks (KANs) for Time Series Analysis [6.932243286441558]
We introduce a novel application of Kolmogorov-Arnold Networks (KANs) to time series forecasting.
Inspired by the Kolmogorov-Arnold representation theorem, KANs replace traditional linear weights with spline-parametrized univariate functions.
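The representation theorem behind this line writes a multivariate function as sums of univariate functions: f(x) = sum_q Phi_q(sum_p phi_qp(x_p)). A hand-built instance (not a learned KAN, and not from any of the papers above) makes the structure concrete: for positive inputs, the product x1*x2 has exactly this form with inner functions phi = log and a single outer function Phi = exp.

```python
import numpy as np

# Kolmogorov-Arnold form: outer univariate function(s) applied to
# sums of inner univariate functions. Worked instance for x1, x2 > 0:
#   x1 * x2 = exp(log(x1) + log(x2))
def ka_product(x1, x2):
    inner = np.log(x1) + np.log(x2)  # sum of univariate inner functions
    return np.exp(inner)             # one univariate outer function

print(ka_product(3.0, 4.0))  # 12.0 (up to float rounding)
```

A KAN learns the inner and outer univariate functions as splines instead of fixing them by hand, which is why it can later be "symbolified" back into expressions like this one.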
We demonstrate that KANs outperform conventional Multi-Layer Perceptrons (MLPs) in a real-world satellite traffic forecasting task.
arXiv Detail & Related papers (2024-05-14T17:38:17Z) - Interpretable Social Anchors for Human Trajectory Forecasting in Crowds [84.20437268671733]
We propose a neural network-based system to predict human trajectory in crowds.
We learn interpretable rule-based intents, and then utilise the expressibility of neural networks to model the scene-specific residual.
Our architecture is tested on the interaction-centric benchmark TrajNet++.
arXiv Detail & Related papers (2021-05-07T09:22:34Z) - Neural basis expansion analysis with exogenous variables: Forecasting electricity prices with NBEATSx [12.31979377566269]
We study the utility of the NBEATSx model in electricity price forecasting tasks across a broad range of years and markets.
We observe state-of-the-art performance, significantly improving the forecast accuracy by nearly 20% over the original NBEATS model.
The proposed neural network has an interpretable configuration that can structurally decompose time series.
arXiv Detail & Related papers (2021-04-12T14:47:55Z) - Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Probabilistic Graph Attention Network with Conditional Kernels for Pixel-Wise Prediction [158.88345945211185]
We present a novel approach that advances the state of the art on pixel-level prediction in a fundamental aspect, i.e. structured multi-scale features learning and fusion.
We propose a probabilistic graph attention network structure based on a novel Attention-Gated Conditional Random Fields (AG-CRFs) model for learning and fusing multi-scale representations in a principled manner.
arXiv Detail & Related papers (2021-01-08T04:14:29Z) - Lipschitz Recurrent Neural Networks [100.72827570987992]
We show that our Lipschitz recurrent unit is more robust with respect to input and parameter perturbations as compared to other continuous-time RNNs.
Our experiments demonstrate that the Lipschitz RNN can outperform existing recurrent units on a range of benchmark tasks.
arXiv Detail & Related papers (2020-06-22T08:44:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.