Axial-LOB: High-Frequency Trading with Axial Attention
- URL: http://arxiv.org/abs/2212.01807v1
- Date: Sun, 4 Dec 2022 12:11:03 GMT
- Title: Axial-LOB: High-Frequency Trading with Axial Attention
- Authors: Damian Kisiel, Denise Gorse
- Abstract summary: Axial-LOB is a novel fully-attentional deep learning architecture for predicting price movements of stocks from limit order book (LOB) data.
Our architecture is able to construct feature maps that incorporate global interactions, while significantly reducing the size of the parameter space.
The effectiveness of Axial-LOB is demonstrated on a large benchmark dataset, containing time series representations of millions of high-frequency trading events.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Previous attempts to predict stock price from limit order book (LOB) data are
mostly based on deep convolutional neural networks. Although convolutions offer
efficiency by restricting their operations to local interactions, this comes at
the cost of potentially missing long-range dependencies.
Recent studies address this problem by employing additional recurrent or
attention layers that increase computational complexity. In this work, we
propose Axial-LOB, a novel fully-attentional deep learning architecture for
predicting price movements of stocks from LOB data. By utilizing gated
position-sensitive axial attention layers our architecture is able to construct
feature maps that incorporate global interactions, while significantly reducing
the size of the parameter space. Unlike previous works, Axial-LOB does not rely
on hand-crafted convolutional kernels and hence has stable performance under
input permutations and the capacity to incorporate additional LOB features. The
effectiveness of Axial-LOB is demonstrated on a large benchmark dataset,
containing time series representations of millions of high-frequency trading
events, where our model establishes a new state of the art, achieving an
excellent directional classification performance at all tested prediction
horizons.
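The key idea is to factorize 2D self-attention into two 1D passes, one along each spatial axis, which cuts the cost from quadratic in the number of positions to quadratic in each axis length. The paper's gated, position-sensitive variant adds learned positional terms and gates on top of this; the sketch below is a minimal NumPy illustration of the plain axial factorization only, not the authors' implementation, and the shapes (time steps x LOB features x embedding) and identity query/key/value projections are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def axial_attention(x, axis):
    """Self-attention restricted to one spatial axis of a (H, W, D) feature map.

    Attending along rows and then columns costs O(H*W*(H+W)*D), versus
    O((H*W)**2 * D) for full 2D self-attention over all positions.
    """
    x = np.moveaxis(x, axis, 0)            # bring the attended axis to front: (L, B, D)
    q, k, v = x, x, x                      # identity projections, for brevity only
    scores = np.einsum('lbd,mbd->blm', q, k) / np.sqrt(x.shape[-1])
    out = np.einsum('blm,mbd->lbd', softmax(scores), v)
    return np.moveaxis(out, 0, axis)       # restore the original axis order

# Illustrative LOB-like input: 10 time steps, 40 features, embedding dim 8.
H, W, D = 10, 40, 8
x = np.random.rand(H, W, D)
y = axial_attention(axial_attention(x, 0), 1)   # attend along time, then features
print(y.shape)  # (10, 40, 8)
```

Because each output vector is a convex combination of the inputs along the attended axis, the values stay within the input's range; stacking such layers (with learned projections and gating, as in the paper) builds feature maps with global receptive fields at a fraction of the parameter count of full 2D attention.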
Related papers
- Influence Maximization via Graph Neural Bandits [54.45552721334886]
We set the IM problem in a multi-round diffusion campaign, aiming to maximize the number of distinct users that are influenced.
We propose the framework IM-GNB (Influence Maximization with Graph Neural Bandits), where we provide an estimate of the users' probabilities of being influenced.
arXiv Detail & Related papers (2024-06-18T17:54:33Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - IB-AdCSCNet:Adaptive Convolutional Sparse Coding Network Driven by Information Bottleneck [4.523653503622693]
We introduce IB-AdCSCNet, a deep learning model grounded in information bottleneck theory.
IB-AdCSCNet seamlessly integrates the information bottleneck trade-off strategy into deep networks.
Experimental results on CIFAR-10 and CIFAR-100 datasets demonstrate that IB-AdCSCNet not only matches the performance of deep residual convolutional networks but also outperforms them when handling corrupted data.
arXiv Detail & Related papers (2024-05-23T05:35:57Z) - Spatio-Temporal Attention Graph Neural Network for Remaining Useful Life Prediction [1.831835396047386]
This study presents the Spatio-Temporal Attention Graph Neural Network.
Our model combines graph neural networks and temporal convolutional neural networks for spatial and temporal feature extraction.
Comprehensive experiments were conducted on the C-MAPSS dataset to evaluate the impact of unified versus clustering normalization.
arXiv Detail & Related papers (2024-01-29T08:49:53Z) - A Generative Self-Supervised Framework using Functional Connectivity in fMRI Data [15.211387244155725]
Deep neural networks trained on Functional Connectivity (FC) networks extracted from functional Magnetic Resonance Imaging (fMRI) data have gained popularity.
Recent research on the application of Graph Neural Network (GNN) to FC suggests that exploiting the time-varying properties of the FC could significantly improve the accuracy and interpretability of the model prediction.
High cost of acquiring high-quality fMRI data and corresponding labels poses a hurdle to their application in real-world settings.
We propose a generative SSL approach that is tailored to effectively harness temporal information within dynamic FC.
arXiv Detail & Related papers (2023-12-04T16:14:43Z) - Graph Neural Processes for Spatio-Temporal Extrapolation [36.01312116818714]
We study the task of spatio-temporal extrapolation, which generates data at target locations from surrounding contexts in a graph.
Existing methods either use learning-based models like neural networks or statistical approaches like Gaussian Processes for this task.
We propose Spatio-Temporal Graph Neural Processes (STGNP), a neural latent variable model which commands these capabilities simultaneously.
arXiv Detail & Related papers (2023-05-30T03:55:37Z) - Dense Network Expansion for Class Incremental Learning [61.00081795200547]
State-of-the-art approaches use a dynamic architecture based on network expansion (NE), in which a task expert is added per task.
A new NE method, dense network expansion (DNE), is proposed to achieve a better trade-off between accuracy and model complexity.
It outperforms the previous SOTA methods by a margin of 4% in terms of accuracy, with similar or even smaller model scale.
arXiv Detail & Related papers (2023-03-22T16:42:26Z) - Efficient Graph Neural Network Inference at Large Scale [54.89457550773165]
Graph neural networks (GNNs) have demonstrated excellent performance in a wide range of applications.
Existing scalable GNNs leverage linear propagation to preprocess the features and accelerate the training and inference procedure.
We propose a novel adaptive propagation order approach that generates the personalized propagation order for each node based on its topological information.
arXiv Detail & Related papers (2022-11-01T14:38:18Z) - The Limit Order Book Recreation Model (LOBRM): An Extended Analysis [2.0305676256390934]
The limit order book (LOB) depicts the fine-grained demand and supply relationship for financial assets.
LOBRM was recently proposed to bridge this gap by synthesizing the LOB from trades and quotes (TAQ) data.
We extend the research on LOBRM and further validate its use in real-world application scenarios.
arXiv Detail & Related papers (2021-07-01T15:25:21Z) - Channelized Axial Attention for Semantic Segmentation [70.14921019774793]
We propose the Channelized Axial Attention (CAA) to seamlessly integrate channel attention and axial attention with reduced computational complexity.
Our CAA not only requires much less computation resources compared with other dual attention models such as DANet, but also outperforms the state-of-the-art ResNet-101-based segmentation models on all tested datasets.
arXiv Detail & Related papers (2021-01-19T03:08:03Z) - A Spatial-Temporal Attentive Network with Spatial Continuity for Trajectory Prediction [74.00750936752418]
We propose a novel model named spatial-temporal attentive network with spatial continuity (STAN-SC)
First, spatial-temporal attention mechanism is presented to explore the most useful and important information.
Second, we conduct a joint feature sequence based on the sequence and instant state information to make the generative trajectories keep spatial continuity.
arXiv Detail & Related papers (2020-03-13T04:35:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.