MSHyper: Multi-Scale Hypergraph Transformer for Long-Range Time Series Forecasting
- URL: http://arxiv.org/abs/2401.09261v1
- Date: Wed, 17 Jan 2024 15:12:11 GMT
- Title: MSHyper: Multi-Scale Hypergraph Transformer for Long-Range Time Series Forecasting
- Authors: Zongjiang Shang, Ling Chen
- Abstract summary: We propose a Multi-Scale Hypergraph Transformer (MSHyper) framework to promote more comprehensive pattern interaction modeling.
MSHyper achieves state-of-the-art performance, reducing prediction errors by an average of 8.73% and 7.15% over the best baseline in MSE and MAE, respectively.
- Score: 7.178309082582536
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Demystifying interactions between temporal patterns of different scales is
fundamental to precise long-range time series forecasting. However, previous
works lack the ability to model high-order interactions. To promote more
comprehensive pattern interaction modeling for long-range time series
forecasting, we propose a Multi-Scale Hypergraph Transformer (MSHyper)
framework. Specifically, a multi-scale hypergraph is introduced to provide
foundations for modeling high-order pattern interactions. Then, by treating
hyperedges as nodes, we also build a hyperedge graph to enhance hypergraph
modeling. In addition, a tri-stage message passing mechanism is introduced to
aggregate pattern information and learn the interaction strength between
temporal patterns of different scales. Extensive experiments on five real-world
datasets demonstrate that MSHyper achieves state-of-the-art performance,
reducing prediction errors by an average of 8.73% and 7.15% over the best
baseline in MSE and MAE, respectively.
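The abstract gives only a high-level description, but the tri-stage message passing it mentions can be illustrated with a generic hypergraph sketch: nodes pass information to the hyperedges that contain them, hyperedges exchange information over a hyperedge graph built by treating hyperedges as nodes, and hyperedges pass aggregated information back to the nodes. The following is a minimal sketch under these assumptions, not the authors' implementation; the module name, mean aggregation, residual updates, and incidence-matrix encoding are all illustrative choices.

```python
# Hypothetical sketch of tri-stage hypergraph message passing (illustrative only).
# h:         node features, shape (N, d)
# incidence: hypergraph incidence matrix, shape (N, E), 1.0 if node i belongs to hyperedge e
# edge_adj:  hyperedge-graph adjacency, shape (E, E), built by treating hyperedges as nodes
import torch
import torch.nn as nn


class TriStageMessagePassing(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.node_to_edge = nn.Linear(dim, dim)  # stage 1: nodes -> hyperedges
        self.edge_to_edge = nn.Linear(dim, dim)  # stage 2: hyperedges -> hyperedges
        self.edge_to_node = nn.Linear(dim, dim)  # stage 3: hyperedges -> nodes

    def forward(self, h: torch.Tensor, incidence: torch.Tensor, edge_adj: torch.Tensor) -> torch.Tensor:
        # Stage 1: each hyperedge averages the features of its member nodes.
        deg_e = incidence.sum(dim=0).clamp(min=1).unsqueeze(-1)          # (E, 1)
        e = torch.relu(self.node_to_edge((incidence.t() @ h) / deg_e))   # (E, d)

        # Stage 2: hyperedges exchange information over the hyperedge graph.
        deg_a = edge_adj.sum(dim=1).clamp(min=1).unsqueeze(-1)           # (E, 1)
        e = torch.relu(self.edge_to_edge((edge_adj @ e) / deg_a)) + e    # residual update

        # Stage 3: each node averages the messages of the hyperedges containing it.
        deg_n = incidence.sum(dim=1).clamp(min=1).unsqueeze(-1)          # (N, 1)
        return torch.relu(self.edge_to_node((incidence @ e) / deg_n)) + h


# Toy usage: 6 time-step nodes grouped into 3 hyperedges.
h = torch.randn(6, 16)
incidence = torch.tensor([[1, 0, 0], [1, 0, 0], [1, 1, 0],
                          [0, 1, 0], [0, 1, 1], [0, 0, 1]], dtype=torch.float32)
edge_adj = (incidence.t() @ incidence > 0).float()  # hyperedges sharing a node are connected
out = TriStageMessagePassing(16)(h, incidence, edge_adj)  # (6, 16)
```

In a full multi-scale model along these lines, the incidence matrix would group time steps into hyperedges at several temporal scales and the layer would be stacked with attention and prediction heads; the single layer above only illustrates the flow of messages.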
Related papers
- Ada-MSHyper: Adaptive Multi-Scale Hypergraph Transformer for Time Series Forecasting [5.431115840202783]
We propose Adaptive Multi-Scale Hypergraph Transformer (Ada-MSHyper) for time series forecasting.
Ada-MSHyper achieves state-of-the-art performance, reducing prediction errors by an average of 4.56%, 10.38%, and 4.97% in MSE for long-range, short-range, and ultra-long-range time series forecasting, respectively.
arXiv Detail & Related papers (2024-10-31T14:51:09Z)
- SPHINX: Structural Prediction using Hypergraph Inference Network [19.853413818941608]
We introduce Structural Prediction using Hypergraph Inference Network (SPHINX), a model that learns to infer a latent hypergraph structure in an unsupervised way.
We show that the recent advancement in k-subset sampling represents a suitable tool for producing discrete hypergraph structures (a generic sketch of this sampling step appears after this list).
The resulting model can generate the higher-order structure necessary for any modern hypergraph neural network.
arXiv Detail & Related papers (2024-10-04T07:49:57Z)
- MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and conditional discriminator to optimize prediction results at coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z)
- Interaction Event Forecasting in Multi-Relational Recursive HyperGraphs: A Temporal Point Process Approach [12.142292322071299]
This work addresses the problem of forecasting higher-order interaction events in multi-relational recursive hypergraphs.
The proposed model, Relational Recursive Hyperedge Temporal Point Process (RRHyperTPP), uses an encoder that learns a dynamic node representation based on the historical interaction patterns.
We have experimentally shown that our models perform better than previous state-of-the-art methods for interaction forecasting.
arXiv Detail & Related papers (2024-04-27T15:46:54Z)
- Hypergraph Transformer for Semi-Supervised Classification [50.92027313775934]
We propose a novel hypergraph learning framework, HyperGraph Transformer (HyperGT).
HyperGT uses a Transformer-based neural network architecture to effectively consider global correlations among all nodes and hyperedges.
It achieves comprehensive hypergraph representation learning by effectively incorporating global interactions while preserving local connectivity patterns.
arXiv Detail & Related papers (2023-12-18T17:50:52Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series data are often recorded over a short time period, which leaves a large gap between what deep models require and the limited, noisy time series that are available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Neural Higher-order Pattern (Motif) Prediction in Temporal Networks [9.717332900439432]
We propose the first model, named HIT, for higher-order pattern prediction in temporal hypergraphs.
HIT extracts the structural representation of a node triplet of interest on the temporal hypergraph and uses it to tell what type of, when, and why the interaction expansion could happen in this triplet.
arXiv Detail & Related papers (2021-06-10T20:42:41Z)
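For the k-subset sampling mentioned in the SPHINX entry above, one common generic realization is the Gumbel-top-k trick: perturb per-node scores with Gumbel noise and keep the k largest to obtain a size-k subset, here read as the membership of one latent hyperedge. The sketch below shows only this generic trick under that assumption; it is not the SPHINX code, and the function name and toy sizes are illustrative.

```python
# Hypothetical illustration of k-subset sampling via the Gumbel-top-k trick,
# used here to draw which k nodes belong to a latent hyperedge (not the SPHINX implementation).
import torch


def gumbel_topk_subset(logits: torch.Tensor, k: int) -> torch.Tensor:
    """Sample a size-k subset of indices with probabilities tilted by `logits`.

    Adding i.i.d. Gumbel(0, 1) noise to the logits and keeping the top-k indices
    draws a k-subset without replacement (the Gumbel-top-k trick).
    """
    gumbel = -torch.log(-torch.log(torch.rand_like(logits).clamp_min(1e-10)))
    topk = (logits + gumbel).topk(k).indices
    membership = torch.zeros_like(logits)
    membership[topk] = 1.0  # binary membership vector for one sampled hyperedge
    return membership


# Toy usage: score 8 candidate nodes, sample a hyperedge of size 3.
node_logits = torch.randn(8)
hyperedge = gumbel_topk_subset(node_logits, k=3)
print(hyperedge)  # e.g. tensor([0., 1., 0., 1., 0., 0., 1., 0.])
```

To train a scoring network end-to-end, this hard selection is typically paired with a relaxed top-k or a straight-through estimator so that gradients reach the logits.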
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.