Gated Res2Net for Multivariate Time Series Analysis
- URL: http://arxiv.org/abs/2009.11705v1
- Date: Sat, 19 Sep 2020 01:45:41 GMT
- Title: Gated Res2Net for Multivariate Time Series Analysis
- Authors: Chao Yang, Mingxing Jiang, Zhongwen Guo and Yuan Liu
- Abstract summary: We propose a backbone convolutional neural network, Gated Res2Net (GRes2Net), that combines a gating mechanism with Res2Net.
GRes2Net outperforms state-of-the-art methods, demonstrating its superiority.
- Score: 8.685598820025383
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate time series analysis is an important problem in data mining because of its widespread applications. With the growing amount of time series data available for training, deep neural networks are increasingly common in time series analysis. Res2Net, a recently proposed backbone, can further improve state-of-the-art networks because it strengthens multi-scale representation ability by connecting different groups of filters. However, Res2Net ignores the correlations between feature maps and lacks control over the information interaction process. To address this problem, we propose a backbone convolutional neural network that combines a gating mechanism with Res2Net, namely Gated Res2Net (GRes2Net), for multivariate time series analysis. The hierarchical residual-like connections are modulated by gates whose values are computed from the original feature maps, the previous output feature maps, and the next input feature maps, so the correlations between feature maps are captured more effectively. Through the gating mechanism, the network controls how information is passed along and can therefore better capture and exploit both the temporal information and the correlations between feature maps. We evaluate GRes2Net on four multivariate time series datasets: two for classification and two for forecasting. The results demonstrate that GRes2Net outperforms state-of-the-art methods, indicating its superiority.
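The abstract describes the gated hierarchical connections only at a high level, so the following is a minimal PyTorch sketch of how such a gated Res2Net-style block for multivariate time series could look. Everything here is an assumption for illustration: the class name GatedRes2NetBlock, the number of scale groups, the use of 1x1 convolutions with a sigmoid to form the gates, and the reading that the gate for group i is computed from the original group i-1 feature maps, the previous output, and the next input group. It is not the authors' released implementation.

import torch
import torch.nn as nn


class GatedRes2NetBlock(nn.Module):
    """Illustrative sketch (assumed design, not the paper's code): split the
    channels into `scales` groups; each group is convolved along time, and the
    hierarchical residual-like connection from the previous group's output is
    modulated by a sigmoid gate computed from the original feature maps, the
    previous output feature maps, and the next input feature maps."""

    def __init__(self, channels: int, scales: int = 4, kernel_size: int = 3):
        super().__init__()
        assert channels % scales == 0, "channels must be divisible by scales"
        self.scales = scales
        width = channels // scales
        pad = kernel_size // 2
        # One temporal convolution per group (the first group is passed through, as in Res2Net).
        self.convs = nn.ModuleList(
            [nn.Conv1d(width, width, kernel_size, padding=pad) for _ in range(scales - 1)]
        )
        # Each gate maps [original group, previous output, next input group] to gate values.
        self.gates = nn.ModuleList(
            [nn.Conv1d(3 * width, width, kernel_size=1) for _ in range(scales - 1)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        groups = torch.chunk(x, self.scales, dim=1)
        outputs = [groups[0]]          # first group: identity, as in Res2Net
        prev = groups[0]
        for i in range(1, self.scales):
            # Gate computed from the original, previous output, and next input feature maps.
            gate = torch.sigmoid(
                self.gates[i - 1](torch.cat([groups[i - 1], prev, groups[i]], dim=1))
            )
            y = self.convs[i - 1](groups[i] + gate * prev)  # gated hierarchical connection
            outputs.append(y)
            prev = y
        return torch.cat(outputs, dim=1) + x                # residual over the whole block


# Usage sketch: a batch of 8 series with 64 feature channels and 128 time steps.
block = GatedRes2NetBlock(channels=64, scales=4)
print(block(torch.randn(8, 64, 128)).shape)                 # torch.Size([8, 64, 128])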
Related papers
- Mining of Switching Sparse Networks for Missing Value Imputation in Multivariate Time Series [7.872208477823466]
MissNet is designed to exploit temporal dependency with a state-space model and inter-correlation by switching sparse networks.
Our algorithm, which scales linearly with the length of the data, alternately infers networks and fills in missing values using them.
arXiv Detail & Related papers (2024-09-16T02:08:33Z) - Multivariate Time-Series Anomaly Detection based on Enhancing Graph Attention Networks with Topological Analysis [31.43159668073136]
Unsupervised anomaly detection in time series is essential in industrial applications, as it significantly reduces the need for manual intervention.
Traditional methods use Graph Neural Networks (GNNs) or Transformers to analyze spatial dependencies, while RNNs model temporal dependencies.
This paper introduces TopoGDN, a novel temporal model for multivariate time series anomaly detection built on an enhanced Graph Attention Network (GAT).
arXiv Detail & Related papers (2024-08-23T14:06:30Z) - TCCT-Net: Two-Stream Network Architecture for Fast and Efficient Engagement Estimation via Behavioral Feature Signals [58.865901821451295]
We present a novel two-stream feature fusion "Tensor-Convolution and Convolution-Transformer Network" (TCCT-Net) architecture.
To better learn meaningful patterns in the temporal-spatial domain, we design a "CT" stream that integrates a hybrid convolutional-transformer module.
In parallel, to efficiently extract rich patterns from the temporal-frequency domain, we introduce a "TC" stream that uses Continuous Wavelet Transform (CWT) to represent information in a 2D tensor form.
arXiv Detail & Related papers (2024-04-15T06:01:48Z) - MTS2Graph: Interpretable Multivariate Time Series Classification with
Temporal Evolving Graphs [1.1756822700775666]
We introduce a new framework for interpreting time series data by extracting and clustering representative patterns from the input.
We run experiments on eight datasets of the UCR/UEA archive, along with HAR and PAM datasets.
arXiv Detail & Related papers (2023-06-06T16:24:27Z) - Networked Time Series Imputation via Position-aware Graph Enhanced
Variational Autoencoders [31.953958053709805]
We design a new model named PoGeVon, which leverages a variational autoencoder (VAE) to predict missing values over both node time series features and graph structures.
Experiment results demonstrate the effectiveness of our model over baselines.
arXiv Detail & Related papers (2023-05-29T21:11:34Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z) - Graph Attention Recurrent Neural Networks for Correlated Time Series
Forecasting -- Full version [16.22449727526222]
We consider a setting where multiple entities interact with each other over time and the time-varying statuses of the entities are represented as correlated time series.
To enable accurate forecasting on correlated time series, we propose graph attention recurrent neural networks.
Experiments on a large real-world speed time series data set suggest that the proposed method is effective and outperforms the state-of-the-art in most settings.
arXiv Detail & Related papers (2021-03-19T12:15:37Z) - Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of
Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
arXiv Detail & Related papers (2021-02-15T00:57:28Z) - Deep Cellular Recurrent Network for Efficient Analysis of Time-Series
Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)