RTFN: Robust Temporal Feature Network
- URL: http://arxiv.org/abs/2008.07707v2
- Date: Tue, 29 Dec 2020 02:03:05 GMT
- Title: RTFN: Robust Temporal Feature Network
- Authors: Zhiwen Xiao, Xin Xu, Huanlai Xing and Juan Chen
- Abstract summary: We propose a novel robust temporal feature network (RTFN) that contains temporal feature networks and attentional LSTM networks.
The temporal feature networks extract basic features from the input data, while the attentional LSTM networks capture complicated shapelets and relationships to enrich them.
In experiments, we embed RTFN into a supervised structure as a feature extraction network and into unsupervised clustering as an encoder.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series analysis plays a vital role in various applications, for
instance, healthcare, weather prediction, and disaster forecasting. However,
obtaining sufficient shapelets with a feature network is still challenging. To this
end, we propose a novel robust temporal feature network (RTFN) that contains
temporal feature networks and attentional LSTM networks. The temporal feature
networks are built to extract basic features from input data while the
attentional LSTM networks are devised to capture complicated shapelets and
relationships to enrich features. In experiments, we embed RTFN into a supervised
structure as a feature extraction network and into unsupervised clustering as
an encoder, respectively. The results show that the RTFN-based supervised
structure wins on 40 out of 85 datasets and the RTFN-based unsupervised
clustering performs best on 4 out of 11 datasets in the UCR2018 archive.
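The two-stage design the abstract describes (basic feature extraction followed by attention-based enrichment) can be sketched in plain NumPy. This is only an illustrative stand-in, not the paper's architecture: a small filter bank stands in for the temporal feature networks, and a softmax pooling over time steps stands in for the attentional LSTM enrichment; all function names and kernel choices here are hypothetical.

```python
import numpy as np

def temporal_features(x, kernels):
    """Basic feature extraction: convolve the series with a bank of
    small filters (a stand-in for the temporal feature networks)."""
    feats = [np.convolve(x, k, mode="valid") for k in kernels]
    length = min(f.shape[0] for f in feats)
    return np.stack([f[:length] for f in feats])  # (n_kernels, length)

def attention_pool(feats):
    """Enrichment: weight each time step by a softmax of its activation
    (a crude stand-in for the attentional LSTM networks)."""
    scores = feats - feats.max(axis=1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return (weights * feats).sum(axis=1)  # one value per filter

# toy usage: embed a sine wave into a 2-dimensional feature vector
x = np.sin(np.linspace(0.0, 6.0, 64))
kernels = [np.array([1.0, -1.0]), np.ones(3) / 3.0]  # difference and moving-average filters
embedding = attention_pool(temporal_features(x, kernels))
```

In the supervised setting the resulting embedding would feed a classifier head; in the unsupervised setting it would serve as the encoder output for clustering.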
Related papers
- Graph Expansion in Pruned Recurrent Neural Network Layers Preserve Performance [7.142235510048155]
We prune recurrent networks such as RNNs and LSTMs, maintaining a large spectral gap of the underlying graphs.
We also study the time unfolded recurrent network graphs in terms of the properties of their bipartite layers.
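The summary above combines two ingredients: pruning a layer's weights and tracking a spectral gap of the underlying bipartite graph. A minimal sketch of measuring that gap under magnitude pruning is below; the paper's actual construction is more involved, and both helper names here are hypothetical.

```python
import numpy as np

def magnitude_prune(w, keep_frac):
    """Zero out the smallest-magnitude weights, keeping roughly
    keep_frac of the entries."""
    thresh = np.quantile(np.abs(w), 1.0 - keep_frac)
    return np.where(np.abs(w) >= thresh, w, 0.0)

def bipartite_spectral_gap(w):
    """Gap between the two largest singular values of the layer viewed
    as a weighted bipartite graph between input and output units."""
    s = np.linalg.svd(np.abs(w), compute_uv=False)
    return float(s[0] - s[1])

rng = np.random.default_rng(0)
w = rng.normal(size=(32, 32))
pruned = magnitude_prune(w, keep_frac=0.5)
gap_before = bipartite_spectral_gap(w)
gap_after = bipartite_spectral_gap(pruned)
```

The claim being tested in the paper is that pruning schemes preserving a large gap also preserve task performance; the sketch only shows how one would monitor the gap.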
arXiv Detail & Related papers (2024-03-17T06:08:08Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Properties and Potential Applications of Random Functional-Linked Types
of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structure.
This paper gives some insights into the properties of RFLNNs from the viewpoints of frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z) - Fast Temporal Wavelet Graph Neural Networks [7.477634824955323]
We propose Fast Temporal Wavelet Graph Neural Networks (FTWGNN) for learning tasks on timeseries data.
We employ Multiresolution Matrix Factorization (MMF) to factorize the highly dense graph structure and compute the corresponding sparse wavelet basis.
Experimental results on the real-world PEMS-BAY and METR-LA traffic datasets and the AJILE12 ECoG dataset show that FTWGNN is competitive with the state of the art.
arXiv Detail & Related papers (2023-02-17T01:21:45Z) - NAF: Neural Attenuation Fields for Sparse-View CBCT Reconstruction [79.13750275141139]
This paper proposes a novel and fast self-supervised solution for sparse-view CBCT reconstruction.
The desired attenuation coefficients are represented as a continuous function of 3D spatial coordinates, parameterized by a fully-connected deep neural network.
A learning-based encoder entailing hash coding is adopted to help the network capture high-frequency details.
arXiv Detail & Related papers (2022-09-29T04:06:00Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - Learning Frequency-aware Dynamic Network for Efficient Super-Resolution [56.98668484450857]
This paper explores a novel frequency-aware dynamic network for dividing the input into multiple parts according to its coefficients in the discrete cosine transform (DCT) domain.
In practice, the high-frequency parts are processed with expensive operations while the lower-frequency parts are assigned cheap operations to reduce the computation burden.
Experiments conducted on benchmark SISR models and datasets show that the frequency-aware dynamic network can be employed for various SISR neural architectures.
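The routing idea in this summary, splitting input by DCT coefficients and sending high-frequency content to the expensive branch, can be sketched as follows. This is an assumption-laden illustration, not the paper's network: the naive DCT, the `cutoff` value, and the energy-based routing rule are all choices made here for clarity.

```python
import numpy as np

def dct_ii(x):
    """Naive DCT-II via an explicit cosine basis (fine for small patches)."""
    n = x.shape[-1]
    k = np.arange(n)
    basis = np.cos(np.pi / n * k[:, None] * (k[None, :] + 0.5))
    return basis @ x

def route_patch(patch, cutoff=4):
    """Send patches whose DCT energy sits above `cutoff` to the
    'expensive' branch, the rest to the 'cheap' branch."""
    c = dct_ii(patch)
    hi = np.abs(c[cutoff:]).sum()
    lo = np.abs(c[:cutoff]).sum()
    return "expensive" if hi > lo else "cheap"

smooth = np.ones(16)                                  # flat patch: low frequency
edgy = np.where(np.arange(16) % 2 == 0, 1.0, -1.0)    # alternating: high frequency
```

A flat patch routes to the cheap branch and an alternating (high-frequency) patch to the expensive one, which is the behavior the summary describes.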
arXiv Detail & Related papers (2021-03-15T12:54:26Z) - RTFN: A Robust Temporal Feature Network for Time Series Classification [9.982074664830867]
Time series data usually contains local and global patterns.
Obtaining representations with a feature network is still challenging.
We propose a novel temporal feature network (TFN) and an LSTM-based attention network (LSTMaN).
arXiv Detail & Related papers (2020-11-24T01:24:04Z) - TSAM: Temporal Link Prediction in Directed Networks based on
Self-Attention Mechanism [2.5144068869465994]
We propose a deep learning model based on graph convolutional networks (GCN) and a self-attention mechanism, namely TSAM.
We run comparative experiments on four realistic networks to validate the effectiveness of TSAM.
arXiv Detail & Related papers (2020-08-23T11:56:40Z) - Tensor train decompositions on recurrent networks [60.334946204107446]
Matrix product state (MPS) tensor trains have more attractive features than MPOs, in terms of storage reduction and computing time at inference.
We show that MPS tensor trains should be at the forefront of LSTM network compression through a theoretical analysis and practical experiments on NLP tasks.
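The storage reduction this summary refers to comes from replacing a dense weight matrix with a chain of small cores. A minimal two-core MPS (tensor-train) factorization via a truncated SVD is sketched below; this is a generic illustration of the format, not the paper's method, and the function names are hypothetical.

```python
import numpy as np

def tt2_compress(w, m, n, rank):
    """Factor a dense (m[0]*m[1], n[0]*n[1]) weight into two tensor-train
    (MPS) cores via a truncated SVD of a reshaped weight tensor."""
    t = w.reshape(m[0], m[1], n[0], n[1]).transpose(0, 2, 1, 3)
    mat = t.reshape(m[0] * n[0], m[1] * n[1])
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    r = min(rank, s.size)
    g1 = (u[:, :r] * s[:r]).reshape(m[0], n[0], r)   # core 1: (m0, n0, r)
    g2 = vt[:r].reshape(r, m[1], n[1])               # core 2: (r, m1, n1)
    return g1, g2

def tt2_matvec(g1, g2, x, n):
    """Apply the compressed weight to x without rebuilding the dense matrix."""
    xm = x.reshape(n[0], n[1])
    y = np.einsum("ijr,rkl,jl->ik", g1, g2, xm)
    return y.reshape(-1)
```

Storage drops from `m[0]*m[1]*n[0]*n[1]` entries to `m[0]*n[0]*r + r*m[1]*n[1]`, which is the kind of saving the compression results above rely on; at full rank the factorization is exact.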
arXiv Detail & Related papers (2020-06-09T18:25:39Z) - Instance Explainable Temporal Network For Multivariate Timeseries [0.0]
We propose a novel network (IETNet) that identifies the important channels in the classification decision for each instance of inference.
IETNet is an end-to-end network that combines temporal feature extraction, variable selection, and joint variable interaction into a single learning framework.
arXiv Detail & Related papers (2020-05-26T20:55:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.