Feature Selection for Multivariate Time Series via Network Pruning
- URL: http://arxiv.org/abs/2102.06024v1
- Date: Thu, 11 Feb 2021 14:33:39 GMT
- Title: Feature Selection for Multivariate Time Series via Network Pruning
- Authors: Kang Gu, Soroush Vosoughi, Temiloluwa Prioleau
- Abstract summary: We propose a novel neural component, namely Neural Feature Selector (NFS), as an end-to-end solution for feature selection in MTS data.
NFS is based on a decomposed convolution design and includes two modules: first, each feature stream within the MTS is processed independently by a temporal CNN; then an aggregating CNN combines the processed streams.
We evaluated the proposed NFS model on four real-world MTS datasets and found that it achieves results comparable to state-of-the-art methods.
- Score: 2.2559617939136505
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent years, there has been an ever increasing amount of multivariate
time series (MTS) data in various domains, typically generated by a large
family of sensors such as wearable devices. This has led to the development of
novel learning methods on MTS data, with deep learning models dominating the
most recent advancements. Prior literature has primarily focused on designing
new network architectures for modeling temporal dependencies within MTS.
However, a less studied challenge is associated with high dimensionality of MTS
data. In this paper, we propose a novel neural component, namely Neural Feature
Selector (NFS), as an end-to-end solution for feature selection in MTS data.
Specifically, NFS is based on decomposed convolution design and includes two
modules: first, each feature stream within the MTS is processed independently by
a temporal CNN; then an aggregating CNN combines the processed streams to
produce input for other downstream networks. We evaluated the proposed NFS
model on four real-world MTS datasets and found that it achieves results
comparable to state-of-the-art methods while providing the benefit of feature
selection. Our paper also highlights the robustness and effectiveness of
feature selection with NFS compared to using recent autoencoder-based methods.
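The two-module decomposed convolution described in the abstract can be sketched numerically. The following is a minimal, hypothetical illustration (not the authors' implementation): each feature stream gets its own temporal convolution, and a pointwise mixing matrix plays the role of the aggregating CNN, whose per-feature weights could later be pruned for selection.

```python
import numpy as np

def temporal_conv(stream, kernel):
    # Valid 1-D convolution of a single feature stream with its own kernel.
    T, k = len(stream), len(kernel)
    return np.array([np.dot(stream[t:t + k], kernel) for t in range(T - k + 1)])

def nfs_forward(x, temporal_kernels, agg_weights):
    """x: (T, F) multivariate series; temporal_kernels: one kernel per feature;
    agg_weights: (F, C) mixing matrix standing in for the aggregating CNN."""
    # Module 1: process each feature stream independently (depthwise temporal conv).
    streams = np.stack([temporal_conv(x[:, f], temporal_kernels[f])
                        for f in range(x.shape[1])], axis=1)   # (T', F)
    # Module 2: aggregate the processed streams with a pointwise (1x1) mixing;
    # the column norms of agg_weights would score each feature's importance.
    return streams @ agg_weights                                # (T', C)

rng = np.random.default_rng(0)
x = rng.normal(size=(20, 4))                  # 20 time steps, 4 features
kernels = [rng.normal(size=3) for _ in range(4)]
w = rng.normal(size=(4, 8))
out = nfs_forward(x, kernels, w)
print(out.shape)                              # (18, 8)
```

With kernel length 3, the valid convolution shortens the series from 20 to 18 steps, and the aggregating step maps the 4 feature streams to 8 output channels for downstream networks.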
Related papers
- Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
In neuromorphic computing, spiking neural networks (SNNs) perform inference tasks, offering significant efficiency gains for workloads involving sequential data.
Recent advances in hardware and software have demonstrated that embedding a few bits of payload in each spike exchanged between the spiking neurons can further enhance inference accuracy.
This paper investigates a wireless neuromorphic split computing architecture employing multi-level SNNs.
arXiv Detail & Related papers (2024-11-07T14:08:35Z)
- Unveiling the Power of Sparse Neural Networks for Feature Selection [60.50319755984697]
Sparse Neural Networks (SNNs) have emerged as powerful tools for efficient feature selection.
We show that feature selection with SNNs trained with dynamic sparse training (DST) algorithms can achieve, on average, more than 50% memory and 55% FLOPs reduction.
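The idea behind sparse-network feature selection can be illustrated with a toy example (a hedged sketch, not the paper's DST algorithm): after sparsifying the input layer, features whose weights survive carry the signal, so the L1 norm of each feature's remaining weights serves as a selection score.

```python
import numpy as np

# Hand-picked sparse first-layer weights: 4 input features, 3 hidden units.
W = np.array([[0.9, 0.0, 0.0],     # feature 0: strong connection survives
              [0.0, 0.0, 0.0],     # feature 1: fully pruned away
              [0.1, 0.7, 0.0],     # feature 2: moderate connections
              [0.0, 0.0, 0.05]])   # feature 3: nearly pruned

importance = np.abs(W).sum(axis=1)           # per-feature L1 score
selected = np.argsort(importance)[::-1][:2]  # keep the top-2 features
print(selected.tolist())                     # [0, 2]
```

Features 0 and 2 (scores 0.9 and 0.8) are kept, while the pruned features 1 and 3 are discarded; in a real sparse network the zeros would come from training rather than being set by hand.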
arXiv Detail & Related papers (2024-08-08T16:48:33Z)
- A hybrid IndRNNLSTM approach for real-time anomaly detection in software-defined networks [0.0]
Anomaly detection in SDN using data flow prediction is a difficult task.
IndRNNLSTM algorithm, in combination with Embedded, was able to achieve MAE=1.22 and RMSE=9.92 on NSL-KDD data.
arXiv Detail & Related papers (2024-02-02T20:41:55Z)
- Few-shot Learning using Data Augmentation and Time-Frequency Transformation for Time Series Classification [6.830148185797109]
We propose a novel few-shot learning framework through data augmentation.
We also develop a sequence-spectrogram neural network (SSNN).
Our methodology demonstrates its applicability to addressing few-shot problems in time series classification.
arXiv Detail & Related papers (2023-11-06T15:32:50Z) - MTS2Graph: Interpretable Multivariate Time Series Classification with
Temporal Evolving Graphs [1.1756822700775666]
We introduce a new framework for interpreting time series data by extracting and clustering the input representative patterns.
We run experiments on eight datasets of the UCR/UEA archive, along with HAR and PAM datasets.
arXiv Detail & Related papers (2023-06-06T16:24:27Z) - Disentangling Structured Components: Towards Adaptive, Interpretable and
Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework that seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z) - Online Evolutionary Neural Architecture Search for Multivariate
Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z) - Contextual HyperNetworks for Novel Feature Adaptation [43.49619456740745]
Contextual HyperNetwork (CHN) generates parameters for extending the base model to a new feature.
At prediction time, the CHN requires only a single forward pass through a neural network, yielding a significant speed-up.
We show that this system obtains improved few-shot learning performance for novel features over existing imputation and meta-learning baselines.
arXiv Detail & Related papers (2021-04-12T23:19:49Z) - FMA-ETA: Estimating Travel Time Entirely Based on FFN With Attention [88.33372574562824]
We propose a novel framework for ETA based on a feed-forward network (FFN): FFN with Multi-factor self-Attention (FMA-ETA).
The novel Multi-factor self-attention mechanism is proposed to deal with different category features and aggregate the information purposefully.
Experiments show FMA-ETA is competitive with state-of-the-art methods in terms of the prediction accuracy with significantly better inference speed.
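The attention-based aggregation that FMA-ETA builds on can be sketched with plain scaled dot-product self-attention. This is a generic illustration only; the paper's multi-factor variant is not reproduced here.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over the rows of X (tokens x dim).
    A generic sketch; queries, keys and values all equal X for simplicity."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                   # pairwise similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return weights @ X                              # attention-weighted mixture

X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
out = self_attention(X)
print(out.shape)   # (3, 2)
```

Each output row is a convex combination of the input rows, weighted by similarity; an FFN-only model with such a mechanism avoids recurrent computation, which is where the inference-speed advantage comes from.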
arXiv Detail & Related papers (2020-06-07T08:10:47Z) - Instance Explainable Temporal Network For Multivariate Timeseries [0.0]
We propose a novel network (IETNet) that identifies the important channels in the classification decision for each instance of inference.
IETNet is an end-to-end network that combines temporal feature extraction, variable selection, and joint variable interaction into a single learning framework.
arXiv Detail & Related papers (2020-05-26T20:55:24Z) - Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data.
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.