TVNet: A Novel Time Series Analysis Method Based on Dynamic Convolution and 3D-Variation
- URL: http://arxiv.org/abs/2503.07674v1
- Date: Mon, 10 Mar 2025 03:30:55 GMT
- Title: TVNet: A Novel Time Series Analysis Method Based on Dynamic Convolution and 3D-Variation
- Authors: Chenghan Li, Mingchen Li, Ruisheng Diao
- Abstract summary: We introduce a novel time series reshaping technique that considers the inter-patch, intra-patch, and cross-variable dimensions. We propose TVNet, a dynamic convolutional network that leverages a 3D perspective to perform time series analysis. TVNet retains the computational efficiency of CNNs and achieves state-of-the-art results in five key time series analysis tasks.
- Score: 7.332652485849632
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the recent development and advancement of Transformer and MLP architectures, significant strides have been made in time series analysis. In contrast, the performance of Convolutional Neural Networks (CNNs) in time series analysis has fallen short of expectations, diminishing their potential for future applications. Our research aims to enhance the representational capacity of CNNs in time series analysis through novel perspectives and design innovations. Specifically, we introduce a novel time series reshaping technique that considers the inter-patch, intra-patch, and cross-variable dimensions. Building on this, we propose TVNet, a dynamic convolutional network that leverages a 3D perspective to perform time series analysis. TVNet retains the computational efficiency of CNNs and achieves state-of-the-art results on five key time series analysis tasks, offering a superior balance of efficiency and performance over state-of-the-art Transformer-based and MLP-based models. Additionally, our findings suggest that TVNet exhibits enhanced transferability and robustness. It therefore provides a new perspective for applying CNNs to advanced time series analysis tasks.
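As a rough illustration of the reshaping idea, here is a minimal PyTorch sketch (our own approximation, not the authors' code): a multivariate series is cut into patches so that the cross-variable, inter-patch, and intra-patch dimensions become three axes of a single tensor, which a 3D convolution can then mix jointly. The patch length and channel counts are illustrative assumptions.

```python
import torch
import torch.nn as nn

def reshape_to_3d(x: torch.Tensor, patch_len: int) -> torch.Tensor:
    """[B, T, C] -> [B, 1, C, T // patch_len, patch_len]."""
    b, t, c = x.shape
    assert t % patch_len == 0, "pad the series so patch_len divides T"
    x = x.reshape(b, t // patch_len, patch_len, c)  # split time into patches
    x = x.permute(0, 3, 1, 2)                       # [B, C, n_patches, patch_len]
    return x.unsqueeze(1)                           # add a conv-channel axis

x = torch.randn(8, 96, 7)                # batch of 8, 96 steps, 7 variables
vol = reshape_to_3d(x, patch_len=16)     # [8, 1, 7, 6, 16]

# One 3D convolution now sees cross-variable, inter-patch, and intra-patch
# structure at once. TVNet's dynamic convolution would additionally generate
# kernel weights from the input, which this fixed layer does not attempt.
conv = nn.Conv3d(1, 32, kernel_size=3, padding=1)
out = conv(vol)                          # [8, 32, 7, 6, 16]
```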
Related papers
- Exploring Neural Network Pruning with Screening Methods [3.443622476405787]
Modern deep learning models have tens of millions of parameters, which makes inference resource-intensive. This paper proposes and evaluates a network pruning framework that eliminates non-essential parameters. The proposed framework produces competitive lean networks compared to the original networks.
arXiv Detail & Related papers (2025-02-11T02:31:04Z) - VidFormer: A novel end-to-end framework fused by 3DCNN and Transformer for Video-based Remote Physiological Measurement [9.605944796068046]
We introduce VidFormer, a novel framework that integrates 3D convolutional neural networks (3DCNN) and Transformer models for remote physiological measurement (rPPG) tasks.
Our evaluation on five publicly available datasets demonstrates that VidFormer outperforms current state-of-the-art (SOTA) methods.
arXiv Detail & Related papers (2025-01-03T08:18:08Z) - Task-Oriented Real-time Visual Inference for IoVT Systems: A Co-design Framework of Neural Networks and Edge Deployment [61.20689382879937]
Task-oriented edge computing addresses resource constraints by shifting data analysis to the edge.
Existing methods struggle to balance high model performance with low resource consumption.
We propose a novel co-design framework to optimize neural network architecture.
arXiv Detail & Related papers (2024-10-29T19:02:54Z) - TCCT-Net: Two-Stream Network Architecture for Fast and Efficient Engagement Estimation via Behavioral Feature Signals [58.865901821451295]
We present a novel two-stream feature fusion "Tensor-Convolution and Convolution-Transformer Network" (TCCT-Net) architecture.
To better learn meaningful patterns in the temporal-spatial domain, we design a "CT" stream that integrates a hybrid convolutional-Transformer architecture.
In parallel, to efficiently extract rich patterns from the temporal-frequency domain, we introduce a "TC" stream that uses the Continuous Wavelet Transform (CWT) to represent the signal as a 2D tensor, as sketched below.
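For concreteness, here is a minimal sketch of the scalogram construction such a "TC" stream could start from, using PyWavelets; the sampling rate, wavelet ("morl"), and scale range are illustrative assumptions, not the TCCT-Net settings.

```python
import numpy as np
import pywt

fs = 30.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)                # 10 s of signal, 300 samples
signal = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.random.randn(t.size)

# Continuous Wavelet Transform: one row per scale, one column per time step,
# turning the 1D behavioral signal into a 2D (scale x time) tensor.
scales = np.arange(1, 65)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)
scalogram = np.abs(coeffs)
print(scalogram.shape)                      # (64, 300) -> input for a 2D CNN
```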
arXiv Detail & Related papers (2024-04-15T06:01:48Z) - Incorporating Taylor Series and Recursive Structure in Neural Networks for Time Series Prediction [0.29008108937701327]
Time series analysis is relevant in various disciplines such as physics, biology, chemistry, and finance.
We present a novel neural network architecture that integrates elements of ResNet structures while introducing a Taylor-series-based framework.
arXiv Detail & Related papers (2024-02-09T14:34:28Z) - WinNet: Make Only One Convolutional Layer Effective for Time Series Forecasting [11.232780368635416]
We present WinNet, a highly accurate CNN-based model with a simple structure and only one convolutional layer.
Results demonstrate that WinNet achieves SOTA performance with lower complexity than other CNN-based methods.
arXiv Detail & Related papers (2023-11-01T01:23:59Z) - A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection [98.41798478488101]
Time series analytics is crucial to unlocking the wealth of information implicit in available data.
Recent advancements in graph neural networks (GNNs) have led to a surge in GNN-based approaches for time series analysis.
This survey brings together a vast array of knowledge on GNN-based time series research, highlighting foundations, practical applications, and opportunities of graph neural networks for time series analysis.
arXiv Detail & Related papers (2023-07-07T08:05:03Z) - Efficient Online Processing with Deep Neural Networks [1.90365714903665]
This dissertation is dedicated to neural network efficiency. Specifically, a core contribution addresses efficiency during online inference.
These advances are attained through a bottom-up computational reorganization and judicious architectural modifications.
arXiv Detail & Related papers (2023-06-23T12:29:44Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Disentangling Structured Components: Towards Adaptive, Interpretable and Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of multivariate time series (MTS), which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z) - Convolutional Neural Network Dynamics: A Graph Perspective [39.81881710355496]
We take a graph perspective and investigate the relationship between the graph structure of NNs and their performance.
For the dynamic graph representation of NNs, we explore structural representations for fully-connected and convolutional layers.
Our analysis shows that a simple summary of graph statistics can be used to accurately predict the performance of NNs.
arXiv Detail & Related papers (2021-11-09T20:38:48Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations (a minimal cell sketch follows the citation below).
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
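As a concrete reference for the construction above, here is a minimal sketch of one liquid time-constant cell in a fused-Euler form; the sigmoid gate and layer sizes are illustrative choices, not the paper's exact parameterization.

```python
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    """One liquid time-constant unit: the effective time constant of each
    hidden state depends on a learned, input-conditioned gate f(u, x)."""
    def __init__(self, in_dim: int, hid_dim: int, dt: float = 0.1):
        super().__init__()
        self.gate = nn.Linear(in_dim + hid_dim, hid_dim)   # f(u, x)
        self.A = nn.Parameter(torch.zeros(hid_dim))        # target bias
        self.log_tau = nn.Parameter(torch.zeros(hid_dim))  # base time constant
        self.dt = dt

    def forward(self, u: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        f = torch.sigmoid(self.gate(torch.cat([u, x], dim=-1)))
        tau = torch.exp(self.log_tau)
        # Fused (semi-implicit) Euler step of
        #   dx/dt = -(1/tau + f) * x + f * A,
        # so the input modulates each unit's effective time constant and the
        # state stays bounded.
        return (x + self.dt * f * self.A) / (1 + self.dt * (1 / tau + f))

cell = LTCCell(in_dim=3, hid_dim=16)
x = torch.zeros(8, 16)                     # batch of 8 hidden states
for u in torch.randn(20, 8, 3):            # 20 time steps of input
    x = cell(u, x)
```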