Wavelet Probabilistic Recurrent Convolutional Network for Multivariate Time Series Classification
- URL: http://arxiv.org/abs/2505.17307v1
- Date: Thu, 22 May 2025 21:57:50 GMT
- Title: Wavelet Probabilistic Recurrent Convolutional Network for Multivariate Time Series Classification
- Authors: Pu Yang, J. A. Barria
- Abstract summary: We introduce a versatile wavelet probabilistic module designed to extract and analyse the probabilistic features. The module can seamlessly integrate with a variety of neural network architectures. The WPRCN is evaluated on 30 diverse MTS datasets and outperforms all the benchmark algorithms on average accuracy and rank.
- Score: 0.36832029288386137
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a Wavelet Probabilistic Recurrent Convolutional Network (WPRCN) for Multivariate Time Series Classification (MTSC), especially effective in handling non-stationary environments, data scarcity and noise perturbations. We introduce a versatile wavelet probabilistic module designed to extract and analyse probabilistic features, which can seamlessly integrate with a variety of neural network architectures. This probabilistic module comprises an Adaptive Wavelet Probabilistic Feature Generator (AWPG) and a Channel Attention-based Probabilistic Temporal Convolutional Network (APTCN). Such a formulation extends the application of wavelet probabilistic neural networks to deep neural networks for MTSC. The AWPG constructs an ensemble probabilistic model addressing different degrees of data scarcity and non-stationarity; it adaptively selects the optimal models and generates probabilistic features for the APTCN. The APTCN analyses the correlations among these features and forms a comprehensive feature space with existing MTSC models for classification. Here, we instantiate the proposed module to work in parallel with a Long Short-Term Memory (LSTM) network and a Causal Fully Convolutional Network (C-FCN), demonstrating its broad applicability in time series analysis. The WPRCN is evaluated on 30 diverse MTS datasets and outperforms all the benchmark algorithms on average accuracy and rank, exhibiting pronounced strength in handling scarce data and physiological data subject to perturbations and non-stationarities.
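The abstract describes a probabilistic branch (AWPG feeding the APTCN) running in parallel with an LSTM and a C-FCN, with the resulting feature spaces combined for classification. The sketch below is a minimal, hypothetical PyTorch rendering of that parallel composition only; the AWPG stand-in, layer sizes, squeeze-and-excitation attention and pooling choices are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class APTCN(nn.Module):
    """Channel-attention causal temporal convolution over probabilistic features."""

    def __init__(self, in_ch: int, hidden: int = 64):
        super().__init__()
        # Causal 1-D convolution: pad on the left so no output sees future steps.
        self.pad = nn.ConstantPad1d((2, 0), 0.0)
        self.conv = nn.Conv1d(in_ch, hidden, kernel_size=3)
        # Squeeze-and-excitation style channel attention (an assumption,
        # not necessarily the paper's mechanism).
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(hidden, hidden), nn.Sigmoid(),
        )

    def forward(self, p):                      # p: (batch, channels, time)
        h = torch.relu(self.conv(self.pad(p)))
        w = self.attn(h).unsqueeze(-1)         # per-channel attention weights
        return (h * w).mean(dim=-1)            # pooled probabilistic features


class WPRCNSketch(nn.Module):
    """Probabilistic branch (AWPG -> APTCN) in parallel with LSTM and C-FCN branches."""

    def __init__(self, in_ch: int, n_classes: int, hidden: int = 64):
        super().__init__()
        # AWPG stand-in: any map from the raw series to wavelet probabilistic
        # features; the real AWPG builds an ensemble density model instead.
        self.awpg = nn.Conv1d(in_ch, in_ch, kernel_size=5, padding=2)
        self.aptcn = APTCN(in_ch, hidden)
        self.lstm = nn.LSTM(in_ch, hidden, batch_first=True)
        # Causal fully convolutional branch on the raw series.
        self.cfcn = nn.Sequential(
            nn.ConstantPad1d((2, 0), 0.0), nn.Conv1d(in_ch, hidden, 3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.head = nn.Linear(3 * hidden, n_classes)

    def forward(self, x):                      # x: (batch, channels, time)
        prob = self.aptcn(self.awpg(x))        # probabilistic feature branch
        _, (h_n, _) = self.lstm(x.transpose(1, 2))
        rec = h_n[-1]                          # last LSTM hidden state
        conv = self.cfcn(x)
        fused = torch.cat([prob, rec, conv], dim=1)  # comprehensive feature space
        return self.head(fused)


# Example: logits = WPRCNSketch(in_ch=6, n_classes=4)(torch.randn(8, 6, 128))
```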
Related papers
- Anomaly Detection for Non-stationary Time Series using Recurrent Wavelet Probabilistic Neural Network [0.36832029288386137]
An Unsupervised Recurrent Wavelet Probabilistic Neural Network (RWPNN) is proposed. It aims at detecting anomalies in non-stationary environments by modelling the temporal features using a non-parametric density estimation network.
arXiv Detail & Related papers (2025-05-16T14:43:00Z) - Bayesian Reasoning Enabled by Spin-Orbit Torque Magnetic Tunnel Junctions [7.081096702778852]
We present proof-of-concept experiments demonstrating the use of spin-orbit torque magnetic tunnel junctions (SOT-MTJs) in Bayesian network reasoning. The parameters of the network can also approach the optimum through a simple point-by-point training algorithm. We developed a simple medical diagnostic system using the SOT-MTJ as a random number generator and sampler.
arXiv Detail & Related papers (2025-04-11T05:02:27Z) - TS-LIF: A Temporal Segment Spiking Neuron Network for Time Series Forecasting [27.91825785119938]
Spiking Neural Networks (SNNs) offer a promising, biologically inspired approach to time series forecasting. We introduce the Temporal Segment Leaky Integrate-and-Fire (TS-LIF) model, featuring a dual-compartment architecture. Experimental results show that TS-LIF outperforms traditional SNNs in time series forecasting.
arXiv Detail & Related papers (2025-03-07T03:06:21Z) - Compact Bayesian Neural Networks via pruned MCMC sampling [0.16777183511743468]
Bayesian Neural Networks (BNNs) offer robust uncertainty quantification in model predictions, but training them presents a significant computational challenge. In this study, we address some of the challenges by leveraging MCMC sampling with network pruning to obtain compact probabilistic models. We ensure that the compact BNN retains its ability to estimate uncertainty via the posterior distribution, while preserving training and generalisation accuracy through post-pruning resampling.
arXiv Detail & Related papers (2025-01-12T22:48:04Z) - Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
Neuromorphic computing uses spiking neural networks (SNNs) to perform inference tasks. Embedding a small payload within each spike exchanged between spiking neurons can enhance inference accuracy without increasing energy consumption. Split computing - where an SNN is partitioned across two devices - is a promising solution. This paper presents the first comprehensive study of a neuromorphic wireless split computing architecture that employs multi-level SNNs.
arXiv Detail & Related papers (2024-11-07T14:08:35Z) - MCDFN: Supply Chain Demand Forecasting via an Explainable Multi-Channel Data Fusion Network Model [0.0]
We introduce the Multi-Channel Data Fusion Network (MCDFN), a hybrid architecture that integrates CNN, Long Short-Term Memory networks (LSTM), and Gated Recurrent Units (GRU). Our comparative benchmarking demonstrates that MCDFN outperforms seven other deep-learning models. This research advances demand forecasting methodologies and offers practical guidelines for integrating MCDFN into supply chain systems.
arXiv Detail & Related papers (2024-05-24T14:30:00Z) - Disentangling Structured Components: Towards Adaptive, Interpretable and Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z) - An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z) - Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over directed acyclic graphs (DAGs) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), has been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z) - Rate Distortion Characteristic Modeling for Neural Image Compression [59.25700168404325]
End-to-end optimization capability offers neural image compression (NIC) superior lossy compression performance.
However, distinct models must be trained to reach different points in the rate-distortion (R-D) space.
We formulate the essential mathematical functions that describe the R-D behavior of NIC using deep networks and statistical modeling.
arXiv Detail & Related papers (2021-06-24T12:23:05Z) - Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN)-based computational model for precise sound localization in noisy real-world environments.
We implement this algorithm in a real-time robotic system with a microphone array.
The experimental results show a mean azimuth error of 13 degrees, surpassing the accuracy of other biologically plausible neuromorphic approaches to sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z) - Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.