Time-Parameterized Convolutional Neural Networks for Irregularly Sampled Time Series
- URL: http://arxiv.org/abs/2308.03210v2
- Date: Wed, 9 Aug 2023 06:39:29 GMT
- Title: Time-Parameterized Convolutional Neural Networks for Irregularly Sampled Time Series
- Authors: Chrysoula Kosma, Giannis Nikolentzos, Michalis Vazirgiannis
- Abstract summary: Irregularly sampled time series are ubiquitous in several application domains, leading to sparse, not fully observed and non-aligned observations.
Standard sequential architectures, such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), assume regular spacing between observation times, posing significant challenges to irregular time series modeling.
We parameterize convolutional layers by employing time-explicitly initialized kernels.
- Score: 26.77596449192451
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Irregularly sampled multivariate time series are ubiquitous in several
application domains, leading to sparse, not fully-observed and non-aligned
observations across different variables. Standard sequential neural network
architectures, such as recurrent neural networks (RNNs) and convolutional
neural networks (CNNs), consider regular spacing between observation times,
posing significant challenges to irregular time series modeling. While most of
the proposed architectures incorporate RNN variants to handle irregular time
intervals, convolutional neural networks have not been adequately studied in
the irregular sampling setting. In this paper, we parameterize convolutional
layers by employing time-explicitly initialized kernels. Such general functions
of time enhance the learning process of continuous-time hidden dynamics and can
be efficiently incorporated into convolutional kernel weights. We thus
propose the time-parameterized convolutional neural network (TPCNN), which
shares similar properties with vanilla convolutions but is carefully designed
for irregularly sampled time series. We evaluate TPCNN on both interpolation
and classification tasks involving real-world irregularly sampled multivariate
time series datasets. Our experimental results indicate the competitive
performance of the proposed TPCNN model, which is also significantly more
efficient than other state-of-the-art methods. At the same time, the proposed
architecture allows for interpretability of the input series by leveraging a
combination of learnable time functions that improve network performance on
subsequent tasks and pave the way for a first application of convolutions in this
field.
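To make the kernel parameterization concrete, below is a minimal sketch of the idea, not the authors' TPCNN implementation: kernel weights are produced by an explicit function of time, evaluated at the irregular time offsets inside each window. The damped-sinusoid kernel form and all parameter names are illustrative assumptions.

```python
import numpy as np

def time_parameterized_kernel(dt, a, b, c):
    """Kernel weights as an explicit function of time offsets.
    The damped sinusoid is a stand-in for the learnable time
    functions the abstract alludes to; a, b, c would be learned."""
    return a * np.exp(-b * dt) * np.sin(c * dt)

def tp_conv1d(x, t, window, params):
    """Convolve an irregularly sampled series by re-evaluating the
    kernel at the actual time offsets inside each window, instead of
    assuming the regular offsets 0, 1, 2, ... of a vanilla convolution."""
    out = []
    for i in range(len(x) - window + 1):
        dt = t[i:i + window] - t[i]   # irregular offsets
        w = time_parameterized_kernel(dt, *params)
        out.append(np.dot(w, x[i:i + window]))
    return np.array(out)

# toy irregularly sampled series
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10, 50))
x = np.sin(t) + 0.1 * rng.normal(size=50)
y = tp_conv1d(x, t, window=5, params=(1.0, 0.3, 2.0))
print(y.shape)  # (46,)
```

In the paper, several such time functions are learned and combined; a single fixed form stands in for them here.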
Related papers
- A Distance Correlation-Based Approach to Characterize the Effectiveness of Recurrent Neural Networks for Time Series Forecasting [1.9950682531209158]
We provide an approach to link time series characteristics with RNN components via the versatile metric of distance correlation.
We empirically show that the RNN activation layers learn the lag structures of time series well.
We also show that the activation layers cannot adequately model moving average and heteroskedastic time series processes.
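For reference, the metric itself is easy to reproduce; below is a minimal empirical distance correlation (Székely et al., 2007) in NumPy. Pairing a series with a lagged copy of itself, as a stand-in for an RNN activation that has learned the lag structure, is purely illustrative.

```python
import numpy as np

def distance_correlation(x, y):
    """Empirical distance correlation between two 1-D samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])   # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    # double-center each distance matrix
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()                # squared distance covariance
    return np.sqrt(dcov2 / np.sqrt((A * A).mean() * (B * B).mean()))

series = np.sin(np.linspace(0, 20, 200))
lagged = np.roll(series, 3)               # stand-in for a lag-aware activation
print(distance_correlation(series, lagged))  # high: strong dependence
```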
arXiv Detail & Related papers (2023-07-28T22:32:08Z)
- Time Regularization in Optimal Time Variable Learning [0.4490343701046724]
Recently, optimal time variable learning in deep neural networks (DNNs) was introduced in arXiv:2204.08528.
We extend the concept by introducing a regularization term that directly relates to the time horizon in discrete dynamical systems.
We propose an adaptive pruning approach for Residual Neural Networks (ResNets).
Results are illustrated by applying the proposed concepts to classification tasks on the well known MNIST and Fashion MNIST data sets.
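A hypothetical reading of the setup, for orientation only (the precise regularizer is defined in arXiv:2204.08528 and its extension): treat each residual block as an Euler step with a learnable step size, penalize the total time horizon, and prune blocks whose step size shrinks toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)
taus = np.array([0.9, 0.02, 0.7, 0.01, 0.5])        # learned per-block step sizes
Ws = [rng.normal(size=(4, 4)) * 0.1 for _ in taus]  # toy block weights

def resnet_forward(x):
    # ResNet as a discrete dynamical system: x <- x + tau_i * f_i(x)
    for tau, W in zip(taus, Ws):
        x = x + tau * np.tanh(W @ x)
    return x

lam = 0.1
time_reg = lam * np.abs(taus).sum()   # assumed L1-style penalty on the time horizon
keep = taus > 0.05                    # adaptive pruning: drop near-zero steps
print(resnet_forward(np.ones(4)), time_reg, keep)
```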
arXiv Detail & Related papers (2023-06-28T11:27:48Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous-time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Task-Synchronized Recurrent Neural Networks [0.0]
Traditional approaches to irregularly sampled data with Recurrent Neural Networks (RNNs) involve ignoring the irregular spacing, feeding the time differences as additional inputs, or resampling the data.
We propose an elegant, straightforward alternative in which the RNN is in effect resampled in time to match the timing of the data or the task at hand.
We confirm empirically that our models can effectively compensate for the time-non-uniformity of the data and demonstrate that they compare favorably to data resampling, classical RNN methods, and alternative RNN models.
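A hedged sketch of the general idea, not the paper's exact cell: advance a continuous-time hidden state by an Euler step whose size is the observed time gap, so the recurrence is synchronized with the data's own timestamps.

```python
import numpy as np

def task_sync_step(h, x, dt, Wh, Wx, tau=1.0):
    """One simplified time-resampled RNN step: the state update is
    scaled by the actual gap dt instead of a fixed unit step."""
    dh = (-h + np.tanh(Wh @ h + Wx @ x)) / tau
    return h + dt * dh

rng = np.random.default_rng(0)
H, D = 8, 3
Wh, Wx = rng.normal(size=(H, H)) * 0.1, rng.normal(size=(H, D)) * 0.1
h, prev = np.zeros(H), 0.0
times = np.cumsum(rng.uniform(0.1, 2.0, 20))  # irregular timestamps
for t, x in zip(times, rng.normal(size=(20, D))):
    h = task_sync_step(h, x, t - prev, Wh, Wx)
    prev = t
print(h)
```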
arXiv Detail & Related papers (2022-04-11T15:27:40Z)
- Space-Time Graph Neural Networks [104.55175325870195]
We introduce the space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z)
- Oscillatory Fourier Neural Network: A Compact and Efficient Architecture for Sequential Processing [16.69710555668727]
We propose a novel neuron model with a cosine activation and a time-varying component for sequential processing.
The proposed neuron provides an efficient building block for projecting sequential inputs into the spectral domain.
Applying the proposed model to sentiment analysis on the IMDB dataset reaches 89.4% test accuracy within 5 epochs.
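A rough sketch of such a neuron; the exact parameterization here is an assumption, not the paper's model: a cosine activation whose phase advances with time, projecting a sequence onto a learnable oscillation.

```python
import numpy as np

def oscillatory_neuron(x_seq, w, omega, phi):
    """Cosine activation with a time-varying phase term: step t
    contributes cos(w . x_t + omega * t + phi), i.e., a projection
    of the input sequence into the spectral domain."""
    return np.array([np.cos(w @ x + omega * t + phi)
                     for t, x in enumerate(x_seq)])

rng = np.random.default_rng(1)
x_seq = rng.normal(size=(16, 4))                     # toy sequence
out = oscillatory_neuron(x_seq, rng.normal(size=4), omega=0.5, phi=0.0)
print(out.shape)  # (16,)
```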
arXiv Detail & Related papers (2021-09-14T19:08:07Z)
- CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep Representation Learning from Sporadic Temporal Data [1.8352113484137622]
In this paper, a novel deep learning-based model is developed for modeling multiple temporal features in sporadic data.
The proposed model, called CARRNN, uses a generalized discrete-time autoregressive model that is trainable end-to-end using neural networks modulated by time lags.
It is applied to multivariate time-series regression tasks using data provided for Alzheimer's disease progression modeling and intensive care unit (ICU) mortality rate prediction.
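A hypothetical sketch in the spirit of CARRNN, not its exact parameterization: an autoregressive transition whose weights are modulated linearly by the time lag between sporadic observations.

```python
import numpy as np

def lag_modulated_step(h, x, dt, W, V, U):
    # transition matrix adjusted linearly in the observed time lag dt
    return np.tanh((W + dt * V) @ h + U @ x)

rng = np.random.default_rng(3)
H, D = 6, 4
W = rng.normal(size=(H, H)) * 0.1
V = rng.normal(size=(H, H)) * 0.01   # lag-modulation term
U = rng.normal(size=(H, D)) * 0.1
h = np.zeros(H)
for dt, x in zip(rng.uniform(0.1, 3.0, 10), rng.normal(size=(10, D))):
    h = lag_modulated_step(h, x, dt, W, V, U)
print(h)
```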
arXiv Detail & Related papers (2021-04-08T12:43:44Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN) based computational model for precise sound localization in noisy real-world environments.
We implement this algorithm in a real-time robotic system with a microphone array.
The experimental results show a mean azimuth error of 13 degrees, surpassing the accuracy of other biologically plausible neuromorphic approaches to sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
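One fused semi-implicit update of a liquid time-constant cell, sketched after Hasani et al. with assumed simplifications: a gating nonlinearity modulates both the effective time constant and the pull toward an equilibrium vector, which is what keeps the state bounded.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, I, dt, tau, A, W, b):
    """x' = -(1/tau + f) * x + f * A, integrated with a fused
    semi-implicit step; f gates the time constant and the drive."""
    f = sigmoid(W @ np.concatenate([x, I]) + b)
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

rng = np.random.default_rng(4)
H, D = 6, 2
W, b = rng.normal(size=(H, H + D)) * 0.5, np.zeros(H)
x, A = np.zeros(H), rng.normal(size=H)
for _ in range(20):
    x = ltc_step(x, rng.normal(size=D), dt=0.1, tau=1.0, A=A, W=W, b=b)
print(x)  # stays bounded
```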
arXiv Detail & Related papers (2020-06-08T09:53:35Z)