Time Regularization in Optimal Time Variable Learning
- URL: http://arxiv.org/abs/2306.16111v2
- Date: Wed, 6 Dec 2023 10:52:15 GMT
- Title: Time Regularization in Optimal Time Variable Learning
- Authors: Evelyn Herberg and Roland Herzog and Frederik Köhne
- Abstract summary: Recently, optimal time variable learning in deep neural networks (DNNs) was introduced in arXiv:2204.08528.
We extend the concept by introducing a regularization term that directly relates to the time horizon in discrete dynamical systems.
We propose an adaptive pruning approach for Residual Neural Networks (ResNets).
Results are illustrated by applying the proposed concepts to classification tasks on the well-known MNIST and Fashion-MNIST data sets.
- Score: 0.4490343701046724
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Recently, optimal time variable learning in deep neural networks (DNNs) was
introduced in arXiv:2204.08528. In this manuscript we extend the concept by
introducing a regularization term that directly relates to the time horizon in
discrete dynamical systems. Furthermore, we propose an adaptive pruning
approach for Residual Neural Networks (ResNets), which reduces network
complexity without compromising expressiveness, while simultaneously decreasing
training time. The results are illustrated by applying the proposed concepts to
classification tasks on the well-known MNIST and Fashion-MNIST data sets. Our
PyTorch code is available on
https://github.com/frederikkoehne/time_variable_learning.
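To make the concept concrete, here is a minimal PyTorch sketch of our reading of the abstract: each residual block takes a forward-Euler step x_{k+1} = x_k + h_k f(x_k) with a learnable step size h_k, and the loss adds a penalty on the resulting time horizon T = sum_k h_k. The class names, the quadratic penalty, and the regularization weight are illustrative assumptions; the authors' actual implementation lives in the linked repository.

```python
import torch
import torch.nn as nn

class TimeStepResBlock(nn.Module):
    """Residual block x_{k+1} = x_k + h_k * f(x_k) with learnable step size h_k."""
    def __init__(self, dim):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))
        self.h = nn.Parameter(torch.tensor(1.0))  # learnable time step

    def forward(self, x):
        return x + self.h * self.f(x)

class TimeVariableResNet(nn.Module):
    def __init__(self, dim, depth, num_classes):
        super().__init__()
        self.blocks = nn.ModuleList([TimeStepResBlock(dim) for _ in range(depth)])
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        for blk in self.blocks:
            x = blk(x)
        return self.head(x)

    def time_horizon(self):
        # T = sum_k h_k, the length of the underlying time interval
        return torch.stack([blk.h for blk in self.blocks]).sum()

model = TimeVariableResNet(dim=64, depth=8, num_classes=10)
loss_fn = nn.CrossEntropyLoss()
lam = 1e-3  # regularization weight (hypothetical value)

x, y = torch.randn(32, 64), torch.randint(0, 10, (32,))
loss = loss_fn(model(x), y) + lam * model.time_horizon() ** 2
loss.backward()
```

Blocks whose learned h_k collapse toward zero contribute little to the dynamics, which hints at how an adaptive pruning criterion could remove them during training.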
Related papers
- Time-Parameterized Convolutional Neural Networks for Irregularly Sampled
Time Series [26.77596449192451]
Irregularly sampled time series are ubiquitous in several application domains, leading to sparse, incompletely observed, and non-aligned observations.
Standard recurrent neural networks (RNNs) and convolutional neural networks (CNNs) assume regular spacing between observation times, which poses significant challenges for irregular time series modeling.
We parameterize convolutional layers by employing kernels that depend explicitly on the irregular observation times.
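As an illustration of the idea (not the authors' exact layer), one can generate kernel weights from the continuous observation times with a small hypernetwork, so irregular gaps enter the convolution explicitly. The layer name and architecture below are hypothetical.

```python
import torch
import torch.nn as nn

class TimeParamConv(nn.Module):
    """Illustrative layer: kernel weights are a function of observation times."""
    def __init__(self, in_ch, out_ch, hidden=16):
        super().__init__()
        self.kernel_net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, out_ch * in_ch))
        self.in_ch, self.out_ch = in_ch, out_ch

    def forward(self, x, times):
        # x: (batch, in_ch, T) observations; times: (T,) irregular time stamps
        offsets = (times - times[0]).unsqueeze(-1)        # (T, 1) continuous offsets
        k = self.kernel_net(offsets).view(-1, self.out_ch, self.in_ch)
        # weight every observation by its own time-dependent kernel slice
        return torch.einsum('bct,toc->bo', x, k)          # (batch, out_ch)

layer = TimeParamConv(in_ch=3, out_ch=8)
x = torch.randn(4, 3, 20)
times = torch.sort(torch.rand(20)).values                 # irregular sampling
out = layer(x, times)                                     # shape (4, 8)
```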
arXiv Detail & Related papers (2023-08-06T21:10:30Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
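For context, a minimal large-kernel 1D CNN classifier in PyTorch looks as follows; this is a generic stand-in, not the authors' LKCNN architecture, and the kernel size is an arbitrary choice.

```python
import torch
import torch.nn as nn

# Minimal large-kernel 1D CNN for regular-vs-chaotic classification (generic stand-in).
model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=64, padding=32),  # large kernel spans many periods
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(16, 2),                              # two classes: regular / chaotic
)
logits = model(torch.randn(8, 1, 1024))            # batch of 8 univariate series
```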
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood of a node.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
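A rough sketch of whole-neighborhood temporal aggregation, hedged as a generic illustration rather than the authors' aggregation-and-propagation blocks: for a query (node, time), aggregate the features of every neighbor whose interaction happened before that time, instead of a sampled subset.

```python
import torch

def temporal_aggregate(node_feat, edges, times, query_node, query_time):
    """Mean-aggregate features of every historical neighbor of query_node
    up to query_time (generic sketch, not the authors' AP blocks)."""
    src, dst = edges                                    # (E,), (E,) interaction lists
    mask = (src == query_node) & (times <= query_time)  # whole past neighborhood
    neighbors = dst[mask]
    if neighbors.numel() == 0:
        return torch.zeros(node_feat.size(1))
    return node_feat[neighbors].mean(dim=0)

feat = torch.randn(10, 8)                               # 10 nodes, 8-dim features
edges = (torch.randint(0, 10, (50,)), torch.randint(0, 10, (50,)))
times = torch.rand(50)
h = temporal_aggregate(feat, edges, times, query_node=3, query_time=0.5)
```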
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- TodyNet: Temporal Dynamic Graph Neural Network for Multivariate Time
Series Classification [6.76723360505692]
We propose a novel temporal dynamic graph neural network (TodyNet) that can extract hidden temporal dependencies without a predefined graph structure.
The experiments on 26 UEA benchmark datasets illustrate that the proposed TodyNet outperforms existing deep learning-based methods in multivariate time series classification (MTSC) tasks.
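One way to illustrate learning a graph when none is given (a generic sketch, not TodyNet's actual blocks): treat the adjacency among the variables of a multivariate series as a function of learnable node embeddings and propagate features along it.

```python
import torch
import torch.nn as nn

class LearnedGraphConv(nn.Module):
    """Learn an adjacency among series channels when no graph is given (sketch)."""
    def __init__(self, num_vars, dim):
        super().__init__()
        self.emb = nn.Parameter(torch.randn(num_vars, 8))   # node embeddings
        self.lin = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (batch, num_vars, dim) per-variable features of a multivariate series
        adj = torch.softmax(self.emb @ self.emb.T, dim=-1)  # learned dense graph
        return torch.relu(self.lin(adj @ x))                # propagate along it

layer = LearnedGraphConv(num_vars=6, dim=32)
out = layer(torch.randn(4, 6, 32))                          # (4, 6, 32)
```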
arXiv Detail & Related papers (2023-04-11T09:21:28Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
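For readers unfamiliar with snnTorch, a minimal leaky integrate-and-fire loop with its standard API looks as follows; the IPU-specific compilation from the paper's release is omitted, and layer sizes are arbitrary.

```python
import torch
import torch.nn as nn
import snntorch as snn

# Minimal leaky integrate-and-fire layer with snnTorch's standard API;
# IPU-specific optimizations from the paper's release are omitted here.
fc = nn.Linear(784, 10)
lif = snn.Leaky(beta=0.9)          # membrane decay per time step

mem = lif.init_leaky()
spikes = []
x = torch.randn(25, 1, 784)        # 25 simulation time steps
for step in range(x.size(0)):
    cur = fc(x[step])
    spk, mem = lif(cur, mem)       # spike output and updated membrane potential
    spikes.append(spk)
out = torch.stack(spikes).sum(0)   # spike counts as class scores
```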
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Task-Synchronized Recurrent Neural Networks [0.0]
When data are sampled irregularly in time, Recurrent Neural Networks (RNNs) traditionally involve ignoring this fact, feeding the time differences as additional inputs, or resampling the data.
We propose an elegant, straightforward alternative approach: instead, the RNN is in effect resampled in time to match the timing of the data or the task at hand.
We confirm empirically that our models can effectively compensate for the time-non-uniformity of the data and demonstrate that they compare favorably to data resampling, classical RNN methods, and alternative RNN models.
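One plausible way to realize this, sketched here as an illustration rather than the paper's exact cell: let the hidden state decay over the actually elapsed time Δt with learnable time constants, so the recurrence advances in data time.

```python
import torch
import torch.nn as nn

class TimeSyncCell(nn.Module):
    """RNN cell whose state advances by the observed time gap dt (illustrative)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim + hid_dim, hid_dim)
        self.log_tau = nn.Parameter(torch.zeros(hid_dim))  # learnable time constants

    def forward(self, x, h, dt):
        # exponential decay of the old state over the actual elapsed time
        decay = torch.exp(-dt / torch.exp(self.log_tau))
        target = torch.tanh(self.lin(torch.cat([x, h], dim=-1)))
        return decay * h + (1 - decay) * target

cell = TimeSyncCell(in_dim=3, hid_dim=16)
h = torch.zeros(1, 16)
for x, dt in [(torch.randn(1, 3), torch.tensor(0.1)),
              (torch.randn(1, 3), torch.tensor(2.3))]:     # irregular gaps
    h = cell(x, h, dt)
```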
arXiv Detail & Related papers (2022-04-11T15:27:40Z)
- AdaS: Adaptive Scheduling of Stochastic Gradients [50.80697760166045]
We introduce the notions of "knowledge gain" and "mapping condition" and propose a new algorithm called Adaptive Scheduling (AdaS).
Experimentation reveals that, using the derived metrics, AdaS exhibits: (a) faster convergence and superior generalization over existing adaptive learning methods; and (b) lack of dependence on a validation set to determine when to stop training.
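AdaS derives per-layer learning rates from low-rank structure in the weights. The sketch below only gestures at that idea with a crude spectral-energy proxy for "knowledge gain"; the function names and the metric are our assumptions, not the paper's exact quantities.

```python
import torch

def spectral_energy_ratio(weight, k=3):
    """Crude proxy for 'knowledge gain': share of spectral energy in the
    top-k singular values of the (unfolded) weight matrix. Our assumption,
    not the paper's exact metric."""
    s = torch.linalg.svdvals(weight.detach().flatten(1))
    return (s[:k].sum() / s.sum()).item()

def set_per_layer_lr(optimizer, layers, base_lr=0.1):
    # assumes one optimizer param group per layer, in the same order
    for group, layer in zip(optimizer.param_groups, layers):
        group['lr'] = base_lr * spectral_energy_ratio(layer.weight)

layers = [torch.nn.Linear(32, 32) for _ in range(3)]
opt = torch.optim.SGD([{'params': l.parameters()} for l in layers], lr=0.1)
set_per_layer_lr(opt, layers)   # e.g. call once per epoch, before stepping
```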
arXiv Detail & Related papers (2020-06-11T16:36:31Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
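A minimal sketch of a liquid time-constant cell, under our reading of the dynamics dh/dt = -(1/τ + f(x,h)) h + f(x,h) A with a fused semi-implicit Euler step; wiring and gating details are simplified relative to the paper.

```python
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    """Liquid time-constant cell with a fused semi-implicit Euler step
    (our reading of the paper; details simplified)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(in_dim + hid_dim, hid_dim), nn.Sigmoid())
        self.tau = nn.Parameter(torch.ones(hid_dim))   # base time constants
        self.A = nn.Parameter(torch.zeros(hid_dim))    # equilibrium/bias vector

    def forward(self, x, h, dt=0.1):
        f = self.f(torch.cat([x, h], dim=-1))          # input-dependent gate
        # dh/dt = -(1/tau + f) * h + f * A  ->  fused update:
        return (h + dt * f * self.A) / (1 + dt * (1 / self.tau + f))

cell = LTCCell(in_dim=4, hid_dim=16)
h = torch.zeros(1, 16)
for t in range(10):
    h = cell(torch.randn(1, 4), h)
```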
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Time Dependence in Non-Autonomous Neural ODEs [74.78386661760662]
We propose a novel family of Neural ODEs with time-varying weights.
We outperform previous Neural ODE variants in both speed and representational capacity.
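A hedged sketch of the time-varying-weights idea: make the vector field's weight matrix an explicit function W(t), here produced by a small hypernetwork, and integrate with plain Euler steps (the authors' parameterization and solver may differ).

```python
import torch
import torch.nn as nn

class TimeVaryingODE(nn.Module):
    """dx/dt = tanh(x @ W(t)): weights are an explicit function of time (sketch)."""
    def __init__(self, dim, hidden=16):
        super().__init__()
        self.w_net = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                                   nn.Linear(hidden, dim * dim))
        self.dim = dim

    def forward(self, t, x):
        W = self.w_net(t.view(1, 1)).view(self.dim, self.dim)  # W(t)
        return torch.tanh(x @ W)

ode = TimeVaryingODE(dim=8)
x, t, dt = torch.randn(4, 8), torch.tensor(0.0), 0.05
for _ in range(20):                    # simple explicit Euler integration
    x = x + dt * ode(t, x)
    t = t + dt
```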
arXiv Detail & Related papers (2020-05-05T01:41:46Z)
- Time Series Data Augmentation for Neural Networks by Time Warping with a
Discriminative Teacher [17.20906062729132]
We propose a novel time series data augmentation method called guided warping.
Guided warping exploits the element-alignment properties of Dynamic Time Warping (DTW) and shapeDTW.
We evaluate the method on all 85 datasets in the 2015 UCR Time Series Archive with a deep convolutional neural network (CNN) and a recurrent neural network (RNN).
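A simplified rendition of guided warping (our illustration; the paper additionally uses shapeDTW's shape descriptors and a discriminative teacher selection): align a sample to an intra-class reference with DTW and average it onto the reference's time axis.

```python
import numpy as np

def dtw_path(a, b):
    """Classic DTW alignment path between 1-D series a and b."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = abs(a[i-1] - b[j-1]) + min(D[i-1, j], D[i, j-1], D[i-1, j-1])
    path, i, j = [], n, m           # backtrack from (n, m)
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i-1, j-1], D[i-1, j], D[i, j-1]])
        if step == 0: i, j = i - 1, j - 1
        elif step == 1: i -= 1
        else: j -= 1
    return path[::-1]

def guided_warp(sample, teacher):
    """Warp `sample` onto the teacher's time axis along the DTW path
    (simplified; the paper's shapeDTW variant aligns shape descriptors)."""
    warped = np.zeros_like(teacher)
    counts = np.zeros_like(teacher)
    for i, j in dtw_path(sample, teacher):
        warped[j] += sample[i]
        counts[j] += 1
    return warped / np.maximum(counts, 1)

rng = np.random.default_rng(0)
teacher = np.sin(np.linspace(0, 6, 100))
sample = np.sin(np.linspace(0, 6, 100) ** 1.05) + 0.05 * rng.standard_normal(100)
aug = guided_warp(sample, teacher)   # new training sample with the teacher's timing
```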
arXiv Detail & Related papers (2020-04-19T06:33:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.