Using deep neural networks to detect non-analytically defined expert event labels in canoe sprint force sensor signals
- URL: http://arxiv.org/abs/2407.08395v1
- Date: Thu, 11 Jul 2024 10:59:11 GMT
- Title: Using deep neural networks to detect non-analytically defined expert event labels in canoe sprint force sensor signals
- Authors: Sarah Rockstrok, Patrick Frenzel, Daniel Matthes, Kay Schubert, David Wollburg, Mirco Fuchs
- Abstract summary: This paper explores convolutional neural networks (CNNs) and recurrent neural networks (RNNs) in terms of their ability to automatically predict paddle stroke events.
In our results, an RNN based on bidirectional gated recurrent units (BGRUs) turned out to be the most suitable model for paddle stroke detection.
- Score: 3.1446633690603356
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Assessing an athlete's performance in canoe sprint is often established by measuring a variety of kinematic parameters during training sessions. Many of these parameters are related to single or multiple paddle stroke cycles. Determining on- and offset of these cycles in force sensor signals is usually not straightforward and requires human interaction. This paper explores convolutional neural networks (CNNs) and recurrent neural networks (RNNs) in terms of their ability to automatically predict these events. In addition, our work proposes an extension to the recently published SoftED metric for event detection in order to properly assess the model performance on time windows. In our results, an RNN based on bidirectional gated recurrent units (BGRUs) turned out to be the most suitable model for paddle stroke detection.
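To make the abstract concrete, the sketch below shows one way a bidirectional GRU could be framed for per-timestep event prediction on a force sensor signal. This is a minimal PyTorch sketch under assumed shapes and hyperparameters (the model name, hidden size, and number of event classes are illustrative), not the authors' implementation.

```python
# Minimal sketch (assumed architecture, not the paper's exact model): a bidirectional
# GRU that maps a force signal to per-timestep event probabilities, framing paddle
# stroke on-/offset detection as frame-wise classification.
import torch
import torch.nn as nn

class StrokeEventBGRU(nn.Module):
    def __init__(self, in_channels=1, hidden_size=64, num_layers=2, num_events=2):
        super().__init__()
        self.bgru = nn.GRU(
            input_size=in_channels,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
            bidirectional=True,
        )
        # one logit per event type (e.g. stroke onset / offset) at every time step
        self.head = nn.Linear(2 * hidden_size, num_events)

    def forward(self, x):
        # x: (batch, time, in_channels) force samples
        features, _ = self.bgru(x)      # (batch, time, 2 * hidden_size)
        return self.head(features)      # (batch, time, num_events) logits

model = StrokeEventBGRU()
force = torch.randn(8, 500, 1)          # 8 windows of 500 force samples each
probs = torch.sigmoid(model(force))     # per-timestep event probabilities
print(probs.shape)                      # torch.Size([8, 500, 2])
```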
Related papers
- NIDS Neural Networks Using Sliding Time Window Data Processing with Trainable Activations and its Generalization Capability [0.0]
This paper presents neural networks for network intrusion detection systems (NIDS) that operate on flow data preprocessed with a time window.
It requires only eleven features, which do not rely on deep packet inspection, can be found in most NIDS datasets, and are easily obtained from conventional flow collectors.
The reported training accuracy exceeds 99% for the proposed method with as few as twenty neural network input features. A minimal sliding-window preprocessing sketch follows this entry.
arXiv Detail & Related papers (2024-10-24T11:36:19Z)
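The entry above describes flow data preprocessed with a time window. The following sketch illustrates the general idea of sliding-time-window aggregation over flow records; the window length, the `Flow` fields, and the aggregate features are illustrative assumptions, not the paper's eleven-feature set.

```python
# Illustrative sliding-time-window aggregation over flow records (assumed fields,
# not the paper's feature set).
from dataclasses import dataclass

@dataclass
class Flow:
    timestamp: float   # end time of the flow, in seconds
    bytes: int
    packets: int
    duration: float    # seconds

def window_features(flows, t_end, window=60.0):
    """Aggregate all flows that ended within [t_end - window, t_end]."""
    recent = [f for f in flows if t_end - window <= f.timestamp <= t_end]
    if not recent:
        return [0.0, 0.0, 0.0, 0.0]
    n = len(recent)
    total_bytes = sum(f.bytes for f in recent)
    total_packets = sum(f.packets for f in recent)
    mean_duration = sum(f.duration for f in recent) / n
    return [float(n), float(total_bytes), float(total_packets), mean_duration]

flows = [Flow(10.0, 1500, 3, 0.2), Flow(42.0, 800, 2, 0.1), Flow(95.0, 60, 1, 0.01)]
print(window_features(flows, t_end=100.0))   # features for the last 60 seconds
```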
- Iteration over event space in time-to-first-spike spiking neural networks for Twitter bot classification [2.578034989438381]
This study proposes a framework that extends existing time-to-first-spike spiking neural network (SNN) models.
We explain spike propagation through a model with multiple input and output spikes at each neuron, as well as design training rules for end-to-end backpropagation.
The model is trained and evaluated on a Twitter bot detection task where the time of events (tweets and retweets) is the primary carrier of information.
arXiv Detail & Related papers (2024-06-03T17:03:14Z)
- From "What" to "When" -- a Spiking Neural Network Predicting Rare Events and Time to their Occurrence [0.0]
This paper presents a novel approach to learning such a predictive model with an SNN consisting of leaky integrate-and-fire (LIF) neurons.
The proposed method leverages specially designed local synaptic plasticity rules and a novel columnar-layered SNN architecture.
It was demonstrated that the SNN described in this paper gives superior prediction accuracy in comparison with precise machine learning techniques.
arXiv Detail & Related papers (2023-11-09T08:47:23Z)
- Sliding Window Neural Generated Tracking Based on Measurement Model [0.0]
This paper explores the efficacy of a feedforward neural network in predicting drone tracks, with the eventual aim of comparing the tracks produced by the Kalman filter with those produced by the proposed neural network.
The unique feature of the proposed neural network tracker is that it uses only a measurement model to estimate the next states of the track; a minimal sketch follows this entry.
arXiv Detail & Related papers (2023-06-10T12:57:57Z)
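As a rough illustration of the measurement-only tracker described above, the sketch below maps a sliding window of past 2D position measurements to the next predicted state with a small feedforward network. The window length, dimensions, and layer sizes are assumptions, not the paper's configuration.

```python
# Hypothetical measurement-only tracker: a feedforward network maps a window of
# past 2D position measurements to the next predicted position.
import torch
import torch.nn as nn

WINDOW, MEAS_DIM, STATE_DIM = 5, 2, 2   # assumed sizes

tracker = nn.Sequential(
    nn.Flatten(),                        # (batch, WINDOW, MEAS_DIM) -> (batch, WINDOW * MEAS_DIM)
    nn.Linear(WINDOW * MEAS_DIM, 64),
    nn.ReLU(),
    nn.Linear(64, STATE_DIM),            # predicted next position
)

measurements = torch.randn(16, WINDOW, MEAS_DIM)   # a batch of measurement windows
next_state = tracker(measurements)
print(next_state.shape)                            # torch.Size([16, 2])
```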
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting; a simplified illustration of handling irregular sampling follows this entry.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
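The sketch below is a simplified stand-in, not the CTRNN formulation referenced above: a GRU cell whose hidden state is exponentially decayed according to the time gap between irregular observations before each update. The decay parameterization and all sizes are assumptions.

```python
# Simplified illustration of handling irregular observation times: decay the hidden
# state with the elapsed time before each recurrent update (not the paper's CTRNN).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecayGRUCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.log_decay = nn.Parameter(torch.zeros(hidden_size))  # learnable decay rates

    def forward(self, x, h, dt):
        # dt: (batch, 1) time elapsed since the previous observation
        decay = torch.exp(-F.softplus(self.log_decay) * dt)      # (batch, hidden_size)
        return self.cell(x, h * decay)

cell = DecayGRUCell(input_size=3, hidden_size=32)
h = torch.zeros(4, 32)
for x, dt in [(torch.randn(4, 3), torch.full((4, 1), 0.5)),
              (torch.randn(4, 3), torch.full((4, 1), 2.0))]:
    h = cell(x, h, dt)
print(h.shape)   # torch.Size([4, 32])
```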
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
- Neuroevolution of a Recurrent Neural Network for Spatial and Working Memory in a Simulated Robotic Environment [57.91534223695695]
We evolved weights in a biologically plausible recurrent neural network (RNN) using an evolutionary algorithm to replicate the behavior and neural activity observed in rats.
Our method demonstrates how the dynamic activity in evolved RNNs can capture interesting and complex cognitive behavior.
arXiv Detail & Related papers (2021-02-25T02:13:52Z)
- Dynamic Time Warping as a New Evaluation for Dst Forecast with Machine Learning [0.0]
We train a neural network to make a forecast of the disturbance storm time index at origin time $t$ with a forecasting horizon of 1 to 6 hours.
Inspection of the model's results with the correlation coefficient and RMSE indicated a performance comparable to that of the latest publications.
A new method based on dynamic time warping is proposed to measure whether two time series are shifted in time with respect to each other; a minimal sketch follows this entry.
arXiv Detail & Related papers (2020-06-08T15:14:13Z)
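The following sketch is a standard dynamic time warping (DTW) alignment plus a naive time-shift estimate taken as the mean index offset along the optimal warping path. The shift estimate illustrates the idea only; it is an assumption, not the paper's exact evaluation measure.

```python
# Standard DTW with backtracking, plus a naive time-shift estimate: the mean index
# offset along the optimal warping path (illustrative, not the paper's definition).
import numpy as np

def dtw_path(a, b):
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # backtrack the optimal alignment from (n, m) to (0, 0)
    path, i, j = [], n, m
    while i > 0 or j > 0:
        path.append((i - 1, j - 1))
        i, j = min([(i - 1, j - 1), (i - 1, j), (i, j - 1)], key=lambda s: cost[s])
    return cost[n, m], path[::-1]

t = np.linspace(0, 4 * np.pi, 200)
forecast, truth = np.sin(t), np.sin(t - 0.5)       # the second series lags the first
dist, path = dtw_path(forecast, truth)
mean_shift = np.mean([j - i for i, j in path])      # > 0 indicates a delayed second series
print(round(dist, 2), round(mean_shift, 2))
```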
- Temporal Pulses Driven Spiking Neural Network for Fast Object Recognition in Autonomous Driving [65.36115045035903]
We propose an approach that addresses the object recognition problem directly from raw temporal pulses using a spiking neural network (SNN).
Evaluated on various datasets, the proposed method shows performance comparable to state-of-the-art methods while achieving remarkable time efficiency.
arXiv Detail & Related papers (2020-01-24T22:58:55Z)