Abstract: This paper introduces a novel method to fine-tune handwriting recognition
systems based on Recurrent Neural Networks (RNNs). Long Short-Term Memory (LSTM)
networks model long sequences well, but they tend to overfit as training
progresses. To improve the system's ability to model sequences, we propose
dropping information at random positions in the sequence, an approach we call
Temporal Dropout (TD). We apply TD both at the image level and to internal
network representations. We show that TD improves results on two different
datasets, and our method outperforms the previous state of the art on the
Rodrigo dataset.
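The core idea of dropping information at random time positions can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the per-time-step Bernoulli masking, and the drop probability are assumptions made for the example.

```python
import numpy as np

def temporal_dropout(x, p=0.1, rng=None):
    """Zero out entire time steps of a sequence with probability p.

    x   : array of shape (T, D) -- T time steps, D features per step.
    p   : probability of dropping a given time step.
    rng : optional numpy Generator for reproducibility.

    Illustrative sketch of the Temporal Dropout idea; the real method
    may scale kept activations or differ in other details.
    """
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(x.shape[0]) >= p      # one Bernoulli draw per time step
    return x * keep[:, None]                # broadcast mask across features

# Example: a 6-step sequence of 3-dimensional features
x = np.ones((6, 3))
y = temporal_dropout(x, p=0.5, rng=np.random.default_rng(0))
```

After the call, each row of `y` is either all zeros (a dropped time step) or an unchanged copy of the corresponding row of `x`.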