Recurrent Neural Network Regularization
Venue
arXiv preprint arXiv:1409.2329
Publication Year
2014
Authors
Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals
BibTeX
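@article{zaremba2014recurrent,
  title   = {Recurrent Neural Network Regularization},
  author  = {Zaremba, Wojciech and Sutskever, Ilya and Vinyals, Oriol},
  journal = {arXiv preprint arXiv:1409.2329},
  year    = {2014}
}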
Abstract
We present a simple regularization technique for Recurrent Neural Networks (RNNs)
with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique
for regularizing neural networks, does not work well with RNNs and LSTMs. In this
paper, we show how to correctly apply dropout to LSTMs, and show that it
substantially reduces overfitting on a variety of tasks. These tasks include
language modeling, speech recognition, image caption generation, and machine
translation.
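The paper's prescription is to apply the dropout operator only to the non-recurrent connections of a stacked LSTM, i.e. to the activations passed upward between layers at the same time step, while leaving the recurrent hidden state carried across time steps untouched. A minimal sketch of that idea follows, assuming PyTorch; the class name DropoutLSTM and all hyperparameter values are illustrative, not taken from the paper.

import torch
import torch.nn as nn

class DropoutLSTM(nn.Module):
    """Stacked LSTM with dropout on non-recurrent (between-layer) connections only."""

    def __init__(self, input_size, hidden_size, num_layers=2, dropout=0.5):
        super().__init__()
        sizes = [input_size] + [hidden_size] * num_layers
        self.cells = nn.ModuleList(
            [nn.LSTMCell(sizes[i], sizes[i + 1]) for i in range(num_layers)]
        )
        self.drop = nn.Dropout(dropout)

    def forward(self, x):
        # x: (seq_len, batch, input_size)
        batch = x.size(1)
        states = [
            (x.new_zeros(batch, cell.hidden_size),
             x.new_zeros(batch, cell.hidden_size))
            for cell in self.cells
        ]
        outputs = []
        for x_t in x:  # iterate over time steps
            inp = x_t
            for i, cell in enumerate(self.cells):
                # Dropout on the vertical (non-recurrent) input to each layer;
                # the recurrent state (h, c) passed across time steps is not dropped.
                h, c = cell(self.drop(inp), states[i])
                states[i] = (h, c)
                inp = h
            outputs.append(inp)
        return torch.stack(outputs), states

For comparison, PyTorch's built-in nn.LSTM(dropout=p) behaves similarly for stacked layers: it applies dropout to the outputs of each layer except the last, but never to the connections across time steps.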
