Explaining how Gated Recurrent Neural Networks work
In this article, I will explore a popular gated variant of recurrent neural networks (RNNs): the gated recurrent unit (GRU).
GRUs were introduced in 2014 by Kyunghyun Cho et al. and improve on vanilla RNNs because they suffer less from the vanishing gradient problem.
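To preview the mechanism the rest of the article unpacks, here is a minimal NumPy sketch of a single GRU time step following the Cho et al. (2014) formulation. The function name `gru_cell` and the parameter layout (`Wz`, `Uz`, `bz`, ...) are my own illustrative choices, not a reference implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU time step (Cho et al., 2014). Illustrative sketch."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)             # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)             # reset gate
    h_cand = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)  # candidate state
    # Interpolating between the old state and the candidate lets
    # gradients flow through the (1 - z) * h_prev path with little
    # attenuation, which is why GRUs mitigate vanishing gradients.
    return (1.0 - z) * h_prev + z * h_cand

# Tiny usage example with random parameters (hidden size 4, input size 3).
rng = np.random.default_rng(0)
n_h, n_x = 4, 3
params = (
    rng.normal(size=(n_h, n_x)), rng.normal(size=(n_h, n_h)), np.zeros(n_h),
    rng.normal(size=(n_h, n_x)), rng.normal(size=(n_h, n_h)), np.zeros(n_h),
    rng.normal(size=(n_h, n_x)), rng.normal(size=(n_h, n_h)), np.zeros(n_h),
)
h = gru_cell(rng.normal(size=n_x), np.zeros(n_h), params)
print(h.shape)  # (4,)
```

The two gates are what distinguish a GRU from a vanilla RNN: the update gate `z` decides how much of the previous hidden state to carry forward, and the reset gate `r` decides how much of it to expose when computing the candidate state.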