Gated Recurrent Units (GRU) — Improving RNNs

Explaining how Gated Recurrent Neural Networks work


Neural network icons created by juicy_fish, from Flaticon (https://www.flaticon.com/free-icons/neural-network).

In this article, I will explore one of the most widely used variants of recurrent neural networks (RNNs): gated recurrent units (GRUs).

GRUs were introduced in 2014 by Kyunghyun Cho et al. and are an improvement over vanilla RNNs, as they suffer less from the vanishing gradient problem…
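To make the gating idea concrete before diving into the details, here is a minimal NumPy sketch of a single GRU step. The weight names (Wz, Uz, and so on), the toy dimensions, and the random initialization are all illustrative assumptions, not taken from the original paper; the final interpolation follows the Cho et al. convention, where the update gate z weights the previous hidden state (some references write the equivalent form with z and 1 - z swapped).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, p):
    """One GRU step: mixes the previous hidden state with a candidate state."""
    z = sigmoid(p["Wz"] @ x_t + p["Uz"] @ h_prev + p["bz"])              # update gate
    r = sigmoid(p["Wr"] @ x_t + p["Ur"] @ h_prev + p["br"])              # reset gate
    h_tilde = np.tanh(p["Wh"] @ x_t + p["Uh"] @ (r * h_prev) + p["bh"])  # candidate state
    # z close to 1 keeps the old state; z close to 0 adopts the candidate
    return z * h_prev + (1.0 - z) * h_tilde

# Toy usage: 4-dimensional inputs, 3-dimensional hidden state (hypothetical sizes)
rng = np.random.default_rng(0)
n_in, n_h = 4, 3
p = {name: rng.normal(scale=0.1, size=shape)
     for name, shape in [("Wz", (n_h, n_in)), ("Uz", (n_h, n_h)),
                         ("Wr", (n_h, n_in)), ("Ur", (n_h, n_h)),
                         ("Wh", (n_h, n_in)), ("Uh", (n_h, n_h))]}
p.update(bz=np.zeros(n_h), br=np.zeros(n_h), bh=np.zeros(n_h))

h = np.zeros(n_h)                          # initial hidden state
for x_t in rng.normal(size=(5, n_in)):     # a short random input sequence
    h = gru_cell(x_t, h, p)
print(h)
```

Because the new hidden state is a convex combination of the previous state and the candidate, gradients can flow through the z * h_prev path largely unattenuated, which is, informally, why GRUs cope better with vanishing gradients than vanilla RNNs.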