Weight normalization is a simple and effective reparameterization technique for speeding up the training of deep neural networks. Rather than optimizing each weight vector w directly, it decouples the vector's direction from its magnitude, writing w = (g / ||v||) v, where g is a learned scalar and v is a learned vector with the same shape as w. By reparameterizing the incoming weights of each unit this way, weight normalization improves the conditioning of the gradient and reduces the network's sensitivity to weight initialization, which often speeds up the convergence of stochastic gradient descent. This article explores the benefits and implementation of weight normalization for accelerating deep learning.
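To make the reparameterization concrete, here is a minimal sketch in PyTorch using torch.nn.utils.weight_norm (in recent PyTorch releases this helper is deprecated in favor of torch.nn.utils.parametrizations.weight_norm, but the idea is the same). The wrapped layer stores the magnitude g and direction v as separate trainable parameters and recomputes the effective weight w = (g / ||v||) v on each forward pass; the layer dimensions and inputs below are arbitrary choices for illustration.

```python
import torch
import torch.nn as nn

# A plain linear layer with weight matrix of shape (40, 20).
layer = nn.Linear(20, 40)

# Wrap it with weight normalization. PyTorch replaces layer.weight
# with two trainable tensors: weight_g (per-unit magnitude g) and
# weight_v (direction v), and recomputes
#   weight = weight_g * weight_v / ||weight_v||
# row by row (dim=0, i.e. per output unit) before each forward pass.
layer = nn.utils.weight_norm(layer)

x = torch.randn(8, 20)
y = layer(x)                  # forward pass uses the recomputed weight
print(y.shape)                # torch.Size([8, 40])
print(layer.weight_g.shape)   # torch.Size([40, 1]): one scale per output unit
print(layer.weight_v.shape)   # torch.Size([40, 20]): the direction vectors
```

Because g and v receive gradients separately, the optimizer can adjust the scale of each weight vector independently of its direction, which is the source of the improved conditioning described above.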