allennlp.modules.layer_norm

LayerNorm

LayerNorm(self, dimension: int) -> None

An implementation of Layer Normalization (Ba et al., 2016).

Layer Normalization stabilises the training of deep neural networks by normalising the outputs of neurons from a particular layer. It computes:

output = (gamma * (tensor - mean) / (std + eps)) + beta

where mean and std are computed over the last dimension of the input tensor, gamma and beta are learned per-dimension scale and shift parameters, and eps is a small constant that guards against division by zero.
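
As a minimal sketch, this computation can be written in PyTorch roughly as follows. The class name LayerNormSketch and the eps value 1e-6 are illustrative assumptions, not the library's verbatim source:

```python
import torch


class LayerNormSketch(torch.nn.Module):
    # A minimal sketch of the formula above; not the library's exact code.

    def __init__(self, dimension: int) -> None:
        super().__init__()
        # gamma (scale) starts at 1 and beta (shift) at 0, so the module
        # initially performs plain standardisation.
        self.gamma = torch.nn.Parameter(torch.ones(dimension))
        self.beta = torch.nn.Parameter(torch.zeros(dimension))

    def forward(self, tensor: torch.Tensor) -> torch.Tensor:
        # Statistics are taken over the last (feature) dimension and kept
        # broadcastable against the input.
        mean = tensor.mean(-1, keepdim=True)
        std = tensor.std(-1, keepdim=True)
        # eps = 1e-6 is an assumed value guarding against division by zero.
        return self.gamma * (tensor - mean) / (std + 1e-6) + self.beta
```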

Parameters

  • dimension : int, required. The dimension of the layer output to normalize.

Returns

The normalized layer output.
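
A short usage sketch (the input shape here is arbitrary; any tensor whose last dimension equals dimension works):

```python
import torch

from allennlp.modules.layer_norm import LayerNorm

# Normalize 10-dimensional feature vectors.
layer_norm = LayerNorm(dimension=10)

inputs = torch.randn(2, 5, 10)  # (batch, sequence, features)
outputs = layer_norm(inputs)

print(outputs.shape)  # torch.Size([2, 5, 10]) -- the input shape is preserved
```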