allennlp.modules.layer_norm
LayerNorm
class LayerNorm(torch.nn.Module):
| def __init__(self, dimension: int) -> None
An implementation of Layer Normalization.
Layer Normalization stabilises the training of deep neural networks by normalising the outputs of neurons from a particular layer: the mean and standard deviation are computed over the last dimension of the input tensor, and learned scale (`gamma`) and shift (`beta`) parameters are then applied. It computes:
output = (gamma * (tensor - mean) / (std + eps)) + beta
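Spelled out in PyTorch, the computation looks roughly like the following (a minimal sketch, not the library's exact implementation; it assumes statistics are taken over the last dimension, and the `eps` value of 1e-6 is illustrative):

```python
import torch

def layer_norm_sketch(tensor: torch.Tensor,
                      gamma: torch.Tensor,
                      beta: torch.Tensor,
                      eps: float = 1e-6) -> torch.Tensor:
    # Per-example statistics over the last dimension of the input.
    mean = tensor.mean(dim=-1, keepdim=True)
    std = tensor.std(dim=-1, keepdim=True)
    # Normalize, then apply the learned scale (gamma) and shift (beta);
    # eps guards against division by zero when std is very small.
    return gamma * (tensor - mean) / (std + eps) + beta
```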
Parameters
- dimension : int
  The dimension of the layer output to normalize.
Returns
- The normalized layer output, a tensor with the same shape as the input.
forward
class LayerNorm(torch.nn.Module):
| ...
| def forward(self, tensor: torch.Tensor)
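A brief usage sketch (the tensor shapes here are illustrative; only the size of the input's last dimension must match `dimension`):

```python
import torch
from allennlp.modules.layer_norm import LayerNorm

layer_norm = LayerNorm(dimension=10)
tensor = torch.randn(2, 5, 10)  # e.g. (batch, sequence, hidden); hidden must equal `dimension`
output = layer_norm(tensor)     # normalized output, same shape: (2, 5, 10)
```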