allennlp.modules.masked_layer_norm

MaskedLayerNorm

class MaskedLayerNorm(torch.nn.Module):
 | def __init__(self, size: int, gamma0: float = 0.1) -> None

See LayerNorm for details.

Note, however, that unlike LayerNorm this norm includes a batch component: the mean and standard deviation are computed over all unmasked positions across the entire batch, rather than independently per instance, and padded positions are excluded from the statistics.
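
As a rough illustration of what "includes a batch component" means, the sketch below computes normalization statistics over the unmasked positions of a whole batch. The helper name is hypothetical and this is not the library's exact code, just the general technique, assuming a (batch, sequence, size) tensor and a (batch, sequence) boolean mask:

```python
import torch

def masked_batch_stats(tensor: torch.Tensor, mask: torch.BoolTensor):
    """Hypothetical helper: mean/std over all unmasked positions in the batch."""
    # Broadcast the (batch, sequence) mask over the feature dimension.
    broadcast_mask = mask.unsqueeze(-1)
    # Count every unmasked scalar across the whole batch.
    num_elements = broadcast_mask.sum() * tensor.size(-1)
    # Mean over all unmasked positions of all instances (the batch component),
    # not per instance as in plain LayerNorm.
    mean = (tensor * broadcast_mask).sum() / num_elements
    # Center, zero out padded positions, and take the matching std.
    centered = (tensor - mean) * broadcast_mask
    std = torch.sqrt((centered * centered).sum() / num_elements)
    # In practice a small epsilon would be added for numerical stability.
    return mean, std
```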

forward

class MaskedLayerNorm(torch.nn.Module):
 | ...
 | def forward(
 |     self,
 |     tensor: torch.Tensor,
 |     mask: torch.BoolTensor
 | ) -> torch.Tensor
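
A short usage sketch, assuming an input of shape (batch, sequence, size) and a boolean mask of shape (batch, sequence) in which False marks padded positions:

```python
import torch
from allennlp.modules.masked_layer_norm import MaskedLayerNorm

# `size` must match the feature (last) dimension of the input.
norm = MaskedLayerNorm(size=4)

tensor = torch.randn(2, 3, 4)  # (batch=2, sequence=3, size=4)
mask = torch.tensor([[True, True, True],
                     [True, True, False]])  # last step of 2nd sequence is padding

output = norm(tensor, mask)  # normalized tensor, same shape as the input
print(output.shape)          # torch.Size([2, 3, 4])
```

Because padded positions are excluded from the statistics, the values stored under the mask should not affect the normalization of real tokens.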