
masked_layer_norm

[ allennlp.modules.masked_layer_norm ]


MaskedLayerNorm Objects

```python
class MaskedLayerNorm(torch.nn.Module):
 | def __init__(self, size: int, gamma0: float = 0.1) -> None
```

See LayerNorm for details.

Note, however, that unlike LayerNorm this norm includes a batch component: the normalization statistics are computed over all unmasked positions pooled across the whole batch, rather than independently per instance, and masked (padding) positions are excluded from those statistics.
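
As a rough illustration of what this means, the sketch below computes the mean and variance only over unmasked positions, pooled across the batch. It is a minimal sketch in plain PyTorch, not necessarily the library's exact implementation; the function name and the `eps` constant are assumptions.

```python
import torch


def masked_layer_norm_sketch(
    tensor: torch.Tensor,    # (batch, num_tokens, size)
    mask: torch.BoolTensor,  # (batch, num_tokens); True marks real tokens
    gamma: torch.Tensor,     # learned scale, shape (1, 1, size)
    beta: torch.Tensor,      # learned shift, shape (1, 1, size)
    eps: float = 1e-6,       # stability constant (assumed value)
) -> torch.Tensor:
    broadcast_mask = mask.unsqueeze(-1).to(tensor.dtype)  # (batch, num_tokens, 1)
    num_elements = broadcast_mask.sum() * tensor.size(-1)
    # Statistics are pooled over ALL unmasked positions in the batch --
    # this is the "batch component" mentioned above.
    mean = (tensor * broadcast_mask).sum() / num_elements
    centered = (tensor - mean) * broadcast_mask
    std = torch.sqrt((centered * centered).sum() / num_elements + eps)
    return gamma * (tensor - mean) / (std + eps) + beta
```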

forward

```python
 | def forward(
 |     self,
 |     tensor: torch.Tensor,
 |     mask: torch.BoolTensor
 | ) -> torch.Tensor
```
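
A minimal usage example, assuming the shapes implied by the signatures above: `tensor` is `(batch, num_tokens, size)` and `mask` is `(batch, num_tokens)` with `True` marking unpadded positions.

```python
import torch
from allennlp.modules.masked_layer_norm import MaskedLayerNorm

norm = MaskedLayerNorm(size=8)

tensor = torch.randn(2, 5, 8)  # (batch=2, num_tokens=5, size=8)
mask = torch.tensor([          # True marks real (unpadded) tokens
    [True, True, True, False, False],
    [True, True, True, True, True],
])

normalized = norm(tensor, mask)
print(normalized.shape)  # torch.Size([2, 5, 8])
```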