allennlp.modules.scalar_mix
class allennlp.modules.scalar_mix.ScalarMix(mixture_size: int, do_layer_norm: bool = False, initial_scalar_parameters: List[float] = None, trainable: bool = True)[source]

    Bases: torch.nn.modules.module.Module

    Computes a parameterised scalar mixture of N tensors:

        mixture = gamma * sum(s_k * tensor_k)

    where s = softmax(w), with w and gamma scalar parameters.

    In addition, if do_layer_norm=True, layer normalization is applied to each tensor before weighting.
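The mixture formula can be sketched in plain PyTorch. This is a minimal illustration of the math, not AllenNLP's implementation; the `scalar_mix` helper is hypothetical:

```python
import torch

def scalar_mix(tensors, weights, gamma):
    # Normalize the per-tensor scalar parameters w into a distribution s = softmax(w).
    s = torch.softmax(weights, dim=0)
    # Weighted sum of the tensors, scaled by the global scalar gamma.
    return gamma * sum(s_k * t for s_k, t in zip(s, tensors))

# Three "layers" of shape (batch_size=2, timesteps=3, dim=4).
layers = [torch.randn(2, 3, 4) for _ in range(3)]
w = torch.zeros(3, requires_grad=True)       # learned scalar weights, one per tensor
gamma = torch.tensor(1.0, requires_grad=True)

mixed = scalar_mix(layers, w, gamma)
```

With all weights equal (as initialized here), softmax yields a uniform distribution, so the mixture reduces to the plain average of the input tensors scaled by gamma.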
    forward(self, tensors: List[torch.Tensor], mask: torch.Tensor = None) → torch.Tensor[source]

        Compute a weighted average of the tensors. The input tensors can be any shape with at least two dimensions, but must all be the same shape.

        When do_layer_norm=True, the mask is a required input. If the tensors are dimensioned (dim_0, ..., dim_{n-1}, dim_n), then the mask is dimensioned (dim_0, ..., dim_{n-1}), as in the typical case with tensors of shape (batch_size, timesteps, dim) and mask of shape (batch_size, timesteps).

        When do_layer_norm=False, the mask is ignored.
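The reason the mask is required with do_layer_norm=True is that the normalization statistics should be computed only over unmasked positions. A minimal sketch of such masked layer normalization, assuming a hypothetical helper `masked_layer_norm` (not the library's internal method):

```python
import torch

def masked_layer_norm(tensor, mask):
    # tensor: (batch_size, timesteps, dim); mask: (batch_size, timesteps).
    # Mean and variance are computed only over unmasked elements.
    broadcast = mask.unsqueeze(-1).float()
    num_elements = broadcast.sum() * tensor.size(-1)
    mean = (tensor * broadcast).sum() / num_elements
    variance = (((tensor - mean) * broadcast) ** 2).sum() / num_elements
    return (tensor - mean) / torch.sqrt(variance + 1e-12)

t = torch.randn(2, 3, 4)
m = torch.tensor([[1, 1, 0], [1, 1, 1]])  # last timestep of the first sequence is padding
out = masked_layer_norm(t, m)
```

Padded positions still receive the affine transform, but they contribute nothing to the statistics, so padding length cannot shift the normalization of real tokens.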