
# allennlp.nn.activations



An `Activation` is just a function that takes some parameters and returns an element-wise activation function. For the most part we just use PyTorch activations; here we provide a thin wrapper that allows registering them and instantiating them `from_params`.
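
For example, registered activations can be looked up by name and instantiated directly. A minimal sketch of that pattern:

```python
import torch

from allennlp.nn import Activation

# `by_name` returns the registered constructor, so we call it once more
# to get the activation module itself.
relu = Activation.by_name("relu")()
print(relu(torch.tensor([-1.0, 0.0, 2.0])))  # tensor([0., 0., 2.])
```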

The available activation functions are listed in the `Registrable._registry[Activation]` mapping at the end of this page.

## Activation

```python
class Activation(torch.nn.Module, Registrable)
```

PyTorch has a number of built-in activation functions. We group those here under a common type, just to make it easier to configure and instantiate them `from_params` using `Registrable`.
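
As a sketch of what that buys you: an activation can come from configuration rather than being hard-coded. This assumes the standard `Params`/`from_params` machinery; in a real model the `"type"` key would usually come from a Jsonnet config file rather than an inline dict.

```python
from allennlp.common import Params
from allennlp.nn import Activation

# Build an activation from a config-style dict instead of hard-coding it.
activation = Activation.from_params(Params({"type": "gelu"}))
```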

Note that we're only including element-wise activation functions in this list. You really need to think about masking when you do a softmax or other similar activation function, so those require a different API.
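
To see why, compare with `allennlp.nn.util.masked_softmax`: softmax couples positions through its normalizing sum, so the mask has to be threaded through the call. A small sketch:

```python
import torch

from allennlp.nn.util import masked_softmax

logits = torch.tensor([[1.0, 2.0, 3.0]])
mask = torch.tensor([[True, True, False]])  # last position is padding

# An element-wise activation could ignore the mask entirely, but softmax
# must exclude padded positions from its normalization.
probs = masked_softmax(logits, mask)
print(probs)  # the masked position gets (near-)zero probability
```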

### `__call__`

```python
class Activation(torch.nn.Module, Registrable):
 | ...
 | def __call__(self, tensor: torch.Tensor) -> torch.Tensor
```

This function is here just to make mypy happy. We expect activation functions to follow this API; the built-in PyTorch activation functions follow this just fine, even though they don't subclass `Activation`. We're just making it explicit here, so mypy knows that activations are callable like this.
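
For instance, a custom activation only needs to satisfy the same tensor-in, tensor-out contract. The `"square"` name and `SquareActivation` class below are hypothetical, for illustration only:

```python
import torch

from allennlp.nn import Activation

@Activation.register("square")  # hypothetical name, not a built-in registration
class SquareActivation(Activation):
    def forward(self, tensor: torch.Tensor) -> torch.Tensor:
        # torch.nn.Module routes __call__ through forward, so this class
        # is callable with the tensor -> tensor signature above.
        return tensor * tensor

square = Activation.by_name("square")()
assert torch.equal(square(torch.tensor([2.0, -3.0])), torch.tensor([4.0, 9.0]))
```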

## `Registrable._registry[Activation]`

```python
Registrable._registry[Activation] = {
    "linear": (lambda: _ActivationLambda(lambda x: x, "Linear"), None),  # type: ignore
    "mish" ...
```