allennlp.nn.activations

An Activation is just a function that takes some parameters and returns an element-wise activation function. For the most part we just use PyTorch activations. Here we provide a thin wrapper to allow registering them and instantiating them from_params.
The available activation functions include "linear" (the identity).
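As a quick illustration of what "element-wise" means here, the "linear" activation is simply the identity applied to each element. This is a pure-Python sketch for clarity, not AllenNLP's actual implementation (which operates on PyTorch tensors):

```python
def linear(x):
    """The "linear" activation: the identity, applied element-wise."""
    return x

# Applying it element-wise over a list of floats leaves every value unchanged.
values = [-2.0, 0.0, 3.5]
print([linear(v) for v in values])  # → [-2.0, 0.0, 3.5]
```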
class allennlp.nn.activations.Activation

Bases: allennlp.common.registrable.Registrable

PyTorch has a number of built-in activation functions. We group those here under a common type, just to make it easier to configure and instantiate them from_params using Registrable. Note that we're only including element-wise activation functions in this list. You really need to think about masking when you do a softmax or other similar activation function, so it requires a different API.
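The register-then-instantiate-by-name pattern described above can be sketched in plain Python. The registry, decorator, and `by_name` helper below are assumptions made for illustration, not AllenNLP's actual Registrable machinery, and the activations use `math` rather than PyTorch so the sketch stays self-contained:

```python
import math

# Hypothetical name-to-function registry mimicking the Registrable pattern.
_ACTIVATIONS = {}

def register(name):
    """Decorator that records an activation under a string name."""
    def decorator(fn):
        _ACTIVATIONS[name] = fn
        return fn
    return decorator

def by_name(name):
    """Look up a registered element-wise activation by its string name."""
    return _ACTIVATIONS[name]

@register("linear")
def linear(x):
    return x  # identity

@register("relu")
def relu(x):
    return max(0.0, x)  # clamp negatives to zero

@register("tanh")
def tanh(x):
    return math.tanh(x)

# A config-driven caller (e.g. from_params) only needs the string name.
activation = by_name("relu")
print([activation(v) for v in [-1.0, 0.5, 2.0]])  # → [0.0, 0.5, 2.0]
```

This indirection is what lets a JSON configuration file say `"activation": "relu"` and have the framework resolve the actual function at instantiation time.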