[ allennlp.modules.highway ]
A Highway layer that does a gated combination of a linear transformation and a non-linear transformation of its input.
class Highway(torch.nn.Module):
    def __init__(
        self,
        input_dim: int,
        num_layers: int = 1,
        activation: Callable[[torch.Tensor], torch.Tensor] = torch.nn.functional.relu,
    ) -> None
A Highway layer does a gated combination of a linear transformation and a
non-linear transformation of its input: :math:`y = g * x + (1 - g) * f(A(x))`,
where :math:`A` is a linear transformation, :math:`f` is an element-wise
non-linearity, and :math:`g` is an element-wise gate, computed as
:math:`g = sigmoid(B(x))` for a second linear transformation :math:`B`.
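The gating formula above can be sketched in plain Python for a single scalar
element. `highway_step`, `A`, and `B` are hypothetical names for illustration;
this is not the AllenNLP implementation, just the arithmetic it describes.

```python
import math

def highway_step(x, A, B, f=lambda v: max(v, 0.0)):
    """One scalar highway step: y = g * x + (1 - g) * f(A(x)),
    with gate g = sigmoid(B(x)). A and B stand in for the two
    linear transformations (here simple scalar callables), and
    f defaults to ReLU."""
    g = 1.0 / (1.0 + math.exp(-B(x)))   # element-wise gate in (0, 1)
    return g * x + (1.0 - g) * f(A(x))

# When B(x) is very positive, g -> 1 and the input is carried through
# almost unchanged; when B(x) is very negative, g -> 0 and the output
# follows the transformed value f(A(x)).
y_carry = highway_step(3.0, A=lambda v: 2.0 * v, B=lambda v: 100.0)
y_transform = highway_step(3.0, A=lambda v: 2.0 * v, B=lambda v: -100.0)
```

The gate thus interpolates between the identity path and the transformed path,
which is what lets gradients flow through deep stacks of these layers.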
This module will apply a fixed number of highway layers to its input, returning the final result.
- input_dim : `int`, required
  The dimensionality of :math:`x`. We assume the input has shape
  `(batch_size, ..., input_dim)`.
- num_layers : `int`, optional (default = `1`)
  The number of highway layers to apply to the input.
- activation : `Callable[[torch.Tensor], torch.Tensor]`, optional (default = `torch.nn.functional.relu`)
  The non-linearity to use in the highway layers.
@overrides
def forward(self, inputs: torch.Tensor) -> torch.Tensor
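Putting the pieces together, the module applies `num_layers` highway steps in
sequence, and `forward` preserves the input dimensionality. The sketch below is
a hypothetical pure-Python stand-in (`TinyHighway` and its helpers are invented
names), using small vectors instead of tensors, to show the stacked-layer
behavior the signature implies.

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

class TinyHighway:
    """Sketch of the Highway interface: each layer computes
    y = g * x + (1 - g) * relu(A(x)) with gate g = sigmoid(B(x)),
    where A and B are per-layer linear maps (biases omitted for
    brevity). Illustration only, not the AllenNLP implementation."""

    def __init__(self, input_dim, num_layers=1, seed=0):
        rng = random.Random(seed)
        self.input_dim = input_dim
        self.layers = []
        for _ in range(num_layers):
            # A and B: randomly initialized square weight matrices.
            A = [[rng.uniform(-0.1, 0.1) for _ in range(input_dim)]
                 for _ in range(input_dim)]
            B = [[rng.uniform(-0.1, 0.1) for _ in range(input_dim)]
                 for _ in range(input_dim)]
            self.layers.append((A, B))

    @staticmethod
    def _matvec(W, x):
        return [sum(w * xi for w, xi in zip(row, x)) for row in W]

    def forward(self, x):
        # Apply the fixed number of highway layers, returning the final result.
        for A, B in self.layers:
            a = self._matvec(A, x)
            b = self._matvec(B, x)
            x = [sigmoid(bi) * xi + (1.0 - sigmoid(bi)) * max(ai, 0.0)
                 for xi, ai, bi in zip(x, a, b)]
        return x

h = TinyHighway(input_dim=4, num_layers=2)
y = h.forward([1.0, -2.0, 0.5, 3.0])
# The output has the same dimensionality as the input, matching the
# documented shape contract (batch_size, ..., input_dim).
```

In the real module the last dimension of the input must equal `input_dim`;
leading dimensions (batch and sequence) are carried through unchanged.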