allennlp.modules.highway

A Highway layer that does a gated combination of a linear transformation and a non-linear transformation of its input.

class allennlp.modules.highway.Highway(input_dim: int, num_layers: int = 1, activation: Callable[[torch.Tensor], torch.Tensor] = torch.nn.functional.relu)

Bases: torch.nn.modules.module.Module

A Highway layer computes a gated combination of a linear transformation and a non-linear transformation of its input: \(y = g * x + (1 - g) * f(A(x))\), where \(A\) is a linear transformation, \(f\) is an element-wise non-linearity, and \(g\) is an element-wise gate, computed as \(\mathrm{sigmoid}(B(x))\).

This module will apply a fixed number of highway layers to its input, returning the final result.
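For illustration, a single highway step following the formula above can be sketched as follows. The helper highway_step and the projections A and B are hypothetical names for this sketch only; the module's internal implementation may organize the projections differently.

    import torch
    import torch.nn.functional as F

    def highway_step(x: torch.Tensor,
                     A: torch.nn.Linear,   # hypothetical linear transformation A
                     B: torch.nn.Linear,   # hypothetical gate projection B
                     f=F.relu) -> torch.Tensor:
        # g is the element-wise gate in (0, 1), computed from the input.
        g = torch.sigmoid(B(x))
        # Gated combination: carry the input where g is high, transform elsewhere.
        return g * x + (1 - g) * f(A(x))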

Parameters
input_dim : int

The dimensionality of \(x\). We assume the input has shape (batch_size, ..., input_dim).

num_layers : int, optional (default = 1)

The number of highway layers to apply to the input.

activation : Callable[[torch.Tensor], torch.Tensor], optional (default = torch.nn.functional.relu)

The non-linearity to use in the highway layers.
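A short usage sketch; the shapes here are arbitrary examples:

    import torch
    from allennlp.modules.highway import Highway

    highway = Highway(input_dim=64, num_layers=2)  # default ReLU activation
    inputs = torch.randn(8, 10, 64)                # (batch_size, ..., input_dim)
    outputs = highway(inputs)
    assert outputs.shape == inputs.shape           # highway layers preserve the input shape

    # An alternative element-wise non-linearity can be supplied:
    tanh_highway = Highway(input_dim=64, num_layers=2, activation=torch.tanh)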

forward(self, inputs: torch.Tensor) → torch.Tensor

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself rather than calling this function directly, since the former takes care of running the registered hooks while the latter silently ignores them.
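Continuing the usage sketch above, this means preferring the instance call over forward:

    outputs = highway(inputs)          # preferred: runs any registered hooks
    outputs = highway.forward(inputs)  # same computation, but hooks are silently skipped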