allennlp.modules.maxout
A maxout neural network.
Maxout¶
class Maxout(torch.nn.Module, FromParams):
| def __init__(
|     self,
|     input_dim: int,
|     num_layers: int,
|     output_dims: Union[int, Sequence[int]],
|     pool_sizes: Union[int, Sequence[int]],
|     dropout: Union[float, Sequence[float]] = 0.0
| ) -> None
This Module is a maxout neural network: each maxout layer applies several linear projections to its input and keeps the element-wise maximum of their outputs.
Parameters¶
- input_dim : `int`
  The dimensionality of the input. We assume the input has shape `(batch_size, input_dim)`.
- num_layers : `int`
  The number of maxout layers to apply to the input.
- output_dims : `Union[int, Sequence[int]]`
  The output dimension of each of the maxout layers. If this is a single `int`, we use it for all maxout layers. If it is a `Sequence[int]`, `len(output_dims)` must be `num_layers`.
- pool_sizes : `Union[int, Sequence[int]]`
  The size of the max-pools. If this is a single `int`, we use it for all maxout layers. If it is a `Sequence[int]`, `len(pool_sizes)` must be `num_layers`.
- dropout : `Union[float, Sequence[float]]`, optional (default = `0.0`)
  If given, we will apply this amount of dropout after each layer. The semantics of `float` versus `Sequence[float]` are the same as for the other parameters.
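To make the `output_dims` / `pool_sizes` semantics concrete, here is a minimal sketch of what a single maxout layer computes. This is an illustration only, not the AllenNLP implementation; the standalone `maxout_layer` function and its tensor names are hypothetical.

```python
import torch

def maxout_layer(inputs: torch.Tensor,
                 weight: torch.Tensor,
                 bias: torch.Tensor,
                 output_dim: int,
                 pool_size: int) -> torch.Tensor:
    # inputs: (batch_size, input_dim)
    # weight: (input_dim, output_dim * pool_size), bias: (output_dim * pool_size,)
    projected = inputs @ weight + bias
    # Group the projections into output_dim pools of size pool_size
    # and keep the maximum activation within each pool.
    pooled, _ = projected.view(-1, output_dim, pool_size).max(dim=-1)
    return pooled  # (batch_size, output_dim)
```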
get_output_dim¶
class Maxout(torch.nn.Module, FromParams):
| ...
| def get_output_dim(self)
get_input_dim¶
class Maxout(torch.nn.Module, FromParams):
| ...
| def get_input_dim(self)
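A brief usage sketch for these accessors, assuming (as the constructor documentation above suggests) that `get_input_dim()` returns the `input_dim` passed to the constructor and `get_output_dim()` returns the output dimension of the final layer:

```python
from allennlp.modules.maxout import Maxout

maxout = Maxout(input_dim=300, num_layers=2, output_dims=[200, 100], pool_sizes=3)

maxout.get_input_dim()   # expected: 300
maxout.get_output_dim()  # expected: 100, the last entry of output_dims
```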
forward¶
class Maxout(torch.nn.Module, FromParams):
| ...
| def forward(self, inputs: torch.Tensor) -> torch.Tensor
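A hedged end-to-end sketch of calling `forward` (via the module's `__call__`), assuming inputs of shape `(batch_size, input_dim)` as documented above and outputs of shape `(batch_size, output_dims[-1])`:

```python
import torch
from allennlp.modules.maxout import Maxout

maxout = Maxout(input_dim=300, num_layers=2, output_dims=[200, 100],
                pool_sizes=4, dropout=[0.2, 0.0])

inputs = torch.randn(32, 300)   # (batch_size, input_dim)
outputs = maxout(inputs)        # dispatches to forward
print(outputs.shape)            # expected: torch.Size([32, 100])
```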