NamedParameter = Tuple[str, torch.Tensor]


class MovingAverage(Registrable):
 | def __init__(self, parameters: Iterable[NamedParameter]) -> None

Tracks a moving average of model parameters.


class MovingAverage(Registrable):
 | ...
 | default_implementation = "exponential"


class MovingAverage(Registrable):
 | ...
 | def apply(self, num_updates: Optional[int] = None)

Update the moving averages based on the latest values of the parameters.


class MovingAverage(Registrable):
 | ...
 | def assign_average_value(self) -> None

Replace all the parameter values with the averages. Save the current parameter values to restore later.


class MovingAverage(Registrable):
 | ...
 | def restore(self) -> None

Restore the backed-up (non-average) parameter values.
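
The apply / assign_average_value / restore contract described above can be sketched with a minimal, illustrative re-implementation (this is not AllenNLP's actual code; plain floats stand in for torch.Tensor parameters):

```python
class ToyMovingAverage:
    """Toy stand-in that mirrors the MovingAverage lifecycle."""

    def __init__(self, parameters):
        # parameters: iterable of (name, value) pairs
        self._params = dict(parameters)
        self._shadows = dict(self._params)   # running averages
        self._backups = {}                   # saved "real" values

    def apply(self, decay=0.9):
        # Move each average toward the current parameter value.
        for name, value in self._params.items():
            self._shadows[name] = decay * self._shadows[name] + (1 - decay) * value

    def assign_average_value(self):
        # Back up the current values, then swap in the averages.
        self._backups = dict(self._params)
        self._params.update(self._shadows)

    def restore(self):
        # Put the backed-up (non-average) values back.
        self._params.update(self._backups)
```

In a training loop, one would call `apply()` after each optimizer step, `assign_average_value()` before evaluation, and `restore()` afterwards so training resumes from the non-averaged weights.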


class MovingAverage(Registrable):
 | ...
 | def state_dict(self) -> Dict[str, Any]


class MovingAverage(Registrable):
 | ...
 | def load_state_dict(self, state_dict: Dict[str, Any]) -> None
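
These two methods let the moving-average state survive checkpointing. A hedged sketch of the round trip, using a stub in place of a real MovingAverage so the snippet is self-contained (the dict layout shown here is illustrative, not the library's actual serialization format):

```python
class StubMovingAverage:
    """Stand-in exposing the state_dict / load_state_dict pair."""

    def __init__(self):
        self._shadows = {"w": 0.5}   # pretend running averages

    def state_dict(self):
        # Everything needed to resume: the shadow (average) values.
        return {"shadows": dict(self._shadows)}

    def load_state_dict(self, state_dict):
        self._shadows = dict(state_dict["shadows"])


ema = StubMovingAverage()
# Store alongside the model checkpoint, e.g. via torch.save(checkpoint, path).
checkpoint = {"moving_average": ema.state_dict()}

resumed = StubMovingAverage()
resumed.load_state_dict(checkpoint["moving_average"])
```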


class ExponentialMovingAverage(MovingAverage):
 | def __init__(
 |     self,
 |     parameters: Iterable[NamedParameter],
 |     decay: float = 0.9999,
 |     numerator: float = 1.0,
 |     denominator: float = 10.0
 | ) -> None

Create shadow variables and maintain exponential moving average for model parameters.

Registered as a MovingAverage with name "exponential".


  • parameters : Iterable[Tuple[str, Parameter]]
    The parameters whose averages we'll be tracking. In a typical AllenNLP configuration file, this argument does not get an entry under the "moving_average" key; it gets passed in separately.
  • decay : float, optional (default = 0.9999)
    The decay rate that will be used if num_updates is not passed (and that will be used as an upper bound if num_updates is passed).
  • numerator : float, optional (default = 1.0)
    The numerator used to compute the decay rate if num_updates is passed.
  • denominator : float, optional (default = 10.0)
    The denominator used to compute the decay rate if num_updates is passed.


class ExponentialMovingAverage(MovingAverage):
 | ...
 | def apply(self, num_updates: Optional[int] = None) -> None

Apply the exponential moving average to the named parameters if specified; otherwise it is applied to all the trainable parameters of the model.

The optional num_updates parameter allows one to tweak the decay rate dynamically. If passed, the actual decay rate used is:

`min(decay, (numerator + num_updates) / (denominator + num_updates))`

(This logic is based on the TensorFlow exponential moving average implementation.)
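
The dynamic decay rate can be written out directly; with the defaults (numerator = 1.0, denominator = 10.0), the schedule starts at a small decay early in training and approaches the fixed upper bound as num_updates grows:

```python
def effective_decay(decay=0.9999, numerator=1.0, denominator=10.0, num_updates=None):
    """Decay rate actually used by the update, per the formula above."""
    if num_updates is None:
        # No step count given: use the fixed decay rate.
        return decay
    # Warm-up schedule, capped by the configured decay.
    return min(decay, (numerator + num_updates) / (denominator + num_updates))
```

For example, `effective_decay(num_updates=0)` gives 0.1, so early averages track the raw parameters closely, while after many updates the rate saturates at 0.9999.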