class allennlp.training.momentum_schedulers.momentum_scheduler.MomentumScheduler(optimizer: torch.optim.optimizer.Optimizer, last_epoch: int = -1)[source]

Bases: allennlp.common.registrable.Registrable

classmethod from_params(optimizer: torch.optim.optimizer.Optimizer, params: allennlp.common.params.Params)[source]

This is the automatic implementation of from_params. Any class that subclasses FromParams (or Registrable, which itself subclasses FromParams) gets this implementation for free. If you want your class to be instantiated from params in the “obvious” way – pop off parameters and hand them to your constructor with the same names – this provides that functionality.

If you need more complex logic in your from_params method, you’ll have to implement your own method that overrides this one.
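The "obvious" behaviour described above can be sketched without AllenNLP: match keys in a params dict against the constructor's argument names and pop them off. This is a minimal, self-contained illustration, assuming a plain dict in place of allennlp.common.params.Params; it is not AllenNLP's actual implementation.

```python
import inspect


class MomentumScheduler:
    def __init__(self, optimizer, last_epoch: int = -1):
        self.optimizer = optimizer
        self.last_epoch = last_epoch

    @classmethod
    def from_params(cls, optimizer, params: dict):
        # Pop each remaining constructor argument out of `params` by name
        # and hand it to __init__ -- the "obvious" from_params behaviour.
        kwargs = {}
        for name in inspect.signature(cls.__init__).parameters:
            if name in ("self", "optimizer"):
                continue
            if name in params:
                kwargs[name] = params.pop(name)
        return cls(optimizer, **kwargs)


# Usage: the "last_epoch" entry is routed to the constructor argument
# of the same name.
scheduler = MomentumScheduler.from_params(optimizer=None, params={"last_epoch": 3})
```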

get_values(self) → None[source]

class allennlp.training.momentum_schedulers.inverted_triangular.InvertedTriangular(optimizer: torch.optim.optimizer.Optimizer, cool_down: int, warm_up: int, ratio: int = 10, last_epoch: int = -1)[source]


Adjust momentum during training according to an inverted triangle-like schedule.

The momentum starts off high, then decreases linearly for cool_down epochs until it reaches 1/ratio of its original value. The momentum then increases linearly for warm_up epochs until it reaches its original value again. If there are more epochs left to train, the momentum stays flat at the original value.
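The schedule above can be sketched as a pure function of the epoch number. This is an illustrative re-implementation of the described behaviour, assuming a base momentum value is given explicitly; it is not the class's actual source code.

```python
def inverted_triangular_momentum(epoch: int, base_momentum: float,
                                 cool_down: int, warm_up: int,
                                 ratio: int = 10) -> float:
    """Momentum at a given epoch under the inverted-triangular schedule."""
    low = base_momentum / ratio
    if epoch <= cool_down:
        # Linear decrease from base_momentum down to base_momentum / ratio.
        return base_momentum - (base_momentum - low) * (epoch / cool_down)
    elif epoch <= cool_down + warm_up:
        # Linear increase from the low point back to base_momentum.
        return low + (base_momentum - low) * ((epoch - cool_down) / warm_up)
    else:
        # Any remaining epochs stay flat at the original value.
        return base_momentum
```

For example, with base_momentum=0.9, cool_down=5, warm_up=5, and ratio=10, the momentum falls from 0.9 to 0.09 over the first five epochs, climbs back to 0.9 over the next five, and stays at 0.9 thereafter.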