allennlp.training.scheduler

class allennlp.training.scheduler.Scheduler(optimizer: torch.optim.optimizer.Optimizer, param_group_field: str, last_epoch: int = -1)
    Bases: object
    A Scheduler is a generalization of PyTorch learning rate schedulers. A scheduler can be used to update any field in an optimizer's parameter groups, not just the learning rate.

    During training with the AllenNLP Trainer, this is the API and calling sequence for step and step_batch:

        scheduler = ...  # creates scheduler; calls self.step(epoch=-1) in __init__
        batch_num_total = 0
        for epoch in range(num_epochs):
            for batch in batches_in_epoch:
                # compute loss, update parameters with current learning rates
                # call step_batch AFTER updating parameters
                batch_num_total += 1
                scheduler.step_batch(batch_num_total)
            # call step() at the END of each epoch
            scheduler.step(validation_metrics, epoch)