allennlp.training.scheduler
Scheduler#
class Scheduler:
 | def __init__(
 |     self,
 |     optimizer: torch.optim.Optimizer,
 |     param_group_field: str,
 |     last_epoch: int = -1
 | ) -> None
A Scheduler is a generalization of PyTorch learning rate schedulers. A scheduler can be used to update any field in an optimizer's parameter groups, not just the learning rate. During training with the AllenNLP Trainer, this is the API and calling sequence for step and step_batch:
scheduler = ...  # creates scheduler, calls self.step(last_epoch=-1) in __init__

batch_num_total = 0
for epoch in range(num_epochs):
    for batch in batches_in_epoch:
        # compute loss, update parameters with current learning rates
        # call step_batch AFTER updating parameters
        batch_num_total += 1
        scheduler.step_batch(batch_num_total)
    # call step() at the END of each epoch
    scheduler.step(validation_metrics)
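Because the field to update is given by param_group_field, a subclass can schedule something other than the learning rate. The following is a minimal sketch, not part of the library: the class name and decay rule are invented, and it assumes the base class records the initial field values in self.base_values and the epoch counter in self.last_epoch.

import torch
from allennlp.training.scheduler import Scheduler

class MomentumDecay(Scheduler):
    # Hypothetical scheduler: multiplies each parameter group's momentum
    # by `decay` once per epoch.
    def __init__(self, optimizer: torch.optim.Optimizer,
                 decay: float = 0.9, last_epoch: int = -1) -> None:
        self._decay = decay
        # "momentum" is the optimizer param-group field this scheduler manages.
        super().__init__(optimizer, "momentum", last_epoch)

    def get_values(self):
        # Assumes self.base_values / self.last_epoch are set by the base class.
        return [base * self._decay ** max(self.last_epoch, 0)
                for base in self.base_values]

This only makes sense for optimizers whose parameter groups actually contain the field, e.g. torch.optim.SGD constructed with momentum=0.9.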
state_dict#
class Scheduler:
| ...
| def state_dict(self) -> Dict[str, Any]
Returns the state of the scheduler as a dict.
load_state_dict#
class Scheduler:
| ...
| def load_state_dict(self, state_dict: Dict[str, Any]) -> None
Load the scheduler's state.

Parameters

- state_dict : Dict[str, Any]
  Scheduler state. Should be an object returned from a call to state_dict.
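Together, state_dict and load_state_dict support the usual PyTorch checkpointing pattern. A minimal sketch, assuming model, optimizer, and scheduler are already constructed (the file name is illustrative):

import torch

# Save the scheduler's state alongside the model and optimizer.
torch.save(
    {
        "model": model.state_dict(),
        "optimizer": optimizer.state_dict(),
        "scheduler": scheduler.state_dict(),
    },
    "training_state.th",
)

# Later, to resume training:
checkpoint = torch.load("training_state.th")
model.load_state_dict(checkpoint["model"])
optimizer.load_state_dict(checkpoint["optimizer"])
scheduler.load_state_dict(checkpoint["scheduler"])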
get_values#
class Scheduler:
| ...
| def get_values(self)
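Returns the new value of param_group_field for each of the optimizer's parameter groups. The base class leaves this unimplemented, so concrete schedulers are expected to override it.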
step#
class Scheduler:
| ...
| def step(self, metric: float = None) -> None
step_batch#
class Scheduler:
| ...
| def step_batch(self, batch_num_total: int = None) -> None
By default, a scheduler is assumed to update only every epoch, not every batch, so this does nothing unless it is overridden.
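A scheduler that does update every batch overrides step_batch. Below is a sketch of a hypothetical linear-warmup scheduler; the class name and warmup rule are invented, and it assumes the base class sets self.optimizer and self.base_values (the initial learning rates).

from typing import List
import torch
from allennlp.training.scheduler import Scheduler

class LinearWarmup(Scheduler):
    # Hypothetical scheduler: scales each group's learning rate linearly
    # from zero up to its initial value over the first `warmup_steps`
    # batches, then holds it constant.
    def __init__(self, optimizer: torch.optim.Optimizer,
                 warmup_steps: int, last_epoch: int = -1) -> None:
        self._warmup_steps = warmup_steps
        self._batch_num_total = 0
        super().__init__(optimizer, "lr", last_epoch)

    def step_batch(self, batch_num_total: int = None) -> None:
        if batch_num_total is None:
            self._batch_num_total += 1
        else:
            self._batch_num_total = batch_num_total
        scale = min(1.0, self._batch_num_total / self._warmup_steps)
        # Assumes self.optimizer / self.base_values are set by the base class.
        for group, base_lr in zip(self.optimizer.param_groups, self.base_values):
            group["lr"] = base_lr * scale

    def get_values(self) -> List[float]:
        # Called from the epoch-level step(); return the current values so
        # that step() does not undo the per-batch warmup.
        return [group["lr"] for group in self.optimizer.param_groups]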