lr_monitor¶
Classes

LearningRateMonitor – Automatically monitors and logs the learning rate for learning rate schedulers during training.
Learning Rate Monitor¶
Monitors and logs the learning rate for LR schedulers during training.
- class pytorch_lightning.callbacks.lr_monitor.LearningRateMonitor(logging_interval=None, log_momentum=False)[source]¶
Bases: pytorch_lightning.callbacks.base.Callback

Automatically monitors and logs the learning rate for learning rate schedulers during training.
- Parameters
  - logging_interval¶ (Optional[str]) – set to 'epoch' or 'step' to log the lr of all optimizers at the same interval; set to None to log at the individual interval according to the interval key of each scheduler. Defaults to None.
  - log_momentum¶ (bool) – option to also log the momentum values of the optimizer, if the optimizer has the momentum or betas attribute. Defaults to False.
- Raises
  MisconfigurationException – If logging_interval is none of "step", "epoch", or None.
Example:
    >>> from pytorch_lightning import Trainer
    >>> from pytorch_lightning.callbacks import LearningRateMonitor
    >>> lr_monitor = LearningRateMonitor(logging_interval='step')
    >>> trainer = Trainer(callbacks=[lr_monitor])
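The two constructor arguments compose naturally. A minimal sketch, assuming a LightningModule whose scheduler dict sets an 'interval' key (the model, optimizer values, and StepLR schedule below are illustrative choices, not part of the example above): with logging_interval=None each scheduler is logged at its own interval, and log_momentum=True additionally records the optimizer's momentum.

    import torch
    from pytorch_lightning import Trainer
    from pytorch_lightning.callbacks import LearningRateMonitor

    # With logging_interval=None, each scheduler is logged according to its
    # own 'interval' key; log_momentum=True also records the optimizer's
    # momentum (or betas) values.
    lr_monitor = LearningRateMonitor(logging_interval=None, log_momentum=True)
    trainer = Trainer(callbacks=[lr_monitor])

    # In the LightningModule, the scheduler dict's 'interval' key then
    # controls how often this scheduler's lr is logged:
    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1, momentum=0.9)
        lr_scheduler = {
            'scheduler': torch.optim.lr_scheduler.StepLR(optimizer, step_size=10),
            'interval': 'step',  # logged every step because logging_interval=None
        }
        return [optimizer], [lr_scheduler]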
Logging names are automatically determined based on the optimizer class name. In case of multiple optimizers of the same type, they will be named Adam, Adam-1, etc. If an optimizer has multiple parameter groups, they will be named Adam/pg1, Adam/pg2, etc. To control naming, pass a name keyword in the construction of the learning rate schedulers. A name keyword can also be used for parameter groups in the construction of the optimizer.

Example:
    def configure_optimizers(self):
        optimizer = torch.optim.Adam(...)
        lr_scheduler = {
            'scheduler': torch.optim.lr_scheduler.LambdaLR(optimizer, ...),
            'name': 'my_logging_name'
        }
        return [optimizer], [lr_scheduler]
Example:
    def configure_optimizers(self):
        optimizer = torch.optim.SGD(
            [{
                'params': [p for p in self.parameters()],
                'name': 'my_parameter_group_name'
            }],
            lr=0.1
        )
        lr_scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, ...)
        return [optimizer], [lr_scheduler]
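For completeness, a runnable sketch of the example above; the Linear layer and the 0.95-per-epoch LambdaLR schedule are illustrative placeholders standing in for the elided "...":

    import torch
    import pytorch_lightning as pl
    from pytorch_lightning.callbacks import LearningRateMonitor

    class LitModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)  # illustrative model

        def configure_optimizers(self):
            optimizer = torch.optim.SGD(
                [{'params': self.parameters(), 'name': 'my_parameter_group_name'}],
                lr=0.1
            )
            # Illustrative schedule: decay the lr by 5% per epoch.
            lr_scheduler = torch.optim.lr_scheduler.LambdaLR(
                optimizer, lambda epoch: 0.95 ** epoch
            )
            return [optimizer], [lr_scheduler]

    # The monitor picks up the custom parameter-group name when logging.
    trainer = pl.Trainer(callbacks=[LearningRateMonitor()])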
- on_train_batch_start(trainer, *args, **kwargs)[source]¶
Called when the train batch begins.
- Return type
  None