DeviceStatsMonitor
- class pytorch_lightning.callbacks.DeviceStatsMonitor(cpu_stats=None)
  Bases: pytorch_lightning.callbacks.callback.Callback

  Automatically monitors and logs device stats during the training stage.

  DeviceStatsMonitor is a special callback: it requires a logger to be passed as an argument to the Trainer.

  - Parameters:
    cpu_stats (Optional[bool]) – If None, it will log CPU stats only if the accelerator is CPU; until v1.9.0 it will raise a warning if psutil is not installed. If True, it will log CPU stats regardless of the accelerator, and it will raise an exception if psutil is not installed. If False, it will not log CPU stats regardless of the accelerator.

  - Raises:
    MisconfigurationException – If Trainer has no logger.
Example

>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import DeviceStatsMonitor
>>> device_stats = DeviceStatsMonitor()
>>> trainer = Trainer(callbacks=[device_stats])
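Since DeviceStatsMonitor raises a MisconfigurationException when the Trainer has no logger, a complete run needs a logger attached. Below is a minimal sketch, assuming a CSVLogger and cpu_stats=True (which needs psutil installed); the DemoModel name and the toy data are illustrative, not part of Lightning.

import torch
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl
from pytorch_lightning.callbacks import DeviceStatsMonitor
from pytorch_lightning.loggers import CSVLogger


class DemoModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


train_loader = DataLoader(
    TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,))), batch_size=8
)

trainer = pl.Trainer(
    max_epochs=1,
    logger=CSVLogger("logs"),  # a logger is required, otherwise MisconfigurationException is raised
    callbacks=[DeviceStatsMonitor(cpu_stats=True)],  # log CPU stats on any accelerator (requires psutil)
)
trainer.fit(DemoModel(), train_loader)

The device stats then appear in the attached logger alongside the regular training metrics.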
- on_train_batch_end(trainer, pl_module, outputs, batch, batch_idx)
  Called when the train batch ends.

  Return type: None

  Note
  The value outputs["loss"] here will be the normalized value w.r.t. accumulate_grad_batches of the loss returned from training_step.