XLAStatsMonitor
- class pytorch_lightning.callbacks.XLAStatsMonitor(verbose=True)[source]
Bases: pytorch_lightning.callbacks.base.Callback

Deprecated since version v1.5: The XLAStatsMonitor callback was deprecated in v1.5 and will be removed in v1.7. Please use the DeviceStatsMonitor callback instead.
Automatically monitors and logs XLA stats during the training stage.
XLAStatsMonitor is a callback; in order to use it you need to assign a logger in the Trainer.

- Parameters
  verbose (bool) – Set to True to print average peak and free memory, and epoch time, every epoch.

- Raises
  MisconfigurationException – If not running on TPUs, or Trainer has no logger.
Example:

>>> from pytorch_lightning import Trainer
>>> from pytorch_lightning.callbacks import XLAStatsMonitor
>>> xla_stats = XLAStatsMonitor()
>>> trainer = Trainer(callbacks=[xla_stats])
- on_train_epoch_end(trainer, pl_module)[source]

Called when the train epoch ends.

To access all batch outputs at the end of the epoch, either:

- Implement training_epoch_end in the LightningModule and access outputs via the module, OR
- Cache data across train batch hooks inside the callback implementation to post-process in this hook.

Return type: None
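The second option above, caching data across train batch hooks and post-processing in on_train_epoch_end, can be sketched as a minimal, framework-free illustration. The hook names and argument order mirror pytorch_lightning's Callback hooks, but the class below does not subclass Callback, and the driver loop standing in for the Trainer is hypothetical:

```python
class BatchCachingCallback:
    """Caches per-batch values in on_train_batch_end and
    post-processes them in on_train_epoch_end."""

    def __init__(self):
        self._batch_losses = []

    def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx):
        # Cache whatever the training step produced for this batch.
        self._batch_losses.append(outputs["loss"])

    def on_train_epoch_end(self, trainer, pl_module):
        # Post-process the cached data, e.g. report the epoch-average loss.
        avg = sum(self._batch_losses) / len(self._batch_losses)
        print(f"epoch average loss: {avg:.4f}")
        self._batch_losses.clear()  # reset the cache for the next epoch


# Hypothetical driver loop standing in for the Trainer:
cb = BatchCachingCallback()
for batch_idx, loss in enumerate([0.9, 0.7, 0.5]):
    cb.on_train_batch_end(trainer=None, pl_module=None,
                          outputs={"loss": loss}, batch=None, batch_idx=batch_idx)
cb.on_train_epoch_end(trainer=None, pl_module=None)  # prints "epoch average loss: 0.7000"
```

In a real Trainer run you would subclass pytorch_lightning.callbacks.Callback and pass the instance via Trainer(callbacks=[...]); the Trainer then invokes these hooks for you.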