XLACheckpointIO
class pytorch_lightning.plugins.io.XLACheckpointIO [source]
Bases: pytorch_lightning.plugins.io.torch_plugin.TorchCheckpointIO
CheckpointIO that uses xm.save() to save checkpoints for TPU training strategies.
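A minimal usage sketch (assuming a TPU environment with torch_xla installed and the import path shown in this page): the plugin is passed to the Trainer so that checkpoint writes go through xm.save().

    import pytorch_lightning as pl
    from pytorch_lightning.plugins.io import XLACheckpointIO

    # Sketch: accelerator/devices values are illustrative and require TPU hardware.
    trainer = pl.Trainer(
        accelerator="tpu",
        devices=8,
        plugins=[XLACheckpointIO()],
    )
    # trainer.fit(model) would now persist checkpoints via xm.save().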
save_checkpoint(checkpoint, path, storage_options=None) [source]
Save model/training states as a checkpoint file through state-dump and file-write.
Parameters
checkpoint (Dict[str, Any]) – dict containing model and trainer state
path (Union[str, Path]) – write-target path
storage_options (Optional[Any]) – Optional parameters when saving the model/training states.
Return type
None
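For illustration, a direct call might look like the sketch below; it assumes a torch_xla/TPU environment, and the checkpoint contents and file path are hypothetical.

    from pathlib import Path
    from pytorch_lightning.plugins.io import XLACheckpointIO

    checkpoint_io = XLACheckpointIO()
    # "checkpoint" is a plain dict of model/trainer state; the keys here are illustrative.
    checkpoint = {"state_dict": {}, "epoch": 0}
    checkpoint_io.save_checkpoint(checkpoint, Path("last.ckpt"))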