XLACheckpointIO

class lightning_fabric.plugins.io.xla.XLACheckpointIO(*args, **kwargs)[source]

Bases: lightning_fabric.plugins.io.torch_io.TorchCheckpointIO

CheckpointIO that utilizes xm.save() to save checkpoints for TPU training strategies.

save_checkpoint(checkpoint, path, storage_options=None)[source]

Save model/training states as a checkpoint file through state-dump and file-write.

Parameters
  • checkpoint (Dict[str, Any]) – dict containing model and trainer state

  • path (Union[str, Path]) – write-target path

  • storage_options (Optional[Any]) – not used in XLACheckpointIO.save_checkpoint

Raises

TypeError – If the storage_options argument is passed, since it is not supported by this plugin

Return type

None
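The contract above (reject storage_options, then dump the state dict to the target path) can be illustrated with a small dependency-free stand-in; this is a sketch of the documented behavior, not the real implementation, which delegates the actual write to xm.save() and requires a TPU-enabled torch_xla environment:

```python
from pathlib import Path
from typing import Any, Dict, Optional, Union


class SketchXLACheckpointIO:
    """Illustrative stand-in mirroring XLACheckpointIO.save_checkpoint's contract."""

    def save_checkpoint(
        self,
        checkpoint: Dict[str, Any],
        path: Union[str, Path],
        storage_options: Optional[Any] = None,
    ) -> None:
        # As documented above, storage_options is not used and passing it raises.
        if storage_options is not None:
            raise TypeError(
                "'storage_options' is not supported by XLACheckpointIO.save_checkpoint"
            )
        # The real class calls xm.save(checkpoint, path); this sketch only
        # creates the write target so the example runs without torch_xla.
        Path(path).parent.mkdir(parents=True, exist_ok=True)
        Path(path).touch()


io = SketchXLACheckpointIO()
io.save_checkpoint({"state_dict": {}}, "/tmp/ckpt/model.ckpt")  # succeeds
try:
    io.save_checkpoint({"state_dict": {}}, "/tmp/ckpt/model.ckpt", storage_options={})
except TypeError:
    pass  # storage_options is rejected, matching the Raises section above
```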


© Copyright (c) 2018-2023, Lightning AI et al.