CheckpointIO

class lightning.pytorch.plugins.io.CheckpointIO[source]

Bases: abc.ABC

Interface to save/load checkpoints as they are saved through the Strategy.

Warning

This is an experimental feature.

Most plugins use the Torch-based IO plugin, TorchCheckpointIO, but some may require particular handling depending on the plugin.

In addition, you can provide a custom CheckpointIO by extending this class and passing it to the Trainer, e.g. Trainer(plugins=[MyCustomCheckpointIO()]).

Note

For some plugins, it is not possible to use a custom checkpoint plugin as checkpointing logic is not modifiable.
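As a sketch of what such a custom plugin might look like, the following implements the three abstract methods. The class name is illustrative, and pickle stands in for torch.save/torch.load so the example is self-contained; in real code you would inherit from lightning.pytorch.plugins.io.CheckpointIO:

```python
import os
import pickle
from pathlib import Path
from typing import Any, Dict, Optional, Union


class PickleCheckpointIO:
    """Illustrative stand-in for a CheckpointIO subclass; in real code,
    inherit from lightning.pytorch.plugins.io.CheckpointIO instead."""

    def save_checkpoint(
        self,
        checkpoint: Dict[str, Any],
        path: Union[str, Path],
        storage_options: Optional[Any] = None,
    ) -> None:
        # Dump the model/trainer state dict to the write-target path.
        with open(path, "wb") as f:
            pickle.dump(checkpoint, f)

    def load_checkpoint(
        self, path: Union[str, Path], map_location: Optional[Any] = None
    ) -> Dict[str, Any]:
        # Return the loaded checkpoint dict.
        with open(path, "rb") as f:
            return pickle.load(f)

    def remove_checkpoint(self, path: Union[str, Path]) -> None:
        # Delete the checkpoint file from the filesystem.
        os.remove(path)
```

With Lightning installed, an instance of such a class would be passed to the Trainer via its plugins argument.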

abstract load_checkpoint(path, map_location=None)[source]

Load a checkpoint from a path when resuming, or when loading a checkpoint for the test/validate/predict stages.

Parameters
  • path (Union[str, Path]) – Path to checkpoint

  • map_location (Optional[Any]) – a function, torch.device, string or a dict specifying how to remap storage locations

Returns: The loaded checkpoint.

Return type

Dict[str, Any]

abstract remove_checkpoint(path)[source]

Remove checkpoint file from the filesystem.

Parameters

path (Union[str, Path]) – Path to checkpoint

Return type

None

abstract save_checkpoint(checkpoint, path, storage_options=None)[source]

Save model/training states as a checkpoint file through state-dump and file-write.

Parameters
  • checkpoint (Dict[str, Any]) – dict containing model and trainer state

  • path (Union[str, Path]) – write-target path

  • storage_options (Optional[Any]) – Optional parameters when saving the model/training states.

Return type

None
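Because a crash mid-write would otherwise leave a truncated file behind, a robust save_checkpoint implementation typically writes atomically. The sketch below shows this pattern as a standalone function; pickle stands in for the actual state-dump call, and the function name is illustrative:

```python
import os
import pickle
import tempfile
from pathlib import Path
from typing import Any, Dict, Optional, Union


def save_checkpoint_atomic(
    checkpoint: Dict[str, Any],
    path: Union[str, Path],
    storage_options: Optional[Any] = None,
) -> None:
    """Write the state dict to a temporary file in the target directory,
    then rename it over the final path, so readers never observe a
    partially written checkpoint."""
    path = Path(path)
    fd, tmp = tempfile.mkstemp(dir=path.parent, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            pickle.dump(checkpoint, f)
        os.replace(tmp, path)  # atomic rename within one filesystem
    except BaseException:
        # Clean up the temporary file if the write or rename failed.
        if os.path.exists(tmp):
            os.remove(tmp)
        raise
```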

teardown()[source]

This method is called to tear down the process.

Return type

None
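teardown is the place for a plugin to release any resources it acquired. For example, a hypothetical plugin that offloads checkpoint writes to a background thread could drain and shut down its worker pool there. A minimal sketch, with pickle standing in for the actual state-dump call:

```python
import pickle
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from typing import Any, Dict, Optional, Union


class AsyncWriteCheckpointIO:
    """Hypothetical plugin that offloads file writes to a worker thread;
    teardown() waits for pending writes and releases the thread."""

    def __init__(self) -> None:
        self._pool = ThreadPoolExecutor(max_workers=1)

    def save_checkpoint(
        self,
        checkpoint: Dict[str, Any],
        path: Union[str, Path],
        storage_options: Optional[Any] = None,
    ) -> None:
        # Schedule the write without blocking the training loop.
        self._pool.submit(self._write, checkpoint, Path(path))

    @staticmethod
    def _write(checkpoint: Dict[str, Any], path: Path) -> None:
        with open(path, "wb") as f:
            pickle.dump(checkpoint, f)

    def teardown(self) -> None:
        # Block until queued writes finish, then free the worker thread.
        self._pool.shutdown(wait=True)
```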