ModelIO
- class pytorch_lightning.core.saving.ModelIO
  Bases: object
- classmethod load_from_checkpoint(checkpoint_path, map_location=None, hparams_file=None, strict=True, **kwargs)
Primary way of loading a model from a checkpoint. When Lightning saves a checkpoint, it stores the arguments passed to __init__ in the checkpoint under "hyper_parameters". Any arguments specified through **kwargs will override the values stored in "hyper_parameters".
- Parameters:
  - checkpoint_path (Union[str, Path, IO]) – Path to the checkpoint. This can also be a URL or a file-like object.
  - map_location (Union[device, str, int, Callable[[Union[device, str, int]], Union[device, str, int]], Dict[Union[device, str, int], Union[device, str, int]], None]) – If your checkpoint saved a GPU model and you now load on CPUs or a different number of GPUs, use this to map to the new setup. The behaviour is the same as in torch.load().
  - hparams_file (Union[str, Path, None]) – Optional path to a .yaml or .csv file with hierarchical structure as in this example:

        drop_prob: 0.2
        dataloader:
          batch_size: 32

    You most likely won’t need this since Lightning will always save the hyperparameters to the checkpoint. However, if your checkpoint weights don’t have the hyperparameters saved, use this argument to pass in a .yaml file with the hparams you’d like to use. These will be converted into a dict and passed into your LightningModule for use. If your model’s hparams argument is Namespace and the .yaml file has hierarchical structure, you need to refactor your model to treat hparams as dict.
  - strict (bool) – Whether to strictly enforce that the keys in checkpoint_path match the keys returned by this module’s state dict. A sketch of loading with strict=False follows the example below.
  - **kwargs (Any) – Any extra keyword args needed to init the model. Can also be used to override saved hyperparameter values.
- Return type:
  Self
- Returns:
  LightningModule instance with loaded weights and hyperparameters (if available).
Note
load_from_checkpoint is a class method. Call it on your LightningModule class rather than on a LightningModule instance.

Example:
# load weights without mapping ...
model = MyLightningModule.load_from_checkpoint('path/to/checkpoint.ckpt')

# or load weights mapping all weights from GPU 1 to GPU 0 ...
map_location = {'cuda:1': 'cuda:0'}
model = MyLightningModule.load_from_checkpoint(
    'path/to/checkpoint.ckpt',
    map_location=map_location
)

# or load weights and hyperparameters from separate files.
model = MyLightningModule.load_from_checkpoint(
    'path/to/checkpoint.ckpt',
    hparams_file='/path/to/hparams_file.yaml'
)

# override some of the params with new values
model = MyLightningModule.load_from_checkpoint(
    PATH,
    num_layers=128,
    pretrained_ckpt_path=NEW_PATH,
)

# predict
pretrained_model.eval()
pretrained_model.freeze()
y_hat = pretrained_model(x)
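As a rough end-to-end sketch of the hyperparameter mechanism described above: save_hyperparameters() records the __init__ arguments under "hyper_parameters" in every checkpoint, so load_from_checkpoint can rebuild the module without you passing them again, and keyword arguments override the stored values. The class name, layer sizes, and checkpoint path below are illustrative, not part of the API.

import torch
import pytorch_lightning as pl


class LitClassifier(pl.LightningModule):
    # Hypothetical module; only the parts relevant to checkpoint loading are shown.
    def __init__(self, hidden_dim=64, learning_rate=1e-3):
        super().__init__()
        # Record hidden_dim and learning_rate under "hyper_parameters" in every
        # checkpoint this module produces, so load_from_checkpoint can rebuild it.
        self.save_hyperparameters()
        self.layer = torch.nn.Linear(32, self.hparams.hidden_dim)

    def forward(self, x):
        return self.layer(x)


# rebuild the model from the stored hyperparameters and weights
# (the checkpoint path is a placeholder)
model = LitClassifier.load_from_checkpoint('path/to/checkpoint.ckpt')
print(model.hparams.hidden_dim)  # value restored from the checkpoint

# keyword arguments override values stored under "hyper_parameters"
model = LitClassifier.load_from_checkpoint('path/to/checkpoint.ckpt', learning_rate=5e-4)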
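And a minimal sketch of the strict flag, reusing the illustrative class and path from the sketch above: with the default strict=True, a mismatch between the checkpoint's state-dict keys and the module's raises an error; with strict=False, the overlapping weights are loaded and parameters missing from the checkpoint keep their freshly initialized values.

# suppose LitClassifier has since gained a parameter that old checkpoints lack;
# strict=False loads whatever keys match instead of raising on the mismatch
model = LitClassifier.load_from_checkpoint('path/to/checkpoint.ckpt', strict=False)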