PrecisionPlugin¶
- class lightning.pytorch.plugins.precision.PrecisionPlugin[source]¶
- Bases: lightning.fabric.plugins.precision.precision.Precision, lightning.pytorch.core.hooks.CheckpointHooks

Base class for all plugins handling the precision-specific parts of the training.

The class attribute precision must be overwritten in child classes. The default value reflects fp32 training.

- backward(tensor, model, optimizer, *args, **kwargs)[source]¶
Performs the actual backpropagation.

- Parameters

- tensor¶ (Tensor) – the tensor on which to run backpropagation
- model¶ (LightningModule) – the model to be optimized
- optimizer¶ (Optional[Steppable]) – current optimizer being used. None if using manual optimization.
- *args¶ – Positional arguments intended for the actual function that performs the backward, like backward().
- **kwargs¶ – Keyword arguments for the same purpose as *args.
 
- Return type

None
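A subclass typically does its precision-specific work (loss scaling, autocast bookkeeping, and so on) inside backward() and then delegates to the tensor's own backward. The sketch below mimics that pattern; FakeTensor and LossScalingPrecision are simplified, hypothetical stand-ins, not the real Lightning or torch types:

```python
class FakeTensor:
    """Illustrative stand-in for torch.Tensor that records backward calls."""

    def __init__(self, value):
        self.value = value
        self.backward_called = False

    def backward(self, *args, **kwargs):
        self.backward_called = True


class LossScalingPrecision:
    """Hypothetical plugin: scales the loss (as a mixed-precision plugin
    might) before delegating to the tensor's own backward()."""

    def __init__(self, scale=2.0):
        self.scale = scale

    def backward(self, tensor, model, optimizer, *args, **kwargs):
        tensor.value *= self.scale          # precision-specific work
        tensor.backward(*args, **kwargs)    # delegate the actual backprop


loss = FakeTensor(1.5)
LossScalingPrecision().backward(loss, model=None, optimizer=None)
print(loss.value, loss.backward_called)  # 3.0 True
```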
- clip_gradients(optimizer, clip_val=0.0, gradient_clip_algorithm=GradClipAlgorithmType.NORM)[source]¶

Clips the gradients.

- Return type

None
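With the default GradClipAlgorithmType.NORM, gradients are rescaled so their combined L2 norm does not exceed clip_val (the same rule as torch.nn.utils.clip_grad_norm_). A plain-Python sketch of that rescaling rule, using a flat list of gradient values as a stand-in for real parameter gradients:

```python
import math


def clip_grad_norm(grads, clip_val):
    """Rescale gradient values so their global L2 norm is at most clip_val."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > clip_val > 0:
        factor = clip_val / total_norm
        grads = [g * factor for g in grads]
    return grads


clipped = clip_grad_norm([3.0, 4.0], clip_val=1.0)  # norm shrinks from 5.0 to 1.0
print(clipped)
```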
- connect(model, optimizers, lr_schedulers)[source]¶

Connects this plugin to the accelerator and the training process.
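In the base class this hook is effectively a pass-through that hands back the model, optimizers, and schedulers it was given; subclasses may wrap or replace them here. A minimal sketch of that pass-through behaviour (PassThroughPrecision is an illustrative stand-in, not the real class):

```python
class PassThroughPrecision:
    """Sketch: base-class connect() returns its inputs unchanged."""

    def connect(self, model, optimizers, lr_schedulers):
        # A subclass could wrap the model or optimizers here before returning.
        return model, optimizers, lr_schedulers


m, opts, scheds = PassThroughPrecision().connect("model", ["opt"], [])
print(m, opts, scheds)  # model ['opt'] []
```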
- optimizer_step(optimizer, model, closure, **kwargs)[source]¶

Hook to run the optimizer step.

- Return type

Any
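The training step is handed to this hook wrapped in a closure, in the spirit of torch.optim.Optimizer.step(closure), so the plugin controls when (or whether) the forward/backward actually runs, e.g. to skip a step on inf/NaN gradients. A stand-in showing that control flow (SketchPrecision and FakeOptimizer are hypothetical):

```python
class SketchPrecision:
    """Illustrative plugin: run the closure, then step the optimizer."""

    def optimizer_step(self, optimizer, model, closure, **kwargs):
        loss = closure()   # runs the wrapped forward + backward
        optimizer.step()   # a plugin could skip this on bad gradients
        return loss


class FakeOptimizer:
    """Stand-in optimizer that only counts its steps."""

    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


opt = FakeOptimizer()
result = SketchPrecision().optimizer_step(opt, model=None, closure=lambda: 0.25)
print(result, opt.steps)  # 0.25 1
```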
- post_backward(tensor, module)[source]¶

Runs after the precision plugin executes backward.

- Parameters

- tensor¶ (Tensor) – The tensor that was used for backpropagation
- module¶ (LightningModule) – The module that was involved in producing the tensor and whose parameters need the gradients

- Return type

None
- pre_backward(tensor, module)[source]¶

Runs before the precision plugin executes backward.

- Parameters

- tensor¶ (Tensor) – The tensor that will be used for backpropagation
- module¶ (LightningModule) – The module that was involved in producing the tensor and whose parameters need the gradients

- Return type

None
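pre_backward() and post_backward() bracket the backward() call, firing before and after it respectively. The stand-in below traces that ordering; run_backward is a hypothetical driver mimicking what the surrounding trainer machinery does, not a real Lightning method:

```python
class TracingPrecision:
    """Illustrative plugin that records the order its hooks fire in."""

    def __init__(self):
        self.calls = []

    def pre_backward(self, tensor, module):
        self.calls.append("pre_backward")

    def backward(self, tensor, model, optimizer, *args, **kwargs):
        self.calls.append("backward")

    def post_backward(self, tensor, module):
        self.calls.append("post_backward")

    def run_backward(self, tensor, module, optimizer):
        # Hypothetical driver: the trainer invokes the hooks in this order.
        self.pre_backward(tensor, module)
        self.backward(tensor, module, optimizer)
        self.post_backward(tensor, module)


plugin = TracingPrecision()
plugin.run_backward(tensor=None, module=None, optimizer=None)
print(plugin.calls)  # ['pre_backward', 'backward', 'post_backward']
```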