NativeMixedPrecisionPlugin
- class pytorch_lightning.plugins.precision.NativeMixedPrecisionPlugin(precision, device, scaler=None)
- Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin

  Plugin for Native Mixed Precision (AMP) training with torch.autocast. See the usage sketch after the parameter list.

  Parameters:

  - precision (Union[str, int]) – Whether to use torch.float16 (16) or torch.bfloat16 ('bf16').
  - device (str) – The device for torch.autocast.
  - scaler (Optional[GradScaler]) – An optional torch.cuda.amp.GradScaler to use.
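A usage sketch, assuming a CUDA machine and a PyTorch Lightning 1.x release that exposes this class; the custom init_scale is purely illustrative:

```python
import torch
from pytorch_lightning import Trainer
from pytorch_lightning.plugins.precision import NativeMixedPrecisionPlugin

# float16 AMP with an explicitly configured GradScaler (illustrative values)
precision_plugin = NativeMixedPrecisionPlugin(
    precision=16,
    device="cuda",
    scaler=torch.cuda.amp.GradScaler(init_scale=2.0**14),
)
trainer = Trainer(accelerator="gpu", devices=1, plugins=[precision_plugin])

# bfloat16 autocast runs without a gradient scaler
bf16_plugin = NativeMixedPrecisionPlugin(precision="bf16", device="cuda")
```

Passing precision=16 to the Trainer constructs an equivalent plugin automatically; instantiating it by hand is mainly useful for customizing the scaler.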
 
- clip_gradients(optimizer, clip_val=0.0, gradient_clip_algorithm=GradClipAlgorithmType.NORM)
- Clips the gradients.

  Return type: None
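Clipping is normally configured on the Trainer rather than by calling this hook directly; Lightning forwards the settings to this method during the optimizer step. A minimal sketch:

```python
from pytorch_lightning import Trainer

trainer = Trainer(
    accelerator="gpu",
    devices=1,
    precision=16,                    # selects native AMP, i.e. this plugin
    gradient_clip_val=0.5,           # forwarded as clip_val
    gradient_clip_algorithm="norm",  # forwarded as GradClipAlgorithmType.NORM
)
```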
 
- load_state_dict(state_dict)
- Called when loading a checkpoint. Implement to reload precision plugin state given the precision plugin state_dict.
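A sketch of overriding this hook in a subclass; MyAMPPlugin is a hypothetical name, and the body assumes the checkpointed state is the GradScaler's own state_dict:

```python
from typing import Any, Dict

from pytorch_lightning.plugins.precision import NativeMixedPrecisionPlugin

class MyAMPPlugin(NativeMixedPrecisionPlugin):
    def load_state_dict(self, state_dict: Dict[str, Any]) -> None:
        # Restore the loss-scaler state from the checkpoint so that
        # dynamic loss scaling resumes where it left off.
        if self.scaler is not None:
            self.scaler.load_state_dict(state_dict)
```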
- optimizer_step(optimizer, model, optimizer_idx, closure, **kwargs)
- Hook to run the optimizer step.

  Return type: Any
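For orientation, the scaler-driven control flow of this hook can be summarized in a standalone helper. amp_optimizer_step below is hypothetical (not part of the Lightning API), omits the clipping hooks and closure-result checks, and assumes a GradScaler is present; with 'bf16' precision no scaler is used and the step runs unscaled:

```python
from typing import Any, Callable

import torch

def amp_optimizer_step(
    optimizer: torch.optim.Optimizer,
    scaler: torch.cuda.amp.GradScaler,
    closure: Callable[[], Any],
) -> Any:
    closure()                             # forward + backward on the scaled loss
    scaler.unscale_(optimizer)            # unscale so clipping sees true gradients
    step_output = scaler.step(optimizer)  # the step is skipped on inf/nan gradients
    scaler.update()                       # adapt the loss scale for the next step
    return step_output
```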