NativeMixedPrecisionPlugin¶
- class pytorch_lightning.plugins.precision.NativeMixedPrecisionPlugin(precision, device, scaler=None)[source]¶
 Bases: pytorch_lightning.plugins.precision.mixed.MixedPrecisionPlugin

 Plugin for Native Mixed Precision (AMP) training with torch.autocast.

- Parameters:
  - precision¶ (Union[str, int]) – Whether to use torch.float16 (16) or torch.bfloat16 ('bf16').
  - device¶ (str) – The device for torch.autocast.
  - scaler¶ (Optional[GradScaler]) – An optional torch.cuda.amp.GradScaler to use.
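When no scaler is passed, AMP training relies on dynamic loss scaling to keep float16 gradients from underflowing. The following is a minimal pure-Python sketch of that scheme; the constants match the defaults of torch.cuda.amp.GradScaler, but the `LossScaler` class itself is illustrative, not the library implementation:

```python
class LossScaler:
    """Illustrative dynamic loss scaler (not torch's GradScaler)."""

    def __init__(self, init_scale=2.0**16, growth_factor=2.0,
                 backoff_factor=0.5, growth_interval=2000):
        self.scale = init_scale          # multiplier applied to the loss
        self.growth_factor = growth_factor
        self.backoff_factor = backoff_factor
        self.growth_interval = growth_interval
        self._good_steps = 0             # consecutive overflow-free steps

    def update(self, found_inf):
        """Adjust the scale after each optimizer step."""
        if found_inf:
            # Overflow: the step was skipped, so shrink the scale.
            self.scale *= self.backoff_factor
            self._good_steps = 0
        else:
            self._good_steps += 1
            if self._good_steps >= self.growth_interval:
                # Long stable stretch: try a larger scale.
                self.scale *= self.growth_factor
                self._good_steps = 0


scaler = LossScaler()
scaler.update(found_inf=True)   # an overflow halves the scale
print(scaler.scale)             # 32768.0
```

Growing the scale only after a long run of overflow-free steps, but backing off immediately on overflow, keeps the scale near the largest value that does not produce inf/nan gradients.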
- load_state_dict(state_dict)[source]¶
 Called when loading a checkpoint; implement to reload the precision plugin state given its state_dict.
- optimizer_step(model, optimizer, optimizer_idx, closure, **kwargs)[source]¶
 Hook to run the optimizer step.
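With AMP, the hook runs the closure (forward pass and backward on the scaled loss) before the scaler applies the optimizer update, skipping the update when inf/nan gradients are found. A hedged sketch of that control flow, using stand-in classes (`FakeOptimizer` and `FakeScaler` are illustrative, not Lightning or torch classes):

```python
class FakeOptimizer:
    """Stand-in optimizer that only counts applied updates."""

    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


class FakeScaler:
    """Stand-in for a gradient scaler's step/update protocol."""

    def __init__(self, found_inf=False):
        self.found_inf = found_inf

    def step(self, optimizer):
        # The update is skipped when gradients overflowed.
        if not self.found_inf:
            optimizer.step()

    def update(self):
        # Here the loss scale would be grown or backed off.
        pass


def optimizer_step(optimizer, closure, scaler):
    result = closure()       # forward + backward on the scaled loss
    scaler.step(optimizer)   # apply the update, or skip it on overflow
    scaler.update()          # adjust the loss scale for the next step
    return result


opt = FakeOptimizer()
loss = optimizer_step(opt, closure=lambda: 0.5, scaler=FakeScaler())
print(loss, opt.steps)   # 0.5 1
```

Running the closure eagerly, rather than passing it through to the optimizer, is what allows the scaler to sit between the backward pass and the parameter update.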
- Return type: