MixedPrecision
- class lightning_fabric.plugins.precision.MixedPrecision(precision, device, scaler=None)[source]
Bases: lightning_fabric.plugins.precision.precision.Precision

Plugin for Automatic Mixed Precision (AMP) training with torch.autocast.

- Parameters
  - precision (Literal['16', 16, 'bf16']) – Whether to use torch.float16 (16) or torch.bfloat16 ('bf16').
  - device (str) – The device for torch.autocast.
  - scaler (Optional[GradScaler]) – An optional torch.cuda.amp.GradScaler to use.
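A minimal construction sketch, not taken from the docs above: it assumes a CUDA machine and the '16'/'bf16' flags accepted by this version, and hands the plugin to Fabric through its plugins argument (Fabric can also create the plugin itself from its own precision argument).

```python
import torch
from lightning_fabric import Fabric
from lightning_fabric.plugins.precision import MixedPrecision

# float16 autocast on CUDA; the GradScaler rescales the loss so that
# small gradient values do not underflow to zero in float16.
plugin = MixedPrecision(
    precision=16,
    device="cuda",
    scaler=torch.cuda.amp.GradScaler(),
)
fabric = Fabric(accelerator="cuda", plugins=plugin)
```

With 'bf16' the scaler is typically omitted, since bfloat16 keeps the same exponent range as float32 and is far less prone to gradient underflow.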
- convert_input(data)[source]
Convert model inputs (forward) to the floating point precision type of this plugin.
This is a no-op for tensors that are not of floating-point type or already have the desired type.
- Return type
  Any
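A short sketch of the no-op behaviour described above, using the 'bf16' flag on CPU (an assumption made so the example runs without a GPU):

```python
import torch
from lightning_fabric.plugins.precision import MixedPrecision

plugin = MixedPrecision(precision="bf16", device="cpu")

x = torch.randn(4, 8)                 # float32: converted to the desired dtype
y = torch.zeros(4, dtype=torch.long)  # integer: left untouched (no-op)
print(plugin.convert_input(x).dtype)  # torch.bfloat16
print(plugin.convert_input(y).dtype)  # torch.int64
```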
- forward_context()[source]
A context manager for managing model forward/training_step/evaluation_step/predict_step.
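A sketch of entering the context, again assuming 'bf16' on CPU; inside the block, autocast-eligible ops such as torch.nn.Linear compute in the plugin's dtype:

```python
import torch
from lightning_fabric.plugins.precision import MixedPrecision

plugin = MixedPrecision(precision="bf16", device="cpu")
model = torch.nn.Linear(8, 2)

# forward_context() runs the enclosed ops under torch.autocast
# for the plugin's device and dtype.
with plugin.forward_context():
    out = model(torch.randn(4, 8))
print(out.dtype)  # torch.bfloat16 while under autocast
```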