DeepSpeedPrecisionPlugin
- class pytorch_lightning.plugins.precision.DeepSpeedPrecisionPlugin(precision, amp_type, amp_level=None)[source]

  Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin

  Precision plugin for DeepSpeed integration.
- Parameters:
  - precision (Union[str, int]) – Double precision (64), full precision (32), half precision (16) or bfloat16 precision ("bf16").
  - amp_type (str) – The mixed precision backend to use ("native" or "apex").
  - amp_level (Optional[str]) – The optimization level to use (O1, O2, etc.). By default it will be set to "O2" if amp_type is set to "apex".
- Raises:
  - MisconfigurationException – If using bfloat16 precision and deepspeed < v0.6.
  - ValueError – If an unsupported precision is provided.
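The accepted precision values and the ValueError path can be illustrated with a small pure-Python sketch. This is not Lightning's internal implementation; the constant and function names here are illustrative only:

```python
# Illustrative sketch (not Lightning source): the plugin accepts these
# precision values and raises ValueError for anything else.
SUPPORTED_PRECISION = (64, 32, 16, "bf16")

def check_precision(precision):
    """Return the precision if supported, else raise ValueError."""
    if precision not in SUPPORTED_PRECISION:
        raise ValueError(
            f"Unsupported precision {precision!r}; "
            f"expected one of {SUPPORTED_PRECISION}"
        )
    return precision
```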
- backward(tensor, model, optimizer, optimizer_idx, *args, **kwargs)[source]

  Performs back-propagation using DeepSpeed's engine.
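The delegation pattern behind backward can be sketched with a toy stand-in for the DeepSpeed engine. ToyEngine and plugin_backward are hypothetical names for illustration; the real DeepSpeed engine exposes a backward(loss) method that handles loss scaling and gradient partitioning, which is why the plugin calls it instead of the usual loss.backward():

```python
# Hypothetical sketch of the delegation: DeepSpeed's engine owns
# backward, so the plugin hands the loss tensor to the engine rather
# than calling loss.backward() itself.
class ToyEngine:
    """Stand-in for a DeepSpeed engine wrapping the model."""

    def __init__(self):
        self.backward_called_with = None

    def backward(self, loss):
        # The real engine scales the loss and partitions gradients here.
        self.backward_called_with = loss


def plugin_backward(tensor, model):
    """Mimics DeepSpeedPrecisionPlugin.backward: delegate to the engine."""
    model.backward(tensor)


engine = ToyEngine()
plugin_backward(0.5, engine)
```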