HPUPrecisionPlugin
class pytorch_lightning.plugins.precision.HPUPrecisionPlugin(precision, opt_level='O2', bf16_file_path=None, fp32_file_path=None, verbose=False)
Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin
Plugin that enables bfloat16/half precision support on Habana HPUs.
Parameters
precision (Union[str, int]) – The precision to use.
opt_level (str) – The optimization level to use for hmp (Habana Mixed Precision).
bf16_file_path (Optional[str]) – Path to the bf16 ops list used in hmp O1 mode.
fp32_file_path (Optional[str]) – Path to the fp32 ops list used in hmp O1 mode.
verbose (bool) – Enable verbose mode for hmp.
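A minimal sketch of O1-mode usage. It assumes the hmp ops-list file format is plain text with one op name per line; the op names below and the file names ops_bf16.txt / ops_fp32.txt are illustrative, not an official list. The Trainer instantiation is shown in comments because it requires an HPU machine with the Habana software stack installed.

```python
from pathlib import Path

# Hypothetical op lists: which ops to run in bfloat16 and which to keep in fp32.
bf16_ops = ["add", "mm", "relu"]
fp32_ops = ["softmax", "cross_entropy"]

# Write one op name per line, the format hmp expects for its ops-list files.
Path("ops_bf16.txt").write_text("\n".join(bf16_ops) + "\n")
Path("ops_fp32.txt").write_text("\n".join(fp32_ops) + "\n")

# With the files in place, the plugin can be passed to a Trainer
# (requires HPU hardware; shown here as a sketch only):
#
# from pytorch_lightning import Trainer
# from pytorch_lightning.plugins import HPUPrecisionPlugin
#
# trainer = Trainer(
#     accelerator="hpu",
#     devices=1,
#     plugins=[
#         HPUPrecisionPlugin(
#             precision=16,
#             opt_level="O1",
#             bf16_file_path="ops_bf16.txt",
#             fp32_file_path="ops_fp32.txt",
#             verbose=True,
#         )
#     ],
# )
```

In O2 mode (the default) no ops-list files are needed; hmp applies a predefined policy instead.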