HPUPrecisionPlugin
- class pytorch_lightning.plugins.precision.HPUPrecisionPlugin(precision, opt_level='O2', bf16_file_path=None, fp32_file_path=None, verbose=False)[source]
Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin
Plugin that enables bfloat16/half-precision support on HPUs.
- Parameters
  - precision (Union[Literal[32, 16], Literal['32', '16', 'bf16']]) – The precision to use.
  - opt_level (str) – Choose optimization level for hmp.
  - bf16_file_path (Optional[str]) – Path to bf16 ops list in hmp O1 mode.
  - fp32_file_path (Optional[str]) – Path to fp32 ops list in hmp O1 mode.
  - verbose (bool) – Enable verbose mode for hmp.
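A minimal usage sketch, assuming a Habana HPU environment with pytorch_lightning and the Habana packages installed; the op-list file names are hypothetical placeholders:

```python
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import HPUPrecisionPlugin

# Sketch: run in bf16 mixed precision with hmp optimization level O1,
# where user-supplied op lists control which ops run in bf16 vs. fp32.
plugin = HPUPrecisionPlugin(
    precision="bf16",
    opt_level="O1",
    bf16_file_path="ops_bf16.txt",  # hypothetical bf16 ops list file
    fp32_file_path="ops_fp32.txt",  # hypothetical fp32 ops list file
    verbose=False,
)

trainer = Trainer(accelerator="hpu", devices=1, plugins=[plugin])
```

With opt_level='O2' (the default), hmp applies a predefined policy and the op-list files are not needed; O1 is the mode in which the bf16/fp32 op lists take effect.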