HPUPrecisionPlugin

class lightning.pytorch.plugins.precision.HPUPrecisionPlugin(precision, opt_level='O2', bf16_file_path=None, fp32_file_path=None, verbose=False)[source]

Bases: lightning.pytorch.plugins.precision.precision_plugin.PrecisionPlugin

Plugin that enables bfloat16/half-precision support on HPUs.

Warning

This is an experimental feature.

Parameters
  • precision (Literal['32-true', '16-mixed', 'bf16-mixed']) – The precision to use.

  • opt_level (str) – Optimization level for hmp (Habana Mixed Precision).

  • bf16_file_path (Optional[str]) – Path to bf16 ops list in hmp O1 mode.

  • fp32_file_path (Optional[str]) – Path to fp32 ops list in hmp O1 mode.

  • verbose (bool) – Enable verbose mode for hmp.
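
A configuration sketch showing how these parameters fit together. This is illustrative only: it requires a Habana Gaudi/HPU environment to actually run, and the ops-list file names are placeholders, not files shipped with the library.

```python
from lightning.pytorch import Trainer
from lightning.pytorch.plugins.precision import HPUPrecisionPlugin

# Default O2 mode: hmp decides automatically which ops run in bf16
# and which stay in fp32.
plugin = HPUPrecisionPlugin(precision="bf16-mixed")

# O1 mode: supply custom ops lists controlling the bf16/fp32 split.
# The paths below are hypothetical examples.
plugin_o1 = HPUPrecisionPlugin(
    precision="bf16-mixed",
    opt_level="O1",
    bf16_file_path="ops_bf16.txt",
    fp32_file_path="ops_fp32.txt",
    verbose=True,
)

# The plugin is passed to the Trainer alongside the HPU accelerator.
trainer = Trainer(accelerator="hpu", devices=1, plugins=[plugin])
```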