HPUAccelerator

class lightning.pytorch.accelerators.HPUAccelerator[source]

Bases: Accelerator

Accelerator for HPU devices.

Warning

Use of this accelerator beyond import and instantiation is experimental.
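A minimal usage sketch, assuming a machine with the Habana software stack and lightning installed; the block is guarded so it degrades cleanly where lightning or HPU devices are absent:

```python
# Sketch: selecting the HPU accelerator for a Trainer. The try/except
# and the availability check are defensive guards, not required API.
try:
    from lightning.pytorch import Trainer
    from lightning.pytorch.accelerators import HPUAccelerator

    if HPUAccelerator.is_available():
        # Let the Trainer use every HPU the accelerator can see.
        trainer = Trainer(accelerator="hpu",
                          devices=HPUAccelerator.auto_device_count())
        status = "hpu"
    else:
        status = "no-hpu"
except ImportError:
    status = "no-lightning"

print(status)
```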

static auto_device_count()[source]

Returns the number of HPU devices when devices is set to auto.

Return type

int

static get_device_name()[source]

Returns the name of the HPU device.

Return type

str

get_device_stats(device)[source]

Returns a map of the following metrics with their values:

  • Limit: amount of total memory on the HPU device.

  • InUse: amount of memory currently allocated.

  • MaxInUse: maximum amount of memory allocated so far.

  • NumAllocs: number of allocations.

  • NumFrees: number of freed chunks.

  • ActiveAllocs: number of active allocations.

  • MaxAllocSize: maximum allocated size.

  • TotalSystemAllocs: total number of system allocations.

  • TotalSystemFrees: total number of system frees.

  • TotalActiveAllocs: total number of active allocations.

Return type

Dict[str, Any]
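A hypothetical example of consuming the returned dict. The keys follow the list above; the byte values below are purely illustrative, not taken from real hardware:

```python
# Illustrative stand-in for the dict returned by get_device_stats(device);
# on a real system you would call accelerator.get_device_stats(device).
stats = {
    "Limit": 32 * 1024**3,      # total memory on the HPU device (bytes)
    "InUse": 6 * 1024**3,       # currently allocated memory (bytes)
    "MaxInUse": 8 * 1024**3,    # peak allocated memory (bytes)
    "NumAllocs": 1024,          # number of allocations
    "NumFrees": 980,            # number of freed chunks
    "ActiveAllocs": 44,         # number of active allocations
}

# Derive a simple utilization figure from the raw counters.
utilization = stats["InUse"] / stats["Limit"]
print(f"HPU memory in use: {utilization:.1%}")
```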

static get_parallel_devices(devices)[source]

Gets parallel devices for the Accelerator.

Return type

List[device]

static is_available()[source]

Returns a bool indicating if HPU is currently available.

Return type

bool

static parse_devices(devices)[source]

Accelerator device parsing logic.

Return type

Optional[int]
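The semantics of this kind of device parsing can be sketched in plain Python. The function and parameter names below are hypothetical, chosen only to illustrate the typical auto-vs-explicit resolution; this is not Lightning's actual implementation:

```python
# Hypothetical sketch of "auto"-style device parsing for an accelerator.
from typing import Optional, Union


def parse_hpu_devices(devices: Union[int, str, None],
                      available: int = 8) -> Optional[int]:
    """Resolve a user-supplied devices flag to a concrete device count."""
    if devices in (None, "auto"):
        return available           # use every available device
    count = int(devices)
    if count < 0:
        raise ValueError("device count must be non-negative")
    return min(count, available)   # never exceed what is present


print(parse_hpu_devices("auto"))   # -> 8
print(parse_hpu_devices(4))        # -> 4
```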

setup_device(device)[source]

Raises

MisconfigurationException – If the selected device is not HPU.

Return type

None

teardown()[source]

Clean up any state created by the accelerator.

Return type

None