TPUAccelerator

class pytorch_lightning.accelerators.TPUAccelerator[source]

Bases: pytorch_lightning.accelerators.accelerator.Accelerator

Accelerator for TPU devices.

static auto_device_count()[source]

Get the number of devices when set to auto.

Return type:

int

get_device_stats(device)[source]

Gets stats for the given TPU device.

Parameters:

device (Union[str, device]) – TPU device for which to get stats

Return type:

Dict[str, Any]

Returns:

A dictionary mapping the metrics (free memory and peak memory) to their values.
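The documentation above specifies only that the returned dictionary maps memory metrics to values. As an illustration of that shape (the key names below are assumptions for the sketch, not the library's guaranteed keys):

```python
from typing import Dict, Any

def example_tpu_stats() -> Dict[str, Any]:
    """Illustrative sketch of a TPU stats dictionary.

    The keys shown here are hypothetical; the only contract stated
    in the docs is a Dict[str, Any] covering free and peak memory.
    """
    return {
        "avg. free memory (MB)": 12000.0,  # free device memory
        "avg. peak memory (MB)": 4000.0,   # peak memory used so far
    }
```

A logger consuming this dictionary would iterate its items and record each metric by name, which is why the mapping form (rather than a tuple) is used.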

static get_parallel_devices(devices)[source]

Gets parallel devices for the Accelerator.

Return type:

List[int]
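As a sketch of this contract (a device count in, a list of device indices out), assuming the simplest possible mapping; this illustrates the `List[int]` return type only, not the library's actual implementation:

```python
from typing import List, Union

def parallel_devices_sketch(devices: Union[int, List[int]]) -> List[int]:
    """Illustrative only: expand a device count into a list of
    device indices, or pass an explicit index list through."""
    if isinstance(devices, int):
        return list(range(devices))
    return devices

# parallel_devices_sketch(8) -> [0, 1, 2, 3, 4, 5, 6, 7]
```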

static is_available()[source]

Detect if the hardware is available.

Return type:

bool

static parse_devices(devices)[source]

Accelerator device parsing logic.

Return type:

Union[List[int], int, None]
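To make the `Union[List[int], int, None]` return type concrete, here is a hedged sketch of what device parsing could look like; it mirrors the signature only and is not the library's actual logic:

```python
from typing import List, Optional, Union

def parse_devices_sketch(
    devices: Union[int, str, List[int], None]
) -> Optional[Union[List[int], int]]:
    """Illustrative parsing: None passes through, a string count is
    converted to an int, and an int count or explicit index list is
    returned unchanged."""
    if devices is None:
        return None
    if isinstance(devices, str):
        return int(devices)
    return devices
```

Returning `None` lets the caller fall back to a default device selection, while an `int` requests that many devices and a `List[int]` pins specific device indices.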