TPUAccelerator
class lightning.pytorch.accelerators.TPUAccelerator(*args, **kwargs) [source]
Bases: lightning.pytorch.accelerators.accelerator.Accelerator
Accelerator for TPU devices.
Warning
Use of this accelerator beyond import and instantiation is experimental.
static auto_device_count() [source]
Get the devices when set to auto.
Return type
int
get_device_stats(device) [source]
Gets stats for the given TPU device.
Parameters
device (Union[device, str, int]) – TPU device for which to get stats.
Return type
Dict[str, Any]
Returns
A dictionary mapping the metrics (free memory and peak memory) to their values.
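As a rough illustration of the return shape described above, the sketch below builds a stats dictionary and formats it for logging. The metric names and values here are illustrative assumptions, not the exact keys a real TPU device would report.

```python
from typing import Any, Dict

# Hypothetical example of the returned mapping; on real hardware the values
# come from the device's memory queries.
example_stats: Dict[str, Any] = {
    "free memory (MB)": 15000.0,
    "peak memory (MB)": 1024.0,
}

def log_device_stats(stats: Dict[str, Any]) -> str:
    # Format each metric as "name=value", sorted by name, for a log line.
    return ", ".join(f"{name}={value}" for name, value in sorted(stats.items()))
```

Such a helper could consume the dictionary returned by `get_device_stats` regardless of which metrics the accelerator reports.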
static get_parallel_devices(devices) [source]
Gets parallel devices for the Accelerator.
Return type
List[int]
static is_available() [source]
Detect if the hardware is available.
Return type
bool
static parse_devices(devices) [source]
Accelerator device parsing logic.
Return type
Union[List[int], int, None]
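To make the return type concrete, here is a minimal, hypothetical sketch of device-parsing logic with the same shape: an `int` (or its string form) requests that many cores, a list selects specific core indices, and `None` defers the choice. This is an assumption-laden illustration, not Lightning's actual implementation.

```python
from typing import List, Optional, Union

def parse_tpu_devices(
    devices: Union[int, str, List[int], None]
) -> Union[List[int], int, None]:
    # None: leave device selection to the caller.
    if devices is None:
        return None
    # A string like "8" is treated as a core count.
    if isinstance(devices, str):
        devices = int(devices)
    # An int means "use this many cores".
    if isinstance(devices, int):
        return devices
    # A list picks specific core indices.
    return [int(d) for d in devices]
```

For example, `parse_tpu_devices("8")` yields `8`, while `parse_tpu_devices([0, 1])` yields `[0, 1]`.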
setup_device(device) [source]
Create and prepare the device for the current process.
Return type
None
teardown() [source]
Clean up any state created by the accelerator.
Return type
None