SingleTPUStrategy
class pytorch_lightning.strategies.SingleTPUStrategy(device, accelerator=None, checkpoint_io=None, precision_plugin=None, debug=False) [source]
Bases: pytorch_lightning.strategies.single_device.SingleDeviceStrategy
Strategy for training on a single TPU device.
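A minimal usage sketch, assuming a TPU host with torch_xla installed and a recent 1.x release: the Trainer normally selects this strategy itself when a single TPU core is requested, and the explicit construction with core index 0 below is only an illustration.

    import pytorch_lightning as pl
    from pytorch_lightning.strategies import SingleTPUStrategy

    # Automatic route: requesting a single TPU device lets the Trainer
    # pick SingleTPUStrategy on its own.
    trainer = pl.Trainer(accelerator="tpu", devices=1, max_epochs=1)

    # Explicit route: construct the strategy directly; the core index 0
    # is an illustrative placeholder.
    trainer = pl.Trainer(strategy=SingleTPUStrategy(device=0), max_epochs=1)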
setup(trainer) [source]
Sets up plugins for the trainer fit and creates optimizers.
Parameters:
trainer (Trainer) – the trainer instance
Return type:
None
teardown() [source]
This method is called to tear down the training process.
It is the right place to release memory and free other resources.
Return type:
None
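As a sketch of where teardown fits, a subclass could release its own state before delegating to the parent; the ExampleTPUStrategy name and the cached attribute below are hypothetical.

    from pytorch_lightning.strategies import SingleTPUStrategy

    class ExampleTPUStrategy(SingleTPUStrategy):
        def teardown(self) -> None:
            # Hypothetical: drop any references cached by this subclass
            # so the memory can be reclaimed once training finishes.
            self._cached_batches = None
            # Let the base strategy perform its own cleanup.
            super().teardown()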