TPUBf16PrecisionPlugin

class pytorch_lightning.plugins.precision.TPUBf16PrecisionPlugin[source]

Bases: pytorch_lightning.plugins.precision.tpu.TPUPrecisionPlugin

Plugin that enables bfloat16 precision on TPUs.

connect(model, optimizers, lr_schedulers)[source]

Connects this plugin to the accelerator and the training process.

Return type

Tuple[Module, List[Optimizer], List[Any]]

teardown()[source]

This method is called to tear down the training process.

It is the right place to release memory and free other resources.

Return type

None
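To illustrate the connect/teardown lifecycle documented above, here is a minimal, hedged sketch of how a TPU bfloat16 plugin can work. It assumes (as torch_xla documents) that setting the `XLA_USE_BF16` environment variable makes XLA map float32 tensors to bfloat16 on TPU; the class name `TPUBf16Sketch` is illustrative, not the library's actual implementation.

```python
import os
from typing import Any, List, Tuple


class TPUBf16Sketch:
    """Illustrative sketch of a TPU bfloat16 precision plugin lifecycle.

    Assumption: bfloat16 on TPU is enabled via the XLA_USE_BF16
    environment variable, which torch_xla reads at tensor-creation time.
    """

    def connect(
        self, model: Any, optimizers: List[Any], lr_schedulers: List[Any]
    ) -> Tuple[Any, List[Any], List[Any]]:
        # Enable bfloat16 before the training process starts, then hand
        # model, optimizers, and schedulers back unchanged.
        os.environ["XLA_USE_BF16"] = "1"
        return model, optimizers, lr_schedulers

    def teardown(self) -> None:
        # Release the environment variable so later runs in the same
        # process are not silently forced into bfloat16.
        os.environ.pop("XLA_USE_BF16", None)
```

In typical use you would not call these methods yourself; the Trainer invokes `connect` when training starts and `teardown` when it ends.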
