TPUBf16PrecisionPlugin

class lightning.pytorch.plugins.precision.TPUBf16PrecisionPlugin(*args, **kwargs)[source]

Bases: TPUPrecisionPlugin

Plugin that enables bfloat16 precision on TPUs.
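A minimal usage sketch, assuming a TPU-enabled environment (devices=8 is illustrative). The plugin can be passed explicitly through the Trainer's plugins argument; a Trainer configured with a TPU accelerator and bf16 precision typically selects it automatically.

    from lightning.pytorch import Trainer
    from lightning.pytorch.plugins.precision import TPUBf16PrecisionPlugin

    # Pass the plugin explicitly; devices=8 is illustrative
    trainer = Trainer(
        accelerator="tpu",
        devices=8,
        plugins=[TPUBf16PrecisionPlugin()],
    )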

connect(model, optimizers, lr_schedulers)[source]

Connects this plugin to the accelerator and the training process.

Return type

Tuple[Module, List[Optimizer], List[Any]]
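A sketch of the hook's contract, based only on the signature and return type above. The toy model and optimizer are illustrative; in practice the Trainer invokes connect() itself during setup.

    import torch
    from torch import nn
    from lightning.pytorch.plugins.precision import TPUBf16PrecisionPlugin

    model = nn.Linear(4, 2)  # illustrative model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    plugin = TPUBf16PrecisionPlugin()
    # connect() returns the (possibly modified) model, optimizers, and schedulers
    model, optimizers, lr_schedulers = plugin.connect(model, [optimizer], [])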

teardown()[source]

This method is called to tear down the training process.

It is the right place to release memory and free other resources.

Return type

None
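A sketch of manual lifecycle handling, continuing the connect() example above. The try/finally pattern is illustrative; the Trainer normally calls teardown() as part of its own shutdown.

    plugin = TPUBf16PrecisionPlugin()
    try:
        model, optimizers, lr_schedulers = plugin.connect(model, [optimizer], [])
        # ... run the training loop ...
    finally:
        # Release memory and free other resources held by the plugin
        plugin.teardown()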