
TPUBf16Precision

class lightning_fabric.plugins.precision.TPUBf16Precision[source]

Bases: lightning_fabric.plugins.precision.tpu.TPUPrecision

Plugin that enables bfloat16 precision on TPUs.
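A minimal usage sketch, assuming a TPU environment with torch_xla installed; the exact precision string and Fabric arguments shown here are illustrative and may differ between versions:

    from lightning_fabric import Fabric

    # Requesting bf16 precision on the TPU accelerator lets Fabric select
    # the matching precision plugin (TPUBf16Precision) internally.
    fabric = Fabric(accelerator="tpu", precision="bf16")
    fabric.launch()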

convert_input(data)[source]

Convert model inputs (forward) to the floating point precision type of this plugin.

This is a no-op for tensors that are not of floating-point type or already have the desired type.

Return type

Tensor
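A short sketch of the conversion behavior described above, assuming the plugin can be instantiated directly in a torch_xla-enabled environment:

    import torch
    from lightning_fabric.plugins.precision import TPUBf16Precision

    plugin = TPUBf16Precision()

    # A float32 tensor is cast to the plugin's floating-point precision.
    converted = plugin.convert_input(torch.rand(2, 4))

    # Non-floating-point tensors pass through unchanged (no-op).
    unchanged = plugin.convert_input(torch.arange(4))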

teardown()[source]

This method is called to tear down the training process.

It is the right place to release memory and free other resources.

Return type

None

