FSDPMixedPrecisionPlugin

class lightning.pytorch.plugins.precision.FSDPMixedPrecisionPlugin(precision, device, scaler=None)[source]

Bases: lightning.pytorch.plugins.precision.amp.MixedPrecisionPlugin

Automatic mixed precision (AMP) for Fully Sharded Data Parallel (FSDP) training.

Warning

This is an experimental feature.

clip_grad_by_norm(*_, **__)[source]

Clip gradients by norm.

Return type

None
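To illustrate the operation this hook is named after (this is a minimal stdlib sketch of gradient clipping by global L2 norm, not the plugin's actual implementation, which operates on sharded FSDP parameters), the idea can be shown with plain Python lists standing in for gradient tensors:

```python
import math

def clip_grad_by_norm(grads, max_norm):
    # grads: list of "gradient tensors", each a flat list of floats (illustrative only)
    # Compute the global L2 norm across all gradients
    total_norm = math.sqrt(sum(g * g for tensor in grads for g in tensor))
    if total_norm > max_norm:
        # Rescale every gradient so the global norm equals max_norm
        scale = max_norm / total_norm
        grads = [[g * scale for g in tensor] for tensor in grads]
    return grads
```

With sharded parameters, each rank only holds a slice of the gradients, so the global norm must be reduced across ranks before scaling; that is why FSDP handles clipping itself rather than relying on a per-rank loop like the one above.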

forward_context()[source]

For FSDP, this context manager is a no-op since conversion is already handled internally.

See: https://pytorch.org/docs/stable/fsdp.html for more details on mixed precision.

Return type

Generator[None, None, None]
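Since forward_context() is documented as a no-op with return type Generator[None, None, None], its shape can be sketched as a generator-based context manager that simply yields (a minimal stdlib sketch, not the plugin's source):

```python
from contextlib import contextmanager
from typing import Generator

@contextmanager
def forward_context() -> Generator[None, None, None]:
    # No-op: FSDP already casts parameters and inputs internally according
    # to its mixed-precision configuration, so no autocast wrapper is
    # entered around the forward pass here.
    yield
```

By contrast, the parent MixedPrecisionPlugin generally enters a torch.autocast context here; FSDP skips that because the dtype conversion is configured on the wrapped module itself.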