ColossalAIPrecisionPlugin

class pytorch_lightning.plugins.precision.ColossalAIPrecisionPlugin(precision=16)[source]

Bases: pytorch_lightning.plugins.precision.precision_plugin.PrecisionPlugin

Precision plugin for ColossalAI integration.

Parameters:

precision (Literal['16', 16]) – Half precision (16-bit).

Raises:

ValueError – If precision is not 16.
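The constructor's validation behavior can be sketched as follows. This is a minimal stand-in written for illustration, not the actual ColossalAI plugin source; the class name `PrecisionPluginSketch` is hypothetical.

```python
class PrecisionPluginSketch:
    """Illustrative sketch: accept only half precision, as the docs above describe."""

    def __init__(self, precision=16):
        # Both the string '16' and the integer 16 are accepted; anything
        # else raises ValueError, mirroring the documented behavior.
        if precision not in ("16", 16):
            raise ValueError(
                f"ColossalAI only supports precision 16, got {precision!r}"
            )
        self.precision = "16"
```

In practice the real plugin is usually constructed for you when the ColossalAI strategy is selected on the Trainer, rather than instantiated by hand.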

backward(tensor, model, optimizer, optimizer_idx, *args, **kwargs)[source]

Performs the actual backpropagation.

Parameters:
  • tensor (Tensor) – the loss value obtained from the closure

  • model (LightningModule) – the model to be optimized

  • optimizer (Optional[Steppable]) – current optimizer being used. None if using manual optimization

  • optimizer_idx (Optional[int]) – the index of the current optimizer. None if using manual optimization

  • *args (Any) – Positional arguments intended for the actual function that performs the backward, like backward().

  • **kwargs (Any) – Keyword arguments for the same purpose as *args.

Return type:

None

clip_grad_by_norm(optimizer, clip_val)[source]

Clip gradients by norm.

Return type:

None

clip_grad_by_value(optimizer, clip_val)[source]

Clip gradients by value.

Return type:

None
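The two clipping strategies named above differ in what they bound. Clipping by norm rescales all gradients together so their global L2 norm does not exceed the threshold; clipping by value clamps each gradient element independently. A minimal sketch over plain Python floats (standalone helper functions, not the plugin's actual methods):

```python
import math


def clip_grads_by_norm(grads, clip_val):
    """Rescale grads so their global L2 norm is at most clip_val."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > clip_val:
        scale = clip_val / total_norm
        return [g * scale for g in grads]
    return list(grads)


def clip_grads_by_value(grads, clip_val):
    """Clamp each gradient element into [-clip_val, clip_val]."""
    return [max(-clip_val, min(clip_val, g)) for g in grads]
```

Norm clipping preserves the gradient's direction while shrinking its magnitude, whereas value clipping can change the direction when individual elements saturate.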

optimizer_step(optimizer, model, optimizer_idx, closure, **kwargs)[source]

Hook to run the optimizer step.

Return type:

Any