TorchSyncBatchNorm
class lightning.pytorch.plugins.TorchSyncBatchNorm [source]
Bases: LayerSync
A plugin that wraps all batch normalization layers of a model with synchronization logic for multiprocessing.
This plugin has no effect in single-device operation.
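As a minimal sketch of typical usage (the accelerator, device count, and strategy below are illustrative assumptions, not requirements of this class), the plugin can be passed to the Trainer; the sync_batchnorm=True Trainer flag enables the same behavior:

    import lightning.pytorch as pl
    from lightning.pytorch.plugins import TorchSyncBatchNorm

    # Illustrative multi-GPU setup; the plugin only takes effect in
    # multi-device training.
    trainer = pl.Trainer(
        accelerator="gpu",
        devices=2,
        strategy="ddp",
        plugins=[TorchSyncBatchNorm()],
    )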
apply ( model ) [source]
Add global batchnorm for a model spread across multiple GPUs and nodes.
Override this method to synchronize batchnorm layers between specific process groups instead
of the whole world, as sketched below the parameter listing.
Parameters:
model (Module) – Reference to the current LightningModule
Return type:
Module
Returns:
LightningModule with batchnorm layers synchronized within the process groups.
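A sketch of the override mentioned above, assuming an initialized torch.distributed environment; the choice of ranks is a hypothetical example, and the subclass name is not part of this API:

    import torch
    from lightning.pytorch.plugins import TorchSyncBatchNorm

    class SubgroupSyncBatchNorm(TorchSyncBatchNorm):
        def apply(self, model: torch.nn.Module) -> torch.nn.Module:
            # Hypothetical subgroup: synchronize batchnorm statistics only
            # across ranks 0-3 instead of the whole world.
            group = torch.distributed.new_group(ranks=[0, 1, 2, 3])
            return torch.nn.SyncBatchNorm.convert_sync_batchnorm(model, group)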
revert ( model ) [source]
Convert the wrapped batchnorm layers back to regular batchnorm layers.
Parameters:
model (Module) – Reference to the current LightningModule
Return type:
Module
Returns:
LightningModule with regular batchnorm layers that will no longer sync across processes.
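A round trip through both methods might look like the following sketch; the toy model is an assumption, and apply/revert only swap layer types, so actual cross-process synchronization still requires a distributed run:

    import torch
    from lightning.pytorch.plugins import TorchSyncBatchNorm

    plugin = TorchSyncBatchNorm()
    model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.BatchNorm2d(8))

    synced = plugin.apply(model)      # BatchNorm2d layers become SyncBatchNorm
    restored = plugin.revert(synced)  # back to regular, non-syncing batchnorm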