TorchSyncBatchNorm
- class lightning.pytorch.plugins.TorchSyncBatchNorm
Bases: lightning.pytorch.plugins.layer_sync.LayerSync
A plugin that wraps all batch normalization layers of a model with synchronization logic for multiprocessing.
This plugin has no effect in single-device operation.
- apply(model)
Add global batchnorm for a model spread across multiple GPUs and nodes, returning the model with its batch normalization layers replaced by synchronized versions.
Override this method to synchronize batchnorm layers between specific process groups instead of the whole world.
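The synchronization wrapping relies on PyTorch's `torch.nn.SyncBatchNorm.convert_sync_batchnorm`, which recursively replaces every `BatchNorm*D` layer in a module tree with `SyncBatchNorm`. A minimal sketch of that conversion (the small example model here is made up for illustration):

```python
import torch
import torch.nn as nn

# A toy model containing a batchnorm layer (hypothetical example).
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.BatchNorm2d(8),
    nn.ReLU(),
)

# Recursively replace each BatchNorm*D layer with SyncBatchNorm.
# Only the module structure changes; actual cross-process syncing
# happens at runtime inside a distributed process group.
synced = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)

print(type(synced[1]).__name__)  # SyncBatchNorm
```

In typical Lightning usage you do not call this plugin directly; passing `sync_batchnorm=True` to the `Trainer` applies it for you when training on multiple devices.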