TorchSyncBatchNorm

class lightning.pytorch.plugins.TorchSyncBatchNorm[source]

Bases: lightning.pytorch.plugins.layer_sync.LayerSync

A plugin that wraps all batch normalization layers of a model with synchronization logic for multiprocessing.

This plugin has no effect in single-device operation.
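As an illustration, here is a hedged sketch of how the plugin might be enabled on a Trainer. The exact Trainer arguments (accelerator, devices, strategy) are placeholders for your setup; `Trainer(sync_batchnorm=True)` is the common shorthand that selects this plugin for you.

```python
# Sketch only: enabling synchronized batchnorm for multi-GPU training.
# The accelerator/devices/strategy values below are example settings.
from lightning.pytorch import Trainer
from lightning.pytorch.plugins import TorchSyncBatchNorm

trainer = Trainer(
    plugins=[TorchSyncBatchNorm()],  # or simply: Trainer(sync_batchnorm=True)
    accelerator="gpu",
    devices=2,
    strategy="ddp",
)
```

With a single device, the plugin is a no-op: batchnorm layers are wrapped, but there are no peer processes to synchronize with.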

apply(model)[source]

Replace the model's batchnorm layers with synchronized batchnorm so that statistics are aggregated across multiple GPUs and nodes.

Override this method to synchronize batchnorm layers between specific process groups instead of the whole world.

Parameters

model (Module) – Reference to the current LightningModule

Return type

Module

Returns

LightningModule with batchnorm layers synchronized within the process groups.
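Under the hood, the default behavior amounts to calling PyTorch's `torch.nn.SyncBatchNorm.convert_sync_batchnorm`, which walks the module tree and swaps each batchnorm layer for a `SyncBatchNorm` equivalent. A minimal sketch, assuming plain PyTorch:

```python
import torch.nn as nn

# A toy model containing a regular BatchNorm2d layer.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())

# Convert every batchnorm layer to its synchronized counterpart.
synced = nn.SyncBatchNorm.convert_sync_batchnorm(model)

# The BatchNorm2d layer has been replaced by SyncBatchNorm.
assert isinstance(synced[1], nn.SyncBatchNorm)
```

An optional process group can be passed to `convert_sync_batchnorm` to restrict synchronization to a subset of processes, which is what overriding `apply` enables.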

revert(model)[source]

Convert the wrapped batchnorm layers back to regular batchnorm layers.

Parameters

model (Module) – Reference to the current LightningModule

Return type

Module

Returns

LightningModule with regular batchnorm layers that will no longer sync across processes.
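PyTorch does not ship an inverse of `convert_sync_batchnorm`, so the reversion is done by hand. The following is a hedged sketch of the idea (the helper name `revert_sync_batchnorm` and the `_BatchNormXd` class are illustrative, not part of the public API): each `SyncBatchNorm` is replaced by a dimension-agnostic plain batchnorm layer that reuses the learned parameters and running statistics.

```python
import torch
import torch.nn as nn


class _BatchNormXd(nn.modules.batchnorm._BatchNorm):
    # A plain batchnorm that accepts inputs of any dimensionality,
    # since we no longer know whether the original layer was 1d/2d/3d.
    def _check_input_dim(self, input):
        return


def revert_sync_batchnorm(module: nn.Module) -> nn.Module:
    """Sketch: swap every SyncBatchNorm back for a regular,
    non-synchronizing batchnorm layer, keeping its state."""
    converted = module
    if isinstance(module, nn.SyncBatchNorm):
        converted = _BatchNormXd(
            module.num_features,
            module.eps,
            module.momentum,
            module.affine,
            module.track_running_stats,
        )
        if module.affine:
            with torch.no_grad():
                converted.weight = module.weight
                converted.bias = module.bias
        converted.running_mean = module.running_mean
        converted.running_var = module.running_var
        converted.num_batches_tracked = module.num_batches_tracked
    # Recurse into children so nested SyncBatchNorm layers are also reverted.
    for name, child in module.named_children():
        converted.add_module(name, revert_sync_batchnorm(child))
    return converted
```

After reversion, batchnorm statistics are again computed per-process, which is typically what you want for single-device inference or export.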