distributed
===========

.. currentmodule:: pytorch_lightning.utilities.distributed

.. rubric:: Functions

.. autosummary::
   :nosignatures:

   all_gather_ddp_if_available
   distributed_available
   gather_all_tensors
   get_default_process_group_backend_for_device
   init_dist_connection
   rank_zero_debug
   rank_zero_info
   register_ddp_comm_hook
   sync_ddp
   sync_ddp_if_available
   tpu_distributed

.. rubric:: Classes

.. autosummary::
   :nosignatures:

   AllGatherGrad

.. automodule:: pytorch_lightning.utilities.distributed